Hugging Face hosted malicious software masquerading as OpenAI release

A malicious Hugging Face repository posing as an OpenAI launch delivered infostealer malware to Windows machines and recorded about 244,000 downloads before removal, according to research from AI security firm HiddenLayer. The download count may have been artificially inflated by the attackers to make the model appear more popular, so the true extent of the attack is unknown.

‘Open-OSS/privacy-filter’ imitated OpenAI’s Privacy Filter release. HiddenLayer said the original model card had been copied almost exactly, and the threat actors included a malicious loader.py file that fetched and ran credential-stealing malware on Windows hosts.

The repository reached the top of the ‘trending’ list on Hugging Face with 667 likes accrued in under 18 hours – again, this figure may have been manipulated by the attackers.

Public AI model registries may be becoming risks in the software supply chain as developers and data scientists clone models directly into corporate environments, environments with access to source code, cloud credentials, and internal systems. That situation alone makes a compromised model repository more than a nuisance.

The README file for the fake model closely resembled that of the legitimate project, but it departed from the original by instructing users to run start.bat on Windows or execute python loader.py on Linux and macOS, instructions central to the infection chain HiddenLayer described.

Researchers have previously warned that malicious code can be hidden inside AI model files or associated setup scripts on Hugging Face and other public registries. Earlier cases involved Pickle-serialised model files that bypassed platform scanners.

Malicious loader disguised as setup code

HiddenLayer said loader.py began with decoy code resembling a normal AI model loader before moving quickly into a concealed infection chain. The script disabled SSL verification, decoded a base64-encoded URL pointing to jsonkeeper.com, retrieved a remote payload instruction, and passed commands to PowerShell on Windows machines. HiddenLayer said using jsonkeeper.com as a command-and-control channel allowed the attacker to rotate the payload without altering the repo’s contents.
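Defenders can screen for the tell-tale combination of behaviours described above before executing any script from a cloned repository. The following is a minimal, illustrative sketch of such a static check — the specific regex patterns and their names are assumptions chosen for this example, not a vetted detection ruleset:

```python
import re

# Heuristic indicators drawn from the loader behaviour described in the
# article: disabled SSL verification, base64-decoded URLs, and PowerShell
# invocation via a subprocess. Patterns here are illustrative only.
SUSPICIOUS_PATTERNS = {
    "ssl_disabled": re.compile(r"ssl\._create_unverified_context|verify\s*=\s*False"),
    "base64_decode": re.compile(r"base64\.b64decode"),
    "powershell": re.compile(r"powershell", re.IGNORECASE),
    "subprocess_exec": re.compile(r"subprocess\.(run|Popen|call)|os\.system"),
}

def scan_loader(source: str) -> list[str]:
    """Return the names of suspicious patterns found in a script's source."""
    return [name for name, pat in SUSPICIOUS_PATTERNS.items() if pat.search(source)]

# A harmless stand-in resembling the reported loader structure.
sample = (
    "import base64, ssl, subprocess\n"
    "ssl._create_default_https_context = ssl._create_unverified_context\n"
    "url = base64.b64decode('aHR0cHM6Ly9leGFtcGxlLmNvbQ==').decode()\n"
    "subprocess.run(['powershell', '-c', 'echo hi'])\n"
)
print(scan_loader(sample))
# -> ['ssl_disabled', 'base64_decode', 'powershell', 'subprocess_exec']
```

Any one of these indicators can appear in legitimate code; it is the combination, in a file a README urges users to execute, that should prompt a manual review before running anything.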

The PowerShell command then downloaded an additional batch file from an attacker-controlled domain, and the malware established persistence by creating a scheduled task designed to resemble a legitimate Microsoft Edge update process.

The final payload was a Rust-based infostealer. According to HiddenLayer, it targeted Chromium- and Firefox-derived browsers, Discord local storage, cryptocurrency wallets, FileZilla configurations, and host system information. The malware also attempted to disable the Windows Antimalware Scan Interface and Event Tracing.

Wider campaigns

HiddenLayer also said it found six further Hugging Face repositories containing virtually identical loader logic that shared infrastructure with the attack described above.

The case follows other warnings about malicious AI models on Hugging Face, including poisoned AI SDKs and fake OpenClaw installers. The common thread is that attackers are treating AI development workflows as a route into otherwise secure environments. AI repositories often contain executable code, setup instructions, dependency files, notebooks, and scripts, and it is these peripheral elements that cause the problems, rather than the models themselves.

Sakshi Grover, senior research manager for cybersecurity services at IDC, said traditional software composition analysis (SCA) was designed to examine dependency manifests, libraries, and container images, and is less effective at identifying malicious loader logic in AI repositories. Grover also cited IDC’s November 2025 FutureScape report, which predicted that by 2027, 60% of agentic AI systems will have a bill of materials. This would help companies track which AI artefacts they use, their source, which versions were approved, and whether they contain executable components.
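To make the bill-of-materials idea concrete, an AI-BOM record might track each artefact's source, approval status, and executable components. The structure below is purely hypothetical — the field names are assumptions for illustration, not drawn from any published AI-BOM standard:

```python
# Hypothetical AI bill-of-materials entry for the repository from this
# incident. Field names are illustrative, not from a real AI-BOM schema.
bom_entry = {
    "artifact": "Open-OSS/privacy-filter",
    "source": "huggingface.co",
    "approved_revision": None,  # never vetted by the organisation
    "components": [
        {"path": "model.safetensors", "executable": False},
        {"path": "loader.py", "executable": True},
        {"path": "start.bat", "executable": True},
    ],
}

def flag_executables(entry: dict) -> list[str]:
    """Return the artefact files that contain executable content,
    and therefore need security review before use."""
    return [c["path"] for c in entry["components"] if c["executable"]]

print(flag_executables(bom_entry))
# -> ['loader.py', 'start.bat']
```

With such records, a policy check could block any artefact whose executable components were never reviewed, which is exactly the gap the loader.py attack exploited.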

Response and mitigation

HiddenLayer advised anyone who cloned Open-OSS/privacy-filter and ran start.bat, python loader.py, or any file from the repository on a Windows host to treat the system as compromised, and recommends re-imaging affected systems. Browser sessions should be considered compromised even when passwords are not stored locally, as session cookies can let attackers bypass MFA in some circumstances.

Hugging Face has confirmed the repo has been removed.

(Image source: Pixabay, under licence.)

 

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is part of TechEx and co-located with other leading technology events. Click here for more information.

AI News is powered by TechForge Media. Explore other upcoming enterprise technology events and webinars here.