Cybercriminals created fake AI video generation platforms advertised on Facebook to distribute Noodlophile Stealer malware, tricking users into downloading malicious files instead of AI-generated content.
The campaign relied on fake AI video generation platforms that mimicked legitimate services such as Luma Dream Machine and CapCut. The fraudulent websites were promoted through Facebook groups with thousands of followers, including one verified page with nearly 4,000 followers and posts drawing over 62,000 views. Users were instructed to upload images for AI video processing, but instead of receiving generated content they downloaded ZIP archives containing malware. The payload included a previously undocumented infostealer, Noodlophile Stealer, which harvests browser credentials, cryptocurrency wallets, and other sensitive data before exfiltrating it via Telegram bots. The stealer was often bundled with XWorm, a remote access trojan, for deeper system control. The attack chain involved multiple stages of obfuscation, including Base64-encoded archives, Python payloads, and techniques such as PE hollowing to evade detection. The campaign was linked to Vietnamese-speaking operators offering the malware as a service on dark web forums.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fictitious individual for illegitimate financial gain, or creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed