Google has taken down more than 3,000 videos linked to the so‑called YouTube Ghost Network, a sprawling malware distribution operation documented by Check Point. Researchers say the activity has been ongoing since 2021 and surged sharply in 2025, effectively tripling in volume. The core objective: to push the Rhadamanthys and Lumma infostealers by disguising them as cracked software and cheats for popular games.
How the YouTube Ghost Network operated: account takeovers and manufactured trust
According to Check Point, the operators combined account takeover of legitimate YouTube channels with large-scale creation of fraudulent profiles. One cluster posted tutorial-style videos embedding malicious links, a second cluster artificially inflated views, likes, and comments, and a third leveraged YouTube’s Community posts to seed URLs. By distributing roles, the network fabricated “trust signals” that made content appear both popular and safe, lowering users’ defenses.
Delivery vectors and lures: from Roblox cheats to pirated Photoshop
The campaign relied on high-demand lures: cracked versions of Photoshop, FL Studio, Microsoft Office, and Lightroom, plus cheats for Roblox. Many videos instructed viewers to temporarily disable antivirus and download archives from Dropbox, Google Drive, or MediaFire. Instead of the promised tools, victims received Rhadamanthys or Lumma—infostealers designed to exfiltrate browser credentials, cookies, and crypto wallet data. These payloads are frequently packed or multi-staged to evade detection.
Scale, examples, and adaptive infrastructure
Researchers observed a compromised channel with 129,000 subscribers posting a “pirated” Adobe Photoshop video that amassed nearly 300,000 views and over 1,000 likes. Another thread of activity targeted crypto users and redirected traffic to phishing pages hosted on Google Sites. Operators rotated domains, payloads, and links to stay ahead of moderation and sustain delivery, a technique that reduces exposure windows and complicates takedown efforts.
Modular architecture and parallels with GitHub campaigns
The Ghost Network exhibited a modular design: distinct components handled payload hosting and updates, engagement manipulation, and link dissemination. This compartmentalization made the operation resilient—easy to reconstitute after account bans or URL blocklists. Check Point notes similarities to the Stargazers Ghost Network on GitHub, where thousands of bogus developer accounts hosted malicious repositories to exploit trust in a widely used platform.
Attribution, motivation, and potential for targeted abuse
Check Point did not attribute the campaign to a specific threat actor. The most plausible motive is financial—monetizing stolen credentials and drained wallets on underground markets. However, researchers caution that the same tradecraft—credible-looking content on trusted platforms, rapid infrastructure rotation, and modular tooling—could be repurposed by state-aligned groups for targeted operations.
Why this social engineering works on trusted platforms
Users often view video platforms and cloud storage as inherently safe, and engagement metrics such as views, likes, and comments act as “social proof.” Those heuristics, while convenient, are easily gamed at scale. Platform moderation is further challenged by fast link rotation, account churn, and the use of legitimate services (e.g., Google Sites, cloud drives), which complicates automated detection.
Actionable security recommendations for users and organizations
Do not disable security controls: any request to turn off antivirus is a red flag. Avoid pirated software and game cheats, which remain a consistent delivery vector for infostealers. Verify the source: inspect channel names, publishing history, and comments, and obtain downloads and updates only through official vendor channels.
Use a password manager and enable multi-factor authentication (MFA). Regularly audit third-party access to your Google account and revoke suspicious OAuth tokens. In corporate environments, consider blocking executable and archive downloads from public cloud storage, enforcing DNS/URL filtering, deploying EDR/antimalware with behavioral detection, implementing application allowlisting, and training employees to spot social engineering on video and social platforms.
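For teams that want a low-effort starting point on the cloud-storage recommendation, the sketch below illustrates one way to triage web-proxy logs for archive or executable downloads from the file-sharing services named in the report. It is a minimal, hypothetical example, not a production control: the log path, the "host" and "url" column names, and the extension list are assumptions to adapt to your own logging pipeline, and mature environments would enforce the same policy at the proxy or secure web gateway rather than after the fact.

```python
# Minimal triage sketch (assumptions, not a production control): flag archive or
# executable downloads from public file-sharing domains in a web-proxy log.
# Assumed: a CSV log with "host" and "url" columns; the domain list reflects the
# services cited in Check Point's report (Dropbox, Google Drive, MediaFire).

import csv
from urllib.parse import urlparse

WATCHED_DOMAINS = {"dropbox.com", "drive.google.com", "mediafire.com"}
RISKY_EXTENSIONS = (".zip", ".rar", ".7z", ".exe", ".msi", ".scr")


def is_watched(host: str) -> bool:
    """Return True if the host matches a watched file-sharing domain or one of its subdomains."""
    host = host.lower().rstrip(".")
    return any(host == d or host.endswith("." + d) for d in WATCHED_DOMAINS)


def flag_downloads(log_path: str):
    """Yield (host, url) pairs where a risky file type was fetched from a watched domain."""
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            host = row.get("host", "")
            url = row.get("url", "")
            path = urlparse(url).path.lower()
            if is_watched(host) and path.endswith(RISKY_EXTENSIONS):
                yield host, url


if __name__ == "__main__":
    # "proxy_log.csv" is a placeholder path for illustration only.
    for host, url in flag_downloads("proxy_log.csv"):
        print(f"review: {host} {url}")
```

Flagging is detection rather than prevention; pair any such review with the blocking, filtering, and allowlisting controls described above.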
The YouTube Ghost Network underscores that even trusted ecosystems can be weaponized to spread malware. A viral tutorial can be as hazardous as a traditional phishing email. Practicing digital hygiene, steering clear of pirated content, and validating sources are decisive steps to reduce the risk of credential theft and financial loss. Security teams should update controls accordingly and report suspicious channels and links to accelerate takedowns.