Generative artificial intelligence has transitioned from a creative novelty into a structured, monetized engine for digital identity theft. A newly filed lawsuit in Arizona alleges that a group of men profits by teaching people how to make AI porn, turning the creation of non-consensual imagery into a scalable business model. By leveraging the faces and features of unsuspecting women, these individuals reportedly show others how to manufacture "AI influencers" designed to exploit real-world likenesses for profit.
The Architecture of Digital Predation
The plaintiffs in the lawsuit—including one woman identified as MG—claim that defendants Jackson Webb, Lucas Webb, and Beau Schultz scoured social media platforms to harvest images of women with relatively modest followings. Using software such as CreatorCore, the group allegedly trained AI models on these stolen photos to produce sexually explicit content.
This process involves more than simple face-swapping; it is a systematic attempt to create digital clones that appear indistinguishable from real people. The lawsuit alleges that the defendants even provided instructions on how to select targets, specifically suggesting women with smaller followings to avoid the legal scrutiny that comes with targeting high-profile celebrities.
The psychological impact on those targeted is profound. For victims like MG, discovering that her face and tattoos had been superimposed onto explicit bodies served as a "reality check" regarding the loss of control over her own image. It is a disturbing illustration of how the defendants allegedly profit from teaching people to make AI porn at the expense of personal autonomy.
How These Men Allegedly Profit Off Teaching People How to Make AI Porn
This operation extends far beyond the creation of isolated images, functioning instead as a comprehensive educational enterprise. Through platforms like Whop and Telegram, the defendants allegedly sell "blueprints" and tutorials for a monthly subscription fee, marketing the practice as a lucrative "side hustle."
The mechanics of this business model include:
- Using specialized apps to strip clothing from existing photos of real women.
- Scouring Instagram and TikTok to scrape high-quality facial data.
- Utilizing platforms like Fanvue to monetize the resulting AI-generated "models."
- Teaching a subscriber base that reportedly grew to more than 8,000 members.
The scale of this exploitation is staggering. According to the complaint, the AI ModelForge ecosystem has generated more than 500,000 AI-altered images and videos. The defendants' marketing allegedly leans on blatant displays of wealth, such as luxury cars and expensive watches, to boast about the revenue a single "built" influencer can generate.
A Regulatory Vacuum and the Enforcement Gap
Despite the growing visibility of these practices, legislative protections remain dangerously behind the curve. While the federal Take It Down Act was signed into law to criminalize the publication of non-consensual sexualized AI content, full enforcement is not slated to begin until May 2026. In the interim, victims find themselves caught in a "whack-a-mole" battle with major social media platforms.
Current enforcement struggles are compounded by the technical nature of the content. Because the AI-generated images are often distinct enough to avoid strict impersonation bans on Instagram, many of these accounts remain active even after being flagged for misconduct. While TikTok has reportedly taken down some associated accounts, the sheer volume of generated content makes manual moderation nearly impossible.
The rise of entities like AI ModelForge marks a grim milestone in the erosion of digital consent. As generative tools become more sophisticated, the ability to commodify a person's likeness without their permission becomes both easier and more profitable. Without proactive detection tools and immediate legal recourse, the internet is rapidly becoming a landscape where no one’s public image is truly their own.