Paedophiles are using AI 'nudifying' tools to create nude images of children from underwear photos to blackmail them into sending more explicit content, according to a manual found on the dark web by the Internet Watch Foundation.
The Internet Watch Foundation (IWF) discovered a nearly 200-page manual on the dark web that instructs paedophiles to use artificial intelligence 'nudifying' tools to remove clothing from underwear shots sent by children. The manipulated images are then used to blackmail children into sending more graphic content. The anonymous author of the manual claims to have 'successfully blackmailed' 13-year-old girls into sending nude imagery online.

This represents the first evidence the IWF has seen of perpetrators advising each other to use AI technology for child sexual abuse. The document was passed to the UK's National Crime Agency.

The IWF reported that 2023 was 'the most extreme year on record', with over 275,000 webpages containing child sexual abuse material found, including a record 62,000 pages of category A content (the most severe imagery). The organization also found 2,401 images of self-generated child sexual abuse material involving children aged three to six years old, in which victims were manipulated into recording abuse of themselves in domestic settings.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fictitious individual for illegitimate financial benefit, or creating humiliating or sexual imagery.
AI system: due to a decision or action made by an AI system
Intentional: due to an expected outcome from pursuing a goal
Post-deployment: occurring after the AI model has been trained and deployed