AI-powered apps used by loan sharks in India created non-consensual deepfake nude images of borrowers' family members to blackmail them into repaying debts, causing severe emotional distress and privacy violations.
Loan shark companies in India operated instant personal loan apps that gained access to borrowers' phone galleries upon download. When borrowers like Rishabh defaulted on payments for loans as small as Rs 30,000, the companies used AI-powered 'nudify' tools to create realistic fake nude images of the borrowers' family members, particularly women. In Rishabh's case, his wife Shefali received a deepfake nude image of herself that looked impossibly real, complete with her actual jewelry. The fake image was then circulated to multiple friends and family members as a blackmail tactic.

These AI tools use generative models trained on extensive data of female bodies and employ inpainting techniques to replace clothed areas with nude imagery matching skin tone and lighting. The tools are widely available on platforms like Telegram, where hundreds of channels offer 'undressing' services for fees ranging from Rs 199 to Rs 19,999.

The report also mentions similar incidents involving a Bengaluru teenager who created AI-generated nudes of a ninth-grade classmate, and celebrities such as Rashmika Mandanna and Taylor Swift who have become victims of similar deepfake abuse.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fabricated individual for illegitimate financial benefit, or creating humiliating or sexual imagery.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model was trained and deployed