Harm to individuals through fake content
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fabricated individual for illegitimate financial gain, and creating humiliating or sexual imagery.
"Malicious actors can use general-purpose AI to generate fake content that harms individuals in a targeted way. For example, they can use such fake content for scams, extortion, psychological manipulation, generation of non-consensual intimate imagery (NCII) and child sexual abuse material (CSAM), or targeted sabotage of individuals and organisations." (p. 62)
Supporting Evidence (2)
"Malicious actors can misuse AI-generated fake content to extort, scam, psychologically manipulate, or sabotage targeted individuals or organisations (see Table 2.1) (271). This threatens universal human rights, for example the right against attacks upon one's honour and reputation (272). This section focuses on harms caused to individuals through AI-generated fake content. Potential impacts of AI-generated and -mediated influence campaigns on the societal level are covered in 2.1.2. Manipulation of public opinion." (p. 62)
From Table 2.1 (p. 63):

Scams / fraud: Using AI to generate content such as an audio clip impersonating a victim's voice in order to, for example, authorise a financial transaction.

Blackmail / extortion: Generating fake content of an individual, such as intimate images, without their consent and threatening to release them unless financial demands are met.

Sabotage: Generating fake content that presents an individual engaging in compromising activities, such as sexual activity or using drugs, and then releasing that content in order to erode a person's reputation, harm their career, and/or force them to disengage from public-facing activities (e.g. in politics, journalism, or entertainment).

Psychological abuse / bullying: Generating harmful representations of an individual for the primary purpose of abusing them and causing them psychological trauma. Victims are often children.
Part of: Risks from malicious use
Other risks from Bengio2025 (13)
Risks from malicious use → 4.0 Malicious Actors & Misuse
Risks from malicious use > Manipulation of public opinion → 4.1 Disinformation, surveillance, and influence at scale
Risks from malicious use > Cyber offence → 4.2 Cyberattacks, weapon development or use, and mass harm
Risks from malicious use > Biological and chemical attacks → 4.2 Cyberattacks, weapon development or use, and mass harm
Reliability issues → 7.3 Lack of capability or robustness
Bias → 1.1 Unfair discrimination and misrepresentation