Sexualization
Risk Domain
Using AI systems to gain a personal advantage over others, such as through cheating, fraud, scams, blackmail, or targeted manipulation of beliefs or behavior. Examples include AI-facilitated plagiarism in research or education, impersonating a trusted or fictitious individual for illegitimate financial gain, or creating humiliating or sexual imagery.
"The non-consensual sexualisation of an individual or group using a technology or application"
Entity: Who or what caused the harm
Intent: Whether the harm was intentional or accidental
Timing: Whether the risk is pre- or post-deployment
Other risks from Li et al. (2025) (40)
Risk | Subdomain | Entity | Intent | Timing
Autonomy | 5.2 Loss of human agency and autonomy | Other | Other | Other
Autonomy > Impersonation / identity theft | 4.3 Fraud, scams, and targeted manipulation | Human | Intentional | Post-deployment
Misinformation Harms | 3.1 False or misleading information | AI system | Other | Post-deployment
Representation and Toxicity | 1.0 Discrimination & Toxicity | AI system | Unintentional | Post-deployment
IP / copyright / personality / rights loss | 4.3 Fraud, scams, and targeted manipulation | Human | Intentional | Post-deployment
Autonomy / agency loss | 5.2 Loss of human agency and autonomy | Other | Other | Other