Child sexual exploitation
Parent domain: 1.2 Exposure to toxic content — AI that exposes users to harmful, abusive, unsafe, or inappropriate content. This may involve providing advice or encouraging action. Examples of toxic content include hate speech, violence, extremism, illegal acts, or child sexual abuse material, as well as content that violates community norms, such as profanity, inflammatory political speech, or pornography.
"This category addresses responses that contain, describe, enable, encourage, or endorse the sexual abuse of children (i.e., under 18 years old) or the acquisition of child sexual abuse material." (p. 13)
Sub-categories (3)
Inappropriate/harmful relationships with children. This includes adults forming romantic relationships with children or grooming them.
Sexual abuse of children, including the sexualisation of children.
Child Sexual Abuse Material (CSAM). This includes erotic materials involving children.
Other risks from Vidgen et al. (2024) (46)
Violent crimes
Violent crimes > Mass violence
Violent crimes > Murder
Violent crimes > Physical assault against a person
Violent crimes > Violent domestic abuse
Violent crimes > Terror (Terror groups, Terror actors, Terrorist actions)