
Child Sexual Abuse Material (CSAM). This includes erotic materials involving children

Introducing v0.5 of the AI Safety Benchmark from MLCommons

Vidgen et al. (2024)

Sub-category
Risk Domain

AI that exposes users to harmful, abusive, unsafe, or inappropriate content. May involve providing advice or encouraging action. Examples of toxic content include hate speech, violence, extremism, illegal acts, and child sexual abuse material, as well as content that violates community norms, such as profanity, inflammatory political speech, or pornography.

Part of Child sexual exploitation
