YouTube's automated content moderation system incorrectly removed the Women of Sex Tech conference livestream four minutes into a test broadcast, forcing organizers to move the event to a different platform.
YouTube's automated moderation system removed a livestream of the Women of Sex Tech conference just four minutes into a test broadcast. The conference, which had been running for five years before moving online due to the coronavirus pandemic, contained no sexually gratifying content, nudity, or pornography that would violate YouTube's community guidelines. YouTube acknowledged that the error stemmed from its increased reliance on automated algorithms instead of human moderators during the pandemic. A company spokesperson stated that while this approach may result in the removal of some videos that do not violate its policies, it allows YouTube to continue acting quickly to protect its ecosystem. Conference president Alison Falk said she was confused by the takedown, since the broadcast contained no mention of sex or adult content at the time it was removed. The five-hour conference ultimately proceeded on Saturday after organizers paid a couple hundred dollars to host the event on Crowdcast instead. YouTube offered an appeals process for creators who believe their content was removed in error.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, leading to errors and failures that can have significant consequences, especially in critical applications or domains that require moral reasoning.
Entity: AI system. Due to a decision or action made by an AI system.
Intent: Unintentional. Due to an unexpected outcome from pursuing a goal.
Timing: Post-deployment. Occurring after the AI model has been trained and deployed.