Facebook's AI content moderation system incorrectly flagged an advertisement for Walla Walla onion seeds as 'overtly sexual' content, temporarily removing the ad before human review restored it.
In October 2020, The Seed Company by E.W. Gaze, a garden center in Newfoundland, Canada, had its Facebook advertisement for Walla Walla onion seeds removed by Facebook's automated content moderation system. The ad contained a photo of onions in a wicker basket, which Facebook's AI flagged as violating the platform's rules on 'products with overtly sexual positioning,' citing its policy that 'listings may not position products or services in a sexually suggestive manner.' Jackson McLean, a manager at the company, speculated that the two round onions could have been misconstrued as resembling human body parts. The company requested a review of the removal but initially received no reply. Meg Sinclair, Facebook Canada's head of communications, later acknowledged that Facebook's automated technology had made the error, explaining: 'We use automated technology to keep nudity off our apps. But sometimes it doesn't know a walla walla onion from a, well, you know.' The ad was subsequently restored, and Facebook apologized for the disruption to the business. The incident was not an isolated one: Facebook's automated systems have made similar errors before, including removing excerpts from the Declaration of Independence in 2018 after flagging them as hate speech.
The domain classification, causal taxonomy, severity scores, and national security assessments for this entry were generated by an LLM and may contain errors.
Risk subdomain: AI systems that fail to perform reliably or effectively under varying conditions, exposing them to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
Entity: AI system (due to a decision or action made by an AI system)
Intent: Unintentional (due to an unexpected outcome from pursuing a goal)
Timing: Post-deployment (occurring after the AI model has been trained and deployed)