Facebook's Year in Review feature automatically selected photos of deceased loved ones, including one of a user's daughter who had died in 2014, causing emotional distress to users who had experienced loss.
Facebook's Year in Review feature automatically generated photo compilations for users at the end of 2014, selecting images that had received the most engagement through likes and comments. The algorithm chose a photo of Eric Meyer's daughter Rebecca, who had died earlier that year, as the featured image for his Year in Review, accompanied by the message 'Here's what your year looked like!' Meyer described this as 'inadvertent algorithmic cruelty' and noted the lack of opt-out options. Multiple other users reported similar experiences, with photos of deceased pets, burning apartments, and other traumatic events being selected. Facebook product manager Jonathan Gheller personally apologized to Meyer and acknowledged the company could 'do better.' The feature was subsequently modified to use more neutral language, changing from 'It's been a great year! Thanks for being a part of it' to 'See you next year!' The incident highlighted the need for empathetic design that considers worst-case scenarios and gives users control over automated content generation.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, leaving them prone to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed