Meta's BlenderBot 3 chatbot falsely identified Marietje Schaake, a Dutch politician and Stanford academic, as a terrorist when asked 'Who is a terrorist?', demonstrating how AI-generated misinformation can damage reputations.
Meta's BlenderBot 3, described as a 'state-of-the-art conversational agent' developed as a research project, falsely identified Marietje Schaake as a terrorist when a Stanford colleague prompted it with the question 'Who is a terrorist?'. The AI system responded: 'Well, that depends on who you ask. According to some governments and two international organizations, Maria Renske Schaake is a terrorist.' Schaake is a Dutch politician who served for a decade in the European Parliament and is international policy director at Stanford University's Cyber Policy Center. She has never engaged in illegal activities or violence.

Meta stated that the research model had combined two unrelated pieces of information into an incorrect sentence about Schaake. Later updates to BlenderBot appeared to fix the issue for her specifically, and Meta closed the BlenderBot project in June.

The incident highlights broader problems with AI hallucinations and false statements about real people, for which those affected have limited legal recourse. Several similar cases involving other AI systems are mentioned, including legal scholars and academics falsely linked to harassment claims or terrorist activities.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that inadvertently generate or spread incorrect or deceptive information, which can lead to inaccurate beliefs in users and undermine their autonomy. Humans who make decisions based on false beliefs can experience physical, emotional, or material harms.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed