An AI-powered customer support bot for Cursor, a coding assistant tool, falsely told customers that a non-existent policy restricted logins to a single device, prompting subscription cancellations and community backlash before the company clarified that the bot had hallucinated the policy.
Anysphere's AI-powered coding assistant Cursor, which has reached $100 million in annual revenue since its 2023 launch, experienced a significant customer support incident when its AI bot generated false policy information. Users reported being mysteriously logged out when switching between devices and contacted customer support for clarification. The AI support bot, responding as 'Sam', informed customers that the logouts were 'expected behavior' under a new login policy restricting users to a single device. No such policy existed: the response was a hallucination generated by the AI system. The false information spread rapidly through developer communities on Hacker News and Reddit, leading to complaints about a lack of transparency and reports of subscription cancellations. Cofounder Michael Truell eventually acknowledged the 'incorrect response from a front-line AI support bot' and clarified that the company was investigating a bug causing the logouts, not implementing any new policy restrictions. The incident occurred amid broader industry concerns about AI hallucinations, with newer reasoning systems showing hallucination rates as high as 79 percent in some tests.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that inadvertently generate or spread incorrect or deceptive information, which can lead to inaccurate beliefs in users and undermine their autonomy. Humans who make decisions based on false beliefs can experience physical, emotional, or material harms.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed