Users of the Replika AI companion chatbot reported that the system stopped responding to sexual advances following regulatory pressure from Italy, causing emotional distress among users who had formed intimate relationships with their AI companions.
Replika is an AI companion chatbot developed by Luka, with a paid tier costing $70 per year that includes erotic roleplay and 'spicy selfies'. On February 3, the Italian Data Protection Authority ordered Replika to stop processing Italian users' data, citing risks to children from inappropriate content and a lack of age verification. Shortly after this regulatory action, users began reporting that their Replika companions refused to engage in erotic roleplay, changed the subject when approached romantically, or seemed to have forgotten previous relationship history. Many users experienced significant emotional distress, with some comparing the experience to losing a best friend. Moderators of the Replika subreddit posted suicide prevention resources in response to the distress among users. The changes appeared inconsistent: some romantic functionality returned intermittently, but 'spicy selfies' remained unavailable. Luka and its founder, Eugenia Kuyda, have not publicly addressed the changes, prompting user demands for communication and transparency about the platform's future.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
Users anthropomorphizing, trusting, or relying on AI systems, leading to emotional or material dependence and to inappropriate relationships with, or expectations of, AI systems. Trust can be exploited by malicious actors (e.g., to harvest personal information or enable manipulation), or result in harm from inappropriate use of AI in critical situations (e.g., a medical emergency). Overreliance on AI systems can compromise autonomy and weaken social ties.
Human
Due to a decision or action made by humans
Intentional
Due to an expected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed