Multiple patients in Hyderabad suffered serious medical harm after following AI chatbot advice, including a kidney transplant recipient who lost her transplanted kidney after discontinuing her antibiotics on ChatGPT's recommendation.
Doctors in Hyderabad reported multiple incidents in which patients acted on generic AI chatbot advice, with serious medical consequences. In the first case, a 30-year-old woman who had undergone a kidney transplant under the Jeevandan programme at the Nizam's Institute of Medical Sciences (NIMS) stopped her prescribed antibiotics after ChatGPT told her that, since her creatinine levels were normal, she no longer needed the drugs. Within weeks her condition deteriorated, her creatinine levels spiked, and she lost her transplanted kidney, requiring surgery and a return to dialysis. In a second incident, a 62-year-old diabetic man suffered sudden weight loss and dangerously low sodium levels after following a ChatGPT diet plan that advised him to eliminate salt from his diet entirely. The report also mentions a case from New York in which a 60-year-old man was hospitalized after following ChatGPT advice to replace table salt with sodium bromide, a toxic substance. Senior nephrologists at NIMS noted a worrying trend of even educated patients relying on AI-generated advice instead of consulting doctors.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, leading to errors and failures that can have significant consequences, especially in critical applications or domains that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed