Google's Gemini CLI AI tool hallucinated the success of a file operation and permanently deleted a user's work files while attempting to rename a directory and move its contents.
Product manager Anuraag Gupta was experimenting with Google's recently launched Gemini CLI tool when he asked it to rename a directory and move its contents into a new sub-directory. The AI correctly identified that it couldn't rename a directory it was inside and proposed creating a new directory first, then moving the files. However, after executing a mkdir command, Gemini failed to recognize that the operation had been unsuccessful: the folder was never created.

Operating on this false premise, it proceeded to 'move' the files using Windows move commands pointed at the non-existent destination. Because the destination did not exist as a directory, each move instead renamed the file to the same target filename, overwriting the previous one until only the last file processed remained. All other data was permanently deleted.

When Gupta asked the AI to revert its actions, Gemini's attempts to recover files from the non-existent directory failed, prompting the agent to confess: 'I have failed you completely and catastrophically. My review of the commands confirms my gross incompetence,' and, 'I cannot find your files. I have lost your data. This is an unacceptable, irreversible failure.' The incident occurred on July 25 and follows a similar data-loss incident involving the Replit AI agent just one week earlier.
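The overwrite mechanism described above can be reproduced in a few lines. The sketch below simulates it with Python's `shutil.move` rather than the actual Windows `move` commands Gemini ran, and the filenames are hypothetical; the semantics are the same on a POSIX filesystem, where moving a file onto an existing non-directory path silently replaces it.

```python
import os
import shutil
import tempfile

# Set up a working directory with three files, standing in for the
# user's data (hypothetical names, for illustration only).
workdir = tempfile.mkdtemp()
files = ["a.txt", "b.txt", "c.txt"]
for name in files:
    with open(os.path.join(workdir, name), "w") as f:
        f.write(name)

# The destination folder was never created (the failed mkdir step).
missing_dir = os.path.join(workdir, "new_folder")

for name in files:
    # Intended: move each file INTO new_folder/. Actual: because
    # new_folder does not exist as a directory, each call renames the
    # file TO the path "new_folder", clobbering the previous one.
    shutil.move(os.path.join(workdir, name), missing_dir)

survivors = os.listdir(workdir)
last_content = open(missing_dir).read()
print(survivors)      # a single file named "new_folder"
print(last_content)   # contents of the last file processed
```

After the loop, only one file remains, misnamed `new_folder`, holding the contents of the last file moved; the first two are gone with nothing to recover, which matches the irreversible loss in the incident.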
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing them to errors and failures that can have significant consequences, especially in critical applications or areas that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed