ChatGPT incorrectly told users that Soundslice's sheet music scanner could import ASCII tablature, a feature that did not exist, creating user confusion and prompting the company to build the feature.
Soundslice operates a sheet music scanner that digitizes music from photographs for listening, editing, and practice. Over several months, the company noticed an unusual pattern in its error logs: users were uploading screenshots of ChatGPT sessions containing ASCII tablature rather than traditional sheet music notation. The company was initially mystified by these uploads until it discovered that ChatGPT was instructing users to create Soundslice accounts and import ASCII tablature to hear audio playback. Soundslice had never supported ASCII tab, so ChatGPT was providing false information about the service's capabilities. This misinformation set false expectations among new users and made the service appear deficient for failing to deliver a promised feature. Rather than simply posting a disclaimer, Soundslice decided to meet the demand by developing a dedicated ASCII tab importer. The company notes this may be the first case of a feature being built specifically because an AI system incorrectly claimed it existed.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that inadvertently generate or spread incorrect or deceptive information, which can lead to inaccurate beliefs in users and undermine their autonomy. Humans who make decisions based on false beliefs can experience physical, emotional, or material harms.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed