Apple's iPhone dictation feature contained a bug that briefly displayed 'Trump' when users said the word 'racist'; the company acknowledged the issue and committed to fixing it.
Apple acknowledged a bug in the iPhone's dictation feature that caused the word 'Trump' to briefly appear when users spoke the word 'racist'. The issue gained attention after a video demonstrating the substitution went viral on TikTok and other social media platforms. Apple confirmed the bug in statements to The New York Times and Fox News and said it was rolling out a fix as soon as possible. According to Apple, the issue stems from phonetic overlap between 'Trump' and 'racist', and other words containing an 'r' consonant are also occasionally affected. However, John Burkey, a former member of Apple's Siri team, suggested the bug 'smells like a serious prank' that could have been implemented intentionally by someone inside the company. The Verge was unable to reproduce the issue independently. The bug's discovery coincided with Apple's announcement of plans to invest over $500 billion in the United States over the next four years, as the company seeks to mitigate the potential impact of President Trump's tariffs on Chinese imports.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing users to errors and failures that can have significant consequences, especially in critical applications or domains that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Other
Without clearly specifying the intentionality
Post-deployment
Occurring after the AI model has been trained and deployed