The Gannett newspaper chain paused its use of the LedeAI tool for writing high school sports articles after the AI system published reports containing coding errors, repetitive language, and placeholder text that went viral on social media.
Gannett, a major newspaper chain, deployed an AI service called LedeAI to automatically generate high school sports dispatches across multiple local outlets, including the Columbus Dispatch, Louisville Courier Journal, AZ Central, Florida Today, and Milwaukee Journal Sentinel. In August 2023, several AI-generated sports reports published by the Columbus Dispatch contained significant errors, including unprocessed placeholder text such as '[[WINNING_TEAM_MASCOT]]' and '[[LOSING_TEAM_MASCOT]]' that appeared in the published articles.

The reports were criticized on social media for being repetitive, omitting key details, using odd language, and reusing identical phrases such as 'high school football action' and 'cruise-control wins' across different stories. Many articles also repeated game dates multiple times within short paragraphs.

After the errors went viral, Gannett paused the LedeAI experiment across all local markets that had been using the service. LedeAI CEO Jay Allred acknowledged that the problems included 'errors, unwanted repetition and/or awkward phrasing', and the company launched efforts to correct them. Several published stories were subsequently updated with notes stating, 'This AI-generated story has been updated to correct errors in coding, programming or style.' The incident followed Gannett's December 2022 layoff of 6% of its news division.
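The unfilled '[[WINNING_TEAM_MASCOT]]' tokens point to a template-substitution step that silently passes through any keys missing from the game data. A minimal sketch of how that failure mode can arise (the template text and `render` function here are hypothetical illustrations, not LedeAI's actual pipeline):

```python
import re

# Hypothetical story template using [[KEY]] placeholders, in the style of
# the tokens that appeared in the published articles.
TEMPLATE = ("The [[WINNING_TEAM_MASCOT]] defeated the "
            "[[LOSING_TEAM_MASCOT]] in high school football action.")

def render(template: str, data: dict) -> str:
    # Replace each [[KEY]] token with its value from the game data.
    # Unknown keys fall back to the raw token, so missing data leaks
    # into the output silently instead of raising an error.
    return re.sub(r"\[\[(\w+)\]\]",
                  lambda m: data.get(m.group(1), m.group(0)),
                  template)

# Complete data renders cleanly; a missing key leaks the raw token.
print(render(TEMPLATE, {"WINNING_TEAM_MASCOT": "Tigers",
                        "LOSING_TEAM_MASCOT": "Eagles"}))
print(render(TEMPLATE, {"WINNING_TEAM_MASCOT": "Tigers"}))
```

A renderer that raised on unknown keys, or a pre-publication check that scanned output for leftover `[[...]]` tokens, would have caught these articles before they went live.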
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that fail to perform reliably or effectively under varying conditions, exposing them to errors and failures with significant consequences, especially in critical applications or domains that require moral reasoning.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed
No population impact data reported.