Google's ad platform served advertisements from major brands on AI-generated content farm websites that spread health misinformation, despite having policies against serving ads on low-quality automatically generated content.
Google's programmatic advertising product, Google Ads, which generated $168 billion in revenue last year, served advertisements on AI-generated content farm websites despite having policies prohibiting ads on 'spammy automatically generated content.' NewsGuard identified these problematic sites and found that around a quarter featured programmatic ads from major brands; of 393 big-brand ads found on AI-generated sites, 356 were served by Google. One specific example was MedicalOutline.com, an AI-written site that spread harmful health misinformation with headlines like 'Can lemon cure skin allergy?' and 'What are 5 natural remedies for ADHD?' The site featured advertisements from nine major brands, including Citigroup, Subaru, and GNC, all served via Google's platform. After MIT Technology Review flagged these issues to Google, the company removed ads from many of the sites due to 'pervasive policy violations,' though ads remained visible on MedicalOutline.com as of June 25. Google's policy communications manager stated that the company focuses on content quality rather than how content was created, but acknowledged that bad actors may leverage generative AI to circumvent its enforcement systems.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that inadvertently generate or spread incorrect or deceptive information, which can lead to inaccurate beliefs in users and undermine their autonomy. Humans who make decisions based on false beliefs can experience physical, emotional, or material harms.
AI system
Due to a decision or action made by an AI system
Unintentional
Due to an unexpected outcome from pursuing a goal
Post-deployment
Occurring after the AI model has been trained and deployed
No population impact data reported.