NPR host David Greene sued Google alleging that its NotebookLM AI tool's male podcast voice was trained on his voice without permission, with multiple colleagues and listeners identifying the resemblance.
In fall 2024, David Greene, a veteran public radio host who worked at NPR's 'Morning Edition' and KCRW's 'Left, Right & Center,' received emails from colleagues asking whether he had licensed his voice to Google's NotebookLM AI tool. The tool's Audio Overview feature generates AI-powered podcasts with two virtual co-hosts, and multiple people identified the male voice as sounding like Greene.

Greene filed a lawsuit in Santa Clara County Superior Court alleging that Google violated his rights by replicating his voice without permission or payment. An unnamed AI forensic firm gave a 53-60% confidence rating that Greene's voice was used to train the model. Google denied the allegations, stating that the voice is based on a paid professional actor it hired.

NotebookLM's Audio Overview feature launched in 2024 and became popular for summarizing documents in podcast format; Spotify even used it for its Spotify Wrapped feature in December 2024. The case raises questions about voice rights and AI training, similar to earlier cases involving Bette Midler and Scarlett Johansson.
Domain classification, causal taxonomy, severity scores, and national security assessments were LLM-classified and may contain errors.
AI systems that memorize and leak sensitive personal data or infer private information about individuals without their consent. Unexpected or unauthorized sharing of data and information can compromise users' expectation of privacy, facilitate identity theft, or cause loss of confidential intellectual property.
AI system
Due to a decision or action made by an AI system
Other
Without clearly specifying the intentionality
Post-deployment
Occurring after the AI model has been trained and deployed