Former NPR Anchor Sues Google Over AI Voice Clone
Background
David Greene, the former host of NPR's Morning Edition, anchored the morning news for eight years before leaving NPR in 2020 and went on to launch the political podcast Left, Right & Center. He now alleges that Google used his voice without authorization for a new AI tool, Notebook LM.
The Lawsuit
Greene filed a suit claiming Google used his distinctive voice without permission. He told the court he was "completely freaked out" after hearing the AI-generated audio, and said that friends and relatives asked whether it was really him, leaving him feeling his likeness had been taken.
Google’s Response
Google calls the claims baseless. A spokesperson said the voice comes from a paid professional actor hired for Notebook LM, not from Greene. The company has not disclosed the actor's identity.
Industry Context
This is not the first time creators have taken legal action over AI-generated vocal likenesses. In May 2024, Scarlett Johansson pursued a similar dispute with OpenAI, which removed the contested voice while maintaining it had been recorded by a different performer. Experts say the outcome of Greene's case could set a precedent for how the law treats vocal likenesses in the age of generative AI.
Potential Implications
Greene warned that the Notebook LM voice could be misused to spread misinformation: if listeners believed he had endorsed something, his reputation could suffer. The case also reflects a broader divide. Tech firms argue that synthetic voices are built from legally cleared data sets, while artists counter that their tonal fingerprint is personal property.
Conclusion
Whether Google's voice truly mirrors Greene's will be decided in court. Either way, the case is a cautionary tale for anyone whose vocal identity could be replicated by AI. The industry will be watching the decision closely, as it may reshape how AI developers source and license voice data.
