A new law in Tennessee aimed at protecting artists from AI-powered voice mimicry has won widespread acclaim from the music industry, but some legal experts are worried such laws might be an “overreaction” that could have unintended consequences.
Less than a year after a fake Drake song created using new artificial intelligence tools took the music world by storm, Tennessee lawmakers enacted first-in-the-nation legislation last month aimed at preventing exactly that scenario — the use of a person’s voice without their permission. The ELVIS Act (Ensuring Likeness, Voice and Image Security) does that by expanding the state’s protections against the unauthorized use of a person’s likeness, known as publicity rights.
The passage of the new law was hailed across the music business. Mitch Glazier of the Recording Industry Association of America called it an “incredible result.” Harvey Mason Jr. of the Recording Academy described it as a “groundbreaking achievement.” David Israelite of the National Music Publishers’ Association called it “an important step forward.” Any musical artist who has had their voice used without permission likely shares those sentiments.
But legal experts are more divided. Jennifer Rothman, a law professor at the University of Pennsylvania and one of the country’s top experts on publicity rights, rang alarm bells last week at a panel discussion in Nashville, warning that Tennessee’s new statute had not been necessary and had been “rushed” into law.
“We don’t want a momentary overreaction to lead to the passage of laws that would make things worse, which is currently what is happening,” Rothman told her fellow panel members and the audience. “The ELVIS Act has a number of significant concerns that are raised, particularly with the broad sweep of liability and restrictions on speech.”
In an effort to combat AI voice cloning, the ELVIS Act makes a number of key changes to the law. Most directly, it expands the state’s existing publicity rights protections to explicitly include someone’s voice as part of their likeness. But the statute also broadens those protections in ways that have received less attention, including a wider definition of who can be sued and for what.
According to Joseph Fishman, a law professor at Vanderbilt University who has been closely tracking the legislation, that broader wording “sweeps in innocuous behavior that no one seriously thinks is a problem that needs solving” — potentially including tribute bands, interpolations, or even just sharing a photo that a celebrity didn’t authorize.
“The range of acts that trigger liability is vast,” Fishman tells Billboard. “All the press around this law is focused on deepfakes and digital replicas — and those would indeed be covered — but the law as written goes so much further.”
Here’s why: Historically, publicity rights in the U.S. have been mostly limited to commercial contexts — like advertisements that use a celebrity’s likeness to make it appear they’re endorsing a product. The singer Bette Midler once famously sued the Ford Motor Co. over a series of commercials featuring vocals by a Midler impersonator.
The new law effectively gets rid of that commercial limitation; under the ELVIS Act, anyone who knowingly “makes available” someone’s likeness without authorization can face a lawsuit. It also broadly defines protected voices as any sound that’s “readily identifiable and attributable to a particular individual.”
Those are great changes if you’re a musical artist trying to sue over a song that’s using a fake version of your voice, since the old conception of publicity rights likely wouldn’t apply to that scenario. But Fishman says they have serious potential for collateral damage beyond their intended target.
“There’s nothing that would limit it to AI outputs, nothing that would limit it to deceptive uses,” Fishman said. “The lead singer in an Elvis tribute band who sings convincingly like The King certainly seems to fall under the definition. So do Elvis impersonators.”
In an “even more extreme” hypothetical, Fishman imagined an “unflattering” photo of Elvis that he knew the Presley estate didn’t like. “The law seems to say I’d be liable if I sent that photo to a friend. After all, I’m transmitting his likeness, knowing that the rightsholder hasn’t authorized the use. Stop and think about that for a moment.”
The ELVIS Act does contain exemptions aimed at protecting free speech, including those that allow for the legal use of someone’s likeness in news coverage, criticism, scholarship, parody and other “fair use” contexts. It also expressly allows for “audiovisual works” that contain “a representation of the individual as the individual’s self” — a provision likely aimed at allowing Hollywood to keep making biopics and other films about real people without getting sued in Tennessee.
But confusingly, the law says those exemptions only apply “to the extent such use is protected by the First Amendment.” That wording, according to Rothman, means those exemptions essentially “don’t exist” unless and until a court rules that a specific alleged activity is a form of protected free speech, a costly extra step that will mostly benefit those who want to be in court. “This specific law creates great work for lawyers,” Rothman said. “So much work for lawyers.”
Those lawyers are going to be filing real lawsuits against real people — some of whom are the scary, voice-cloning bad actors that the music industry wants to crack down on, but also some of whom are likely just regular people doing things that used to be legal.
“The law could absolutely lead to lots of lawsuits,” Fishman says. “There’s plenty of room here for people to test how far the statute can go, whether because they object to how they’re being depicted or because they see an opportunity for an extra licensing stream.”
Though it applies only in Tennessee, the importance of the ELVIS Act is magnified because it is the first of likely many such legislative efforts aimed at addressing AI mimicry. At least five other states are currently considering amending their publicity rights laws to address the growing problem, and lawmakers on Capitol Hill are also weighing federal legislation that would create a national likeness statute for the first time.
At last week’s roundtable, Rothman said those efforts were misguided. She said that laws already on the books — including federal trademark law, existing publicity rights laws, and numerous other statutes and torts — already provide avenues to stop voice cloning and deepfakes. And she warned that the proposed federal bills posed even more serious problems, like allowing someone to sign away their likeness rights in perpetuity.
For other legal experts critical of the ELVIS Act, including Harvard University law professor Rebecca Tushnet, the hope is that any subsequent legislation, whether at the state or federal level, can be more directly tailored to the actual AI-fueled deceptions they’re supposed to address.
“Any new laws need to be far more targeted at specific harms,” says Tushnet, who has written extensively about the intersection of intellectual property and free speech. “Right now, this statute and other proposals are dramatically overbroad, and threaten legitimate creative conduct.”