More than 200 artists – including Nicki Minaj, Jonas Brothers, Chuck D, Sam Smith, Katy Perry and Zayn Malik – have signed an open letter that calls on technology companies to “cease the use of artificial intelligence to infringe upon and devalue the rights of human artists”. 

Although insisting that “when used responsibly, AI has enormous potential to advance human creativity”, the letter urges “AI developers, technology companies, platforms and digital music services” to “pledge that they will not develop or deploy AI music-generation technology, content or tools that undermine or replace the human artistry of songwriters and artists or deny us fair compensation for our work”. 

It’s by no means the first such demand from the music industry, which has been clear about what it sees as the obligations of companies that are developing or employing generative AI. 

The music industry's position is that in order to legally train a generative AI model on existing music, a company must first get permission. That includes training that uses sound recordings, songs or any other assets containing an artist's voice or likeness. Which means deals need to be negotiated with the relevant rightsholders, who may include the artists themselves. However, whatever the music industry thinks, this remains a legal grey area. 

This new letter specifically expresses concern about digital platforms that are not only using unlicensed music to train AI models, but which are then using the outputs of those models to create “sounds” and “images” that are “directly aimed at replacing the work of human artists” and which “substantially dilute the royalty pools that are paid out to artists”.

Although the letter doesn't make specific allegations against any particular digital platforms, it does seem to reflect concerns that some user-generated content platforms may be developing their own AI tools that generate audio and music creators can use in their videos instead of commercially released music. Which means those platforms become less reliant on tracks that need to be licensed from the music industry. 

When Universal Music pulled all of its music from TikTok earlier this year, it stressed that concerns around AI were part of its dispute with the social media firm, alongside the core disagreement over how much money the major should be paid when its music is used in TikTok videos. 

In a subsequent letter to its songwriters, Universal stated, “While refusing to respond to our concerns about AI depriving songwriters from fair compensation, or provide assurances that they will not train their AI models on your songs, recent media reports reveal ‘TikTok … leaders have long wanted to move the app beyond music’ [and] ‘TikTok has an incentive to push the use of these AI recordings rather than the copyrighted and licensed recordings’. Every indication is that they simply do not value your music”. 

The new open letter, which was organised by the Artist Rights Alliance in the US, goes on to say, “Unchecked, AI will set in motion a race to the bottom that will degrade the value of our work and prevent us from being fairly compensated for it. This assault on human creativity must be stopped. We must protect against the predatory use of AI to steal professional artists’ voices and likenesses, violate creators’ rights, and destroy the music ecosystem”.

Artists and songwriters are generally allied with record labels and music publishers when it comes to the obligations of AI companies, insisting that those companies must get consent before utilising existing music, and that they should be fully transparent about what music they have used in any training processes. 

Although, in the UK, the Council Of Music Makers has also called on any labels and publishers looking to negotiate licensing deals with AI companies to make similar commitments: to secure the consent of artists and songwriters before any music is used in AI training, and to be fully transparent about what music is being used in the AI domain and on what terms.