- AI-generated covers of popular songs are going viral, with fans recently praising a cover of Beyoncé’s “Cuff It” featuring Rihanna’s vocals.
- The phenomenon could open up a new legal nightmare for the music industry.
- “There are all sorts of grounds for infringement proceedings,” Alexander Ross, a music and copyright lawyer, told Insider.
An AI-generated cover of Beyoncé’s latest hit “Cuff It” featuring Rihanna’s vocals is going viral, but lawyers say that such covers of popular songs could open up new legal issues for the music industry.
On Thursday, a Twitter account by the name of “Rihanna Facts” shared a snippet of the cover, which it claimed was generated by the artificial intelligence chatbot ChatGPT.
The clip has been viewed over 850,000 times and prompted plenty of reaction, including from “Nope” actress Keke Palmer.
“That Rihanna AI is eating all y’all songs up,” she tweeted.
—Rihanna Facts (@Nevernyny) April 13, 2023
The Rihanna “Cuff It” cover isn’t the only AI cover to have gone viral recently.
An AI-generated version of Ye, formerly known as Kanye West, covering the Plain White T’s’ “Hey There Delilah” has also amassed over 500,000 views on YouTube, while an AI Ye cover of Drake’s “Passionfruit” has been viewed over 300,000 times.
While some of the covers might be nice to listen to, Alexander Ross, a veteran music and copyright lawyer with 25 years of experience and a partner at the UK law firm Wiggin, told Insider that their makers could be breaking the law.
Ross said the first issue lies in whether the person using tools such as ChatGPT to create the cover is making clear that the voice being used is AI-generated.
“If you’re creating a recording with the intention of misleading people into thinking it’s the real thing — that it is Rihanna, for example — then that’s called a passing-off claim,” he said. “You’re passing that off as the original.”
“On the other hand, if it’s very clear that you are doing it as an AI exercise and nobody is misled into thinking it could be Rihanna or whoever, then there is no passing-off claim,” he added. “The creator is then back to the rather simple matter of basic copyright law.”
Speaking of copyright law, creators of AI covers could also face legal action should they not obtain the correct permissions to use the underlying recording, said Ross.
In the case of the AI Rihanna cover of “Cuff It,” for example, the creator would need permission from Beyoncé to use the track and would need to pay her royalties for doing so.
“If they have pinched the instrumental, or part of, from the original Beyoncé recording, that’s copyright infringement in a number of ways,” Ross explained. “You’ve stolen part of the recording and you’ve distributed it, communicated it with the public. There are all sorts of grounds for infringement proceedings there.”
Ross added that AI cover creators could avoid this issue if they recreate the backing track themselves, so long as they abide by the basic traditional rules of cover recordings, which include notifying the original artist of the cover, obtaining a mechanical license, and paying royalties.
“If you’d actually made a completely new recording of the music, with the AI voice, making clear it’s AI, then that’s a pure cover,” he said.
It’s not just AI covers that are creating a legal conundrum for the music industry.
A number of AI companies on the market offer technology that generates new music: users upload pre-existing songs, and the system digests the lyrics and music to create songs or melodies in those styles.
In 2021, for example, a mental health organization used Google’s AI product Magenta to produce a song called “Drowned in the Sun” by inputting data from dozens of original Nirvana recordings.
The result was a new, computer-generated, Nirvana-sounding song. The lead singer of a Nirvana tribute band provided the vocals.
Magenta has also been used to create AI songs that sound like those by Amy Winehouse, Jimi Hendrix, and Jim Morrison.
As a result of such technologies, Universal Music Group, the world’s largest music company, has asked major streaming services like Spotify and Apple Music to block AI companies from using their music to “train” their technologies.
“We have a moral and commercial responsibility to our artists to work to prevent the unauthorized use of their music and to stop platforms from ingesting content that violates the rights of artists and other creators,” a UMG spokesperson told the Financial Times. “We expect our platform partners will want to prevent their services from being used in ways that harm artists.”
Music lawyer Elliot Chalmers, the founder of Independent Music Law Advice, told Insider that, unlike with direct covers made using AI, it is harder to take legal action against those who create music that merely sounds like another artist’s work, no matter how similar it may be.
“The technology can be used to basically recreate a song that everyone will know, that sounds like another song but legally isn’t,” said Chalmers. “It isn’t using the same structure. It will be run through a musicologist. They can cover themselves legally through that.”
“It’s not necessarily a new phenomenon, though,” he added. “Because ultimately, it doesn’t matter who’s doing it — it’s just a new way of doing it, and obviously a quicker way that doesn’t involve humans having to come up with stuff.”