Could Tyla’s AI Version Be in Hot Water?

The rise of Artificial Intelligence (AI) Artistry

A fake AI-generated remix of South African artist Tyla’s hit song “Water” is circulating on social media.

This new freedom to create has opened a legal can of worms over where AI productions stand, as regulations for AI-generated music have not yet been implemented.

Last week, Tyla officially announced the release of her only remix of the song, featuring rapper Travis Scott. Since then, however, confusion has arisen as a remix featuring what appears to be Beyoncé’s vocals has gone viral. Most of Beyoncé’s online fanbase will instantly know it is not their queen.

Alongside the likeness of Beyoncé’s voice, what sounds like Drake’s vocals also features on the fake remix, yet it carries no disclosure that the original artists are not actually involved. None of the artists or their representatives have commented.

This AI vocal version of “Water” was created by Player1505, the music editor and producer responsible for the production.

This raises many questions: what is the way forward for music copyright and AI? What does the future hold for music made with Artificial Intelligence?

In a Semafor article titled ‘AI-generated music is going viral. But is it legal?’, Marc Ostrow, a New York-based entertainment and copyright lawyer, said the issue of AI and music is defined by a number of open questions that will likely be resolved “by some combination of litigation and/or legislation.” Here’s what to watch for.

Do AI music tools infringe on copyright?

The Recording Industry Association of America thinks so. It said last year that AI platforms that train their algorithms on existing songs infringe on the rights of artists who wrote and recorded them.

However, Ostrow said that the makers of AI tools could argue that this constitutes fair use. A landmark 2015 ruling on the legality of Google Books found that a copyrighted work can be used without violating the law if it is “transformed” enough.

What about these sound-alike songs?

Since AI can so easily replicate the voices of popular artists, we could soon see a proliferation of tracks that purport to be from real artists, like the fake Drake and The Weeknd song.

Ostrow said artists could have legal standing to sue the creators of these songs based on their “right of publicity,” which protects the likeness of celebrities. In the late 1980s, for example, Bette Midler successfully sued Ford Motor Co. after the company used a singer who sounded like her in an ad.

The same standard could apply here, Ostrow said, if the AI tracks are being used for commercial purposes.

“If I’m releasing a record that sounds like Drake, but it’s not Drake, that’s clearly for commercial purposes,” he said.

But the laws on this vary from state to state, so “if you’re a celebrity, make sure you’re a resident and you die in a state that protects publicity rights after death,” he said.

What will streaming services do?

As more and more AI-generated music is uploaded online, streaming platforms like Spotify, Apple Music, and YouTube could have some legal protection.

Federal law states that the online platforms can’t be held liable for copyright infringement just because an illegal work is uploaded to their site, but they have to take it down if they get a request to do so from the copyright holder.

Still, the streaming services could get caught in the AI crossfire. Universal Music Group, a giant in the industry, recently urged Spotify and Apple Music to block AI platforms from scraping melodies and lyrics from their artists’ songs, the Financial Times reported.

Contribution by Carmen Santiago
