A hot potato: Record labels are just as hawkish as Nintendo when defending their intellectual property. However, where Nintendo generally pursues copyright violators, music groups often attack websites and content providers that are not breaking the law. Universal Music Group's latest offensive demands that streaming platforms police their customers to prevent generative AI developers from using their services to train LLMs.
Universal Music Group wants music streaming platforms like Apple Music and Spotify to block AI services from scraping melodies and lyrics. The group, representing about one-third of the record industry, says AI companies like OpenAI are training their algorithms on its artists' intellectual property without authorization or compensation.
According to emails sent to various streaming services sometime in March and recently obtained by the Financial Times, UMG is threatening legal action if streaming platforms do not take steps to prevent AI purveyors from stealing copyrighted material.
"We have become aware that certain AI systems might have been trained on copyrighted content without obtaining the required consents from, or paying compensation to, the rightsholders who own or produce the content," the alleged email reads. "We will not hesitate to take steps to protect our rights and those of our artists."
Large language models like ChatGPT are not limited to parroting conversational dialog. Some developers are training them to interpret music and mimic various artists' lyrics, vocal styles, and compositions. Google's MusicLM is reportedly trained on 280,000 hours of music. This model supposedly allows for prompts like: "Write lyrics in the style of Iron Maiden, but sung in the style of Disturbed." However, the model has not produced anything close to original music. Instead, it regurgitates plagiarized versions of its training material.
me: "write poetic and abstract song lyrics with no inherent meaning in the style of bob dylan"

chatGPT: *plagiarizes bob dylan's most famous song word for word* @OpenAI pic.twitter.com/mrxWOH0gRc

– r y a n . r o b b y (@ryanrobby) January 11, 2023
Universal says that AI "ingesting" its artists' works in this way violates copyright law.
"We have a moral and commercial responsibility to our artists to work to prevent the unauthorized use of their music and to stop platforms from ingesting content that violates the rights of artists and other creators," a UMG spokesperson said. "We expect our platform partners will want to prevent their services from being used in ways that harm artists."
It's unclear at this point how much bite the email warnings have or whether platforms like Spotify even have a responsibility to prevent AI scraping. It comes down to a question: how far do we go in holding content providers responsible for how their customers use their products? If a print newspaper's customer sets a house on fire using a bundle of newspapers as the kindling, is the publishing company responsible for the arson?
Apple Music, Spotify, and others pay royalties for the right to stream Universal's music. Meanwhile, AI developers like Google, OpenAI, and Microsoft use that content for LLM training. If that is the action that breaks copyright law, why is UMG threatening the providers instead of the violators? It's not as if these developers are anonymous users pirating music. They are easy-to-find, high-profile firms with a lot of money.
Spotify has declined to comment on the situation. Apple has not responded to requests for comment, and AppleInsider expects it to remain silent.