Anthropic has reached a settlement with a group of music publishers, agreeing to stop showing users lyrics based on their copyrighted songs. The artificial intelligence company was sued in 2023 by Universal Music Group, Concord Music Group and others after it emerged that its Claude chatbot would repeat the lyrics to songs like Beyoncé’s “Halo” when prompted.
The entertainment industry is famously litigious and fights hard to defend its copyrights: just look at historic cases, from the destruction of Napster to Viacom’s years-long legal battle against YouTube. More recently, the popular lyric annotation site Rap Genius (now known as Genius) was sued by the National Music Publishers Association for reproducing copyrighted song lyrics.
The music publishers who sued Anthropic acknowledged that other websites such as music annotation platform Genius distribute song lyrics online, but noted that Genius eventually began paying licensing fees to publish them on its website.
In this latest lawsuit, the music publishers claimed that Anthropic scraped song lyrics from the web and intentionally removed watermarks that song lyric websites place to help identify where copyrighted material was published. After Genius began licensing song lyrics from music publishers, it cleverly inserted a distinctive pattern of apostrophes into the lyrics so that, if the material was copied improperly, Genius would know that text it had explicitly paid for had been stolen and could demand its removal.
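As an illustration only, here is a minimal Python sketch of how that kind of apostrophe-based watermark could work in principle. The function names, the bit pattern, and the choice of straight versus curly apostrophes are all assumptions made for the example; this is not Genius’s actual scheme or anything described in the lawsuit.

```python
# Toy illustration of a lyric watermark: encode hidden bits by choosing
# between a straight apostrophe (') and a curly one (’) at each apostrophe
# position. The pattern is invisible to casual readers but survives
# copy-and-paste, so a verbatim copy can be recognized later.

STRAIGHT = "'"    # encodes bit 0
CURLY = "\u2019"  # encodes bit 1 (’)

def embed_watermark(text: str, bits: str) -> str:
    """Rewrite each apostrophe in `text` as straight or curly according to
    the next bit in `bits`, cycling if the text has more apostrophes."""
    out, i = [], 0
    for ch in text:
        if ch in (STRAIGHT, CURLY):
            out.append(CURLY if bits[i % len(bits)] == "1" else STRAIGHT)
            i += 1
        else:
            out.append(ch)
    return "".join(out)

def extract_watermark(text: str) -> str:
    """Read the bit sequence back out of the apostrophes in `text`."""
    return "".join("1" if ch == CURLY else "0"
                   for ch in text if ch in (STRAIGHT, CURLY))

if __name__ == "__main__":
    lyrics = "it's written all over your face, you're the only one that I want"
    marked = embed_watermark(lyrics, "1011")
    # The marked copy carries one bit per apostrophe; this line has two
    # apostrophes, so the first two bits of the pattern come back out.
    print(extract_watermark(marked))  # -> "10"
```

In practice a publisher would look for a statistically unlikely match with its own known pattern rather than an exact bit string, but the principle is the same: a mark that costs nothing to embed and reveals verbatim copying.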
Anthropic did not concede these claims, but as part of the settlement it agreed to maintain stronger guardrails that prevent its AI models from infringing on copyrighted material. It also agreed to work with the music publishers in good faith when those guardrails turn out not to be working.
Anthropic defended the practice of using song lyrics and other copyrighted material to train artificial intelligence models, telling The Hollywood Reporter: “Our decision to enter into this stipulation is consistent with those priorities. We continue to look forward to showing that, consistent with existing copyright law, using potentially copyrighted material in the training of generative AI models is a quintessential fair use.” That argument has been central to AI companies’ defense of the copyrighted material that shows up in their models. Proponents claim that remixing copyrighted content from websites such as The New York Times constitutes fair use so long as it is substantially transformed into derivative works.
News and music publishers disagree, and the lawsuit against Anthropic is far from over. The music publishers are still seeking an injunction preventing Anthropic from training future models on any copyrighted music lyrics at all.
The concern about misuse stems from the possibility that Anthropic’s models could be used to generate music, causing musicians to lose control of their craft. That is not an unfounded worry: it has been widely speculated that OpenAI imitated Scarlett Johansson’s voice after she refused to lend her voice to its AI voice model.
There has always been tension between Hollywood and Silicon Valley because tech companies like OpenAI and Google make their money from platforms and network effects, not from selling copyrighted material. To them, art is just “content” meant to serve the larger purpose of generating engagement and selling advertising. The AI slop that fills Facebook today reflects how technology companies view everything as interchangeable.
Publishers such as The New York Times have fought high-profile court battles against the likes of OpenAI to stop them from scraping copyrighted material. OpenAI has responded by licensing material from some companies, and another AI company, Perplexity, has begun testing a revenue-sharing model. But publishers want more control; they do not want to be forced into fragile deals that can expire at any time and that continue to drive readers away from their websites. That means this is far from the end of the story when it comes to disputes over copyrighted material in large language models.