Creators Demand Tech Giants Fess Up, Pay For All That AI Training Data
The Register highlights concerns raised at a recent UK parliamentary committee hearing over AI companies' exploitation of copyrighted content without permission or payment. From the report:

The Culture, Media and Sport Committee and the Science, Innovation and Technology Committee asked composer Max Richter how he would know if "bad-faith actors" were using his material to train AI models. "There's really nothing I can do," he told MPs. "There are a couple of music AI models, and it's perfectly easy to make them generate a piece of music that sounds uncannily like me. That wouldn't be possible unless it had hoovered up my stuff without asking me and without paying for it. That's happening on a huge scale. It's obviously happened to basically every artist whose work is on the internet."

Richter, whose work has been used in a number of major film and television scores, said the consequences for creative musicians and composers would be dire. "You're going to get a vanilla-ization of music culture as automated material starts to edge out human creators, and you're also going to get an impoverishing of human creators," he said. "It's worth remembering that the music business in the UK is a real success story. It was £7.6 billion in income last year, with over 200,000 people employed. That is a big impact. If we allow the erosion of copyright, which is really how value is created in the music sector, then we're going to be in a position where there won't be artists in the future."

Speaking earlier, former Google staffer James Smith said much of the damage from text and data mining had likely already been done. "The original sin, if you like, has happened," said Smith, co-founder and chief executive of Human Native AI. "The question is, how do we move forward? I would like to see the government put more effort into supporting licensing as a viable alternative monetization model for the internet in the age of these new AI agents."
Matt Rogerson, director of global public policy and platform strategy at the Financial Times, said: "We can only deal with what we see in front of us and [that is] people taking our content, using it for the training, using it in substitutional ways. So from our perspective, we'll prosecute the same argument in every country where we operate, where we see our content being stolen." The risk, if the situation continued, was a hollowing out of the creative and information industries, he said.

[...] "The problem is we can't see who's stolen our content. We're just at this stage where these very large companies, which usually make margins of 90 percent, might have to take some smaller margin, and that's clearly going to be upsetting for their investors. But that doesn't mean they shouldn't. It's just a question of right and wrong and where we pitch this debate. Unfortunately, the government has pitched it in thinking that you can't reduce the margin of these big tech companies; otherwise, they won't build a datacenter."
Read more of this story at Slashdot.