As policy makers in the UK weigh how to regulate the AI industry, Nick Clegg, former UK deputy prime minister and former Meta executive, claimed a push for artist consent would “basically kill” the AI industry.

Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models. But he claimed it was not feasible to ask for consent before ingesting that work in the first place.

“I think the creative community wants to go a step further,” Clegg said according to The Times. “Quite a lot of voices say, ‘You can only train on my content, [if you] first ask’. And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data.”

“I just don’t know how you go around, asking everyone first. I just don’t see how that would work,” Clegg said. “And by the way if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight.”

  • @HelixDab2@lemm.ee

    Nick Clegg says asking artists for use permission would ‘kill’ the AI industry

    I fail to see any downside to this.

  • @tuhriel@infosec.pub

    If your business model only works when you ignore both moral norms and actual laws… it shouldn’t exist!

    Unfortunately, capitalism doesn’t work like that…

  • I’m starting to think we need to reframe this a little. Stop referring to “artists”. It’s not just lone, artistic types that are getting screwed here, it’s literally everyone who has content that’s been exposed to the Internet. Artists, programmers, scientists, lawyers, individuals, companies… everyone. Stop framing this as “AI companies versus artists” and start talking about it as “AI companies versus intellectual property right holders”, because that’s what this is. The AI companies are choosing to ignore IP law because it benefits them. If anyone, in any other context, tried to use this as a legal defense they would be laughed out of the courtroom.

  • Kichae

    I bet door-to-door salespeople would make way more money if they could just break into your homes, leave their junk on your table, and steal your credit card, and yet we don’t let them do that.

  • @IllNess@infosec.pub

    Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models.

    No, it should be the opposite. The creative community should have to opt in. AI can run on whatever pieces people choose to upload. Everything else is theft.

    But he claimed it wasn’t feasible to ask for consent before ingesting their work first.

    What the fuck…?! Send a fucking email. If you don’t get an answer, then it’s a “No”. Learn to take no for an answer.

    • @tuhriel@infosec.pub

      The big issue is that they don’t just fail to ask; they also actively ignore it when someone says “no” upfront, e.g. in a robots.txt.
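
      As a concrete illustration of the kind of upfront “no” the commenter means, here is a minimal robots.txt that asks known AI crawlers to stay away (GPTBot and CCBot are real published crawler user agents; the point being made is that honoring this file is entirely voluntary):

      ```
      # Ask AI training crawlers not to fetch anything from this site.
      # Compliance is voluntary -- a crawler can simply ignore this file.
      User-agent: GPTBot
      Disallow: /

      User-agent: CCBot
      Disallow: /
      ```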

  • Riskable

    From a copyright perspective, you don’t need to ask for permission to train an AI. It’s no different than taking a bunch of books you bought second-hand and throwing them into a blender. Since you’re not distributing anything when you do that you’re not violating anyone’s copyright.

    When the AI produces something though, that’s when it can run afoul of copyright. But only if it matches an existing copyrighted work close enough that a judge would say it’s a derivative work.

    You can’t copyright a style (writing, art, etc) but you can violate a copyright if you copy say, a mouse in the style of Mickey Mouse. So then the question—from a legal perspective—becomes: Do we treat AI like a Xerox copier or do we treat it like an artist?

    If we treat it like an artist the company that owns the AI will be responsible for copyright infringement whenever someone makes a derivative work by way of a prompt.

    If we treat it like a copier the person that wrote the prompt would be responsible (if they then distribute whatever was generated).

    • @BlameThePeacock@lemmy.ca

      A realistic take on the situation.

      I fully agree, despite how much people hate AI, training itself isn’t infringement based on how copyright laws are written.

      I think we need to treat it as the copier situation: the person who distributes the copyright-infringing material is at fault, not the tool used to create it.

    • @jjjalljs@ttrpg.network

      no different than taking a bunch of books you bought second-hand and throwing them into a blender.

      They didn’t buy the books. They took them without permission.

  • will

    Perhaps the government should collect money from the AI companies — they could call it something simple, like “taxes” — and distribute the money to anyone who has ever written something that made its way to the internet (since we can reasonably assume that everything posted online has now been sucked into the slop machines)

    • Avid Amoeba

      What a fucking shocking idea, right? My mind is blown, and I’m sure Mr. Clegg will be ecstatic when we tell him about it! /s

      Greedy dumb mfkers.

    • @takeda@lemm.ee

      I think the primary goal of LLMs is to use them on social media to influence public opinion.

      Notice that all companies that run social media are heavily invested in them. Also, the recent fiasco with Grok talking about South African apartheid without being asked shows that such functionality is being added.

      I think talking about it replacing white-collar jobs is a distraction. Maybe it can replace some, but the “daydreaming” (such a nice word for bullshit) makes the technology not very useful in that direction, I think.

  • @hperrin@lemmy.ca

    Oh no wouldn’t that be a shame. /s

    I’m sorry but if your industry requires that you commit a bunch of crimes to make money, it’s not a legitimate industry, it’s a criminal industry. We’ve had these for a long time, and generally they’re frowned upon, because the crimes are usually drugs, guns, murder, sex trafficking, or theft. When the crime is intellectual property theft, apparently we forget to care. Then again, same with wage theft.