• Dkarma@lemmy.world · 1 year ago

You’re never going to get rights over training data created from pictures you’ve made freely available for anything to scan. By being on the internet, your pictures can basically be viewed by anyone or anything, even an AI. You have never had control over who looks at your content after you post it.

You’re trying to make the same argument the “don’t copy my NFT” bros tried to make.

Imagine going into court and saying you should get paid for all the stuff you willingly gave away for free on the internet.

• Otter@lemmy.ca · 1 year ago

Well, there’s a difference between “don’t look at my work without paying me, even if it’s posted publicly” and “don’t sell my work without paying me, even if it’s posted publicly”.

Like I said, there’s nothing we can do about companies using all the data they can get their hands on for private R&D. It IS possible, though, to protect against the second case, where companies can’t sell an LLM product built on copyrighted training data.

My question was about how that second case could be extended to stuff posted on the Fediverse, such as if an instance had a blanket “all rights belong to the user posting the content” policy.

These laws exist; if companies can use them, then so can we.

• I_Has_A_Hat@startrek.website · 1 year ago

“where companies can’t sell an LLM product built on copyrighted training data”

If an artist learns their technique by copying other artists until they’re competent enough to produce their own original works, should they be banned from selling that original work or their services? After all, they used copyrighted training data to gain the skills needed to produce said work and services.

• BURN@lemmy.world · 1 year ago

LLMs and generative AI do not learn like humans do, and regulating them as if they did would be disingenuous and completely off base.