I didn’t downvote you. (Just gave you an upvote, though.) You’re reasonable and polite, so a downvote would be very inappropriate. Sorry for that.
Music has ongoing problems with copyright litigation, most recently with Ed Sheeran. From what I have read, the blame falls on juries without the necessary musical background. As far as I know, higher courts usually overturn these verdicts, as happened with Sheeran. Hip hop was neutered, in a blow to (African-)American culture. While it was obviously wrong not to find fair use in that case, samples are, after all, copies.
It’s not so bad outside of music. You can write books on “how to write a bestseller” or “how to draw comics” without needing permission. Of course, you would study many novels and images to get material. The purpose of books is that we learn from them. That we then use what we learned to make our own thing is intended (at least in the US).
What you’re proposing there would be a major change to copyright law, and probably a disastrous one. Even if one could limit the immediate effect to new technologies, it would severely limit authors in adopting these technologies.
I’m arguing that AI and a human are doing different things when they ‘learn’. A human actually learns. At the end of the day, AI isn’t doing anything near human intelligence; it isn’t critically thinking about information and applying it to create new ideas. Instead, it directly copies, based on what it thinks is most likely to come next.
Therefore a human is actually creating new material, whereas AI can only rehash old material. It’s the same problem as training AI on AI-generated content: any faults get worse and worse over time, because nothing ‘new’ is created. At least with current AI tech.
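To make the “most likely to come next” point concrete, here is a toy sketch of greedy next-token decoding. The bigram table is hypothetical and nothing like a real LLM’s learned weights; it only illustrates the mechanism being argued about: the model scores candidate next tokens and emits the highest-scoring one.

```python
# Toy sketch (NOT a real language model): greedy next-token decoding.
# Hypothetical bigram counts, as if "learned" from a tiny corpus.
bigram_counts = {
    "the": {"cat": 3, "dog": 1},
    "cat": {"sat": 2, "ran": 1},
    "sat": {"down": 4},
}

def next_token(prev: str) -> str:
    """Pick the most likely next token after `prev` (greedy decoding)."""
    candidates = bigram_counts[prev]
    return max(candidates, key=candidates.get)

def generate(start: str, length: int) -> list:
    """Greedily extend a sequence token by token from the stored statistics."""
    out = [start]
    for _ in range(length):
        out.append(next_token(out[-1]))
    return out

print(generate("the", 3))  # ['the', 'cat', 'sat', 'down']
```

Whether this mechanism counts as “learning” or “copying” is exactly the disagreement in this thread; the sketch only shows what the mechanism does, not how to interpret it.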
Well, that is a philosophical or religious argument. It’s somewhat reminiscent of the claim that evolution can’t add information. That can’t be the basis for law.
In any case, it doesn’t matter to copyright law as it stands that you see it that way. The AI is the equivalent of that book on how to write bestsellers from my earlier reply. People extract information from copyrighted works to create new works, without needing permission. A closer example is programmers, who consult copyrighted references while they create.
Except that it’s objectively different.
A closer example would be a programmer copying somebody else’s code line for line but switching the order of some things around and calling it their own creation.
AI can neither think nor add to a work. It cannot extract information in order to answer a question. It is spitting out an exact copy of what was ingested, because that is the scenario the system decided was “correct”.
If AI could parse information and actually create new intellectual property like a human, I’d find it reasonable, but as it stands it’s just spitting out previous work.
Well, that’s simply not true.
You can say that without explaining, but you just look like an idiot.
It’s the same reason GPT-4 will write you working ransomware without ever noticing that it’s writing ransomware. The AI doesn’t understand what’s going on. It just does what it does for a virtual cookie, based on a calculated score.
Ok, where did GPT-4 copy the ransomware code? You can’t reshuffle lines of code much before the program breaks. Should be easy to find.
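A minimal illustration of the “you can’t reshuffle lines much before the program breaks” point: the two hypothetical functions below contain the same two statements, only in swapped order, and they compute different results. Reordering someone’s code is rarely a behavior-preserving disguise.

```python
# Same two statements, different order, different behavior.
def scaled_then_shifted(x: int) -> int:
    x = x * 2   # scale first
    x = x + 3   # then shift
    return x    # computes (x * 2) + 3

def shifted_then_scaled(x: int) -> int:
    x = x + 3   # shift first
    x = x * 2   # then scale
    return x    # computes (x + 3) * 2

print(scaled_then_shifted(5))  # 13
print(shifted_then_scaled(5))  # 16
```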