While I think this is a bit sensationalized, any company that offers generative AI to users, especially one as open as allowing arbitrary LoRAs and checkpoints, needs very good protection against synthetic CSAM like this (a rough sketch of what such a safeguard could look like is below). To the best of my knowledge, only the AI Horde has taken this sufficiently seriously so far.
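To be concrete, here is a minimal sketch of the kind of automated screening meant here. This is not the AI Horde’s actual filter; it just assumes a zero-shot CLIP check run on each generated image before delivery, and the model name, label set, score combination, threshold and the `looks_unsafe` helper are all illustrative choices, not anyone’s production setup.

```python
# Hypothetical moderation hook (NOT the AI Horde's actual code): score a generated
# image with zero-shot CLIP and refuse to return it if it looks both explicit and
# like it depicts a minor. Model name, labels and threshold are illustrative only.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

MODEL_ID = "openai/clip-vit-base-patch32"  # any CLIP checkpoint would do
model = CLIPModel.from_pretrained(MODEL_ID)
processor = CLIPProcessor.from_pretrained(MODEL_ID)

LABELS = [
    "a photo of a child",          # index 0
    "a photo of an adult",         # index 1
    "sexually explicit content",   # index 2
    "safe, non-explicit content",  # index 3
]

def looks_unsafe(image: Image.Image, threshold: float = 0.5) -> bool:
    """True if the image scores high on both the 'child' and 'explicit' labels."""
    inputs = processor(text=LABELS, images=image, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(**inputs).logits_per_image  # shape (1, len(LABELS))
    probs = logits.softmax(dim=-1)[0]
    child_score = (probs[0] / (probs[0] + probs[1])).item()     # child vs. adult
    explicit_score = (probs[2] / (probs[2] + probs[3])).item()  # explicit vs. safe
    return child_score > threshold and explicit_score > threshold

# A worker would call looks_unsafe() on each output and drop or report flagged
# images instead of delivering them to the requester.
```

A single zero-shot check like this has plenty of false positives and negatives, so a real service would presumably combine it with prompt filtering, tuned thresholds and human review.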

  • HubertManne@kbin.social · 11 months ago

    So far I’m the only commenter who is fine with this. The problem with CSAM, to me, is children being molested. If it’s art, or stories, or basically made up and not reality, then I’m pretty much fine with anything. I may not want to consume it myself, but I don’t see a problem with it.

    • x4740N@lemmy.world · 11 months ago

      I agree with you

      With fictional content there is no child involved, and there certainly isn’t anything living involved.

      I just wish governments in countries that have made this class of fictional images illegal would go after real child molesters instead of the makers and consumers of fictional images where no living being is involved.

      People who consume and make fictional content won’t harm anyone.

    • ReallyActuallyFrankenstein@lemmynsfw.com · 11 months ago

      I’d agree, with the caveat that a model that might have been trained on actual CSAM is a problem. Anything that is an actual product of, or cannot exist without, child abuse should be absolutely prohibited and avoided.

      I’m not sure whether such a model is out there, but now that I imagine it, I assume it’s inevitable there will be one. Apart from being disturbing to think about, that introduces another problem: once that happens, how will anyone know the model was trained on such images?

      • HubertManne@kbin.social · 11 months ago

        Oh, I can totally agree with the training part, but that can’t be fought at the AI level; it needs to be stopped at the CSAM level.

  • ∟⊔⊤∦∣≶@lemmy.nz · 11 months ago

    I don’t think this is solvable, unless general images of children are excluded from the training data, which is probably a good idea.

    • db0@lemmy.dbzer0.com (OP, mod) · 11 months ago

      Pretty much what SDXL did to “fix” this. They excluded all lewd images from training.

  • starbreaker@kbin.social · 11 months ago

    I suspect that Marc Andreessen is fine with AI-generated kiddie porn. He probably tells himself that no children were actually harmed. Never mind that the AI had to get its training data somewhere…

    • db0@lemmy.dbzer0.com (OP, mod) · 11 months ago

      To be fair, the AI doesn’t have to see CSAM to be able to generate CSAM. It just has to understand the concept of a child and the various lewd concepts, and it can then mix them together.

      • starbreaker@kbin.social · 11 months ago

        Which only further supports my opinion that no programmer should be considered employable until they’ve read Mary Shelley’s Frankenstein and understood that they are in Frankenstein’s position.

    • CJOtheReal@ani.social · 11 months ago

      The AI is able to merge images; you can definitely merge child pics with “normal” porn. So technically you can make “CSAM” without having actual CSAM in the training data. The first versions of some of the image AIs were able to do that, and there was definitely no CSAM in the training data.

      It’s not good to have that around, but it’s definitely better than actual CSAM.