While I think the reporting here is a bit sensationalized, any company that allows user-facing generative AI, especially one as open as supporting LoRAs and arbitrary checkpoints, needs very good protection against synthetic CSAM like this. To the best of my knowledge, only the AI Horde has taken this sufficiently seriously so far.

  • db0@lemmy.dbzer0.com (OP, mod) · 11 months ago
    To be fair, the AI doesn’t have to see CSAM to be able to generate CSAM. It just has to understand the concept of a child and various lewd concepts, and it can then mix them together.

    • starbreaker@kbin.social · 11 months ago

      Which only further supports my opinion that no programmer should be considered employable until they’ve read Mary Shelley’s Frankenstein and understood that they are in Frankenstein’s position.