While I think this is a bit sensationalized, any company that allows user-directed generative AI, especially one as open as permitting LoRAs and arbitrary checkpoints, needs very good protection against synthetic CSAM like this. To the best of my knowledge, only the AI Horde has taken this sufficiently seriously until now.
To be fair, the AI doesn’t have to see CSAM to be able to generate CSAM. It just has to understand the concept of child and the various lewd concepts, and it can then mix them together.
Which only further supports my opinion that no programmer should be considered employable until they've read and understood Mary Shelley's Frankenstein, and recognized that they are in Frankenstein's position.