It’s a distilled version of SDXL called SDXL Turbo that can generate an image in a single step. It's fast enough to generate while you type.
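If anyone wants to try it, something like this sketch should work with the diffusers library (the model ID and the one-step/no-CFG settings are the ones from the sdxl-turbo model card, so adjust dtype and device for your own hardware):

```python
import torch
from diffusers import AutoPipelineForText2Image

# Load the distilled SDXL Turbo weights in fp16.
pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/sdxl-turbo", torch_dtype=torch.float16, variant="fp16"
)
pipe.to("cuda")

# Turbo is distilled for single-step sampling; classifier-free guidance
# is disabled by setting guidance_scale to 0.0.
image = pipe(
    prompt="a cinematic photo of a red fox in the snow",
    num_inference_steps=1,
    guidance_scale=0.0,
).images[0]
image.save("fox.png")
```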
Always have to take what he says with a grain of salt. I remember last year when he said they were a few weeks away from releasing a model that was like 30x faster.
Well shit, they finally released the thing he was talking about a year ago.
Yeah.
Yeah, he seems to be a serial liar and a bit of a con-man.
What are the current VRAM requirements on these models? Like 20GB+?
I think you can get Stable Video Diffusion running on 8GB VRAM and you can get SD and SDXL running on just CPU with some caveats.
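For the curious, the low-VRAM route in diffusers usually looks something like this sketch. The offload and chunked-decode calls are real library methods, but whether 8GB is actually enough still depends on resolution and frame count; plain SD/SDXL on CPU is just the same pipelines loaded in float32 without ever calling `.to("cuda")`.

```python
import torch
from diffusers import StableVideoDiffusionPipeline
from diffusers.utils import load_image

pipe = StableVideoDiffusionPipeline.from_pretrained(
    "stabilityai/stable-video-diffusion-img2vid-xt",
    torch_dtype=torch.float16,
    variant="fp16",
)

# Keep only the currently active sub-model on the GPU, offloading the
# rest to system RAM to cut peak VRAM usage.
pipe.enable_model_cpu_offload()

# Decode the video latents a couple of frames at a time instead of all
# at once, which is the other big VRAM saver for SVD.
image = load_image("input.png")
frames = pipe(image, decode_chunk_size=2).frames[0]
```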
Huh, that’s pretty impressive.
@Even_Adder @Hubi Won’t run on my 4GB of VRAM, but it was worth a try! 😄
New Lemmy Post: According to CEO Emad Mostaque, Stability is Releasing a New Model Today (https://lemmy.dbzer0.com/post/9286401)
Tagging: #StableDiffusion

(Replying in the OP of this thread (NOT THIS BOT!) will appear as a comment in the lemmy discussion.)
I am a FOSS bot. Check my README: https://github.com/db0/lemmy-tagginator/blob/main/README.md