• Glitterkoe@lemmy.world · 6 hours ago

    And then you have a trained model that requires vast amounts of energy per request, right? It doesn’t stop at training.

    You need obscene amounts of GPU power to run the ‘better’ models within reasonable response times.
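
    To put rough numbers on that (the node size and wattages below are my own ballpark assumptions, not figures from this thread):

    ```python
    # Back-of-envelope power draw for one typical multi-GPU inference node
    # (assumed configuration; real deployments vary).
    gpus = 8                # common datacenter node size (assumption)
    watts_per_gpu = 700     # roughly the TDP class of a current datacenter GPU
    host_overhead_w = 2000  # CPUs, RAM, fans, networking (rough guess)

    node_kw = (gpus * watts_per_gpu + host_overhead_w) / 1000
    print(f"~{node_kw:.1f} kW for a single inference node")  # ~7.6 kW
    # A gaming PC under full load is more like 0.3-0.6 kW.
    ```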

    In comparison, I can game just fine on my modest rig, but I can’t run a 22B-parameter model locally in any useful capacity while programming.
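
    For a sense of why (the 22B figure is the model size above; the per-parameter byte sizes are standard, the rest is a rough sketch):

    ```python
    # Rough VRAM needed just to hold the weights of a 22B-parameter model,
    # ignoring KV cache and activation overhead.
    params = 22e9

    for precision, bytes_per_param in [("fp16", 2), ("8-bit", 1), ("4-bit", 0.5)]:
        gib = params * bytes_per_param / 2**30
        print(f"{precision}: ~{gib:.0f} GiB for weights alone")

    # fp16:  ~41 GiB
    # 8-bit: ~20 GiB
    # 4-bit: ~10 GiB -- already most of the VRAM on a typical gaming card
    ```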

    Sure, you could argue gaming is a waste of energy, but that doesn’t mean we can’t also argue that asking an AI how long to boil a single egg shouldn’t cost the energy of boiling a shitload of them. Or each time I start typing a line of code, for that matter.