Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the water stays in the computer, except in exceptional circumstances.

Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.

How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
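A quick sanity check on that 1.9 L/kWh figure: evaporative cooling towers reject heat by boiling water off, and the latent heat of vaporization alone gets you most of the way there. A gaming PC, by contrast, dumps its heat into room air without consuming anything. A minimal sketch, using textbook round numbers (~2.26 MJ/kg latent heat), not measured data-center values:

```python
# Back-of-envelope: how much water must evaporate to carry away 1 kWh of heat?
# Values are textbook approximations, not measurements from any real facility.
KWH_IN_J = 3.6e6      # 1 kWh expressed in joules
LATENT_HEAT = 2.26e6  # J absorbed per kg of water evaporated, near room temp

litres_per_kwh = KWH_IN_J / LATENT_HEAT  # 1 kg of water is roughly 1 L

print(f"{litres_per_kwh:.2f} L evaporated per kWh of heat rejected")
# -> ~1.59 L, in the same ballpark as the quoted 1.9 L/kWh
```

The real figure runs a bit higher because cooling towers also lose water to drift and blowdown, but the latent-heat math shows the headline number is plausible rather than mysterious.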

  • Darkassassin07@lemmy.ca · 3 days ago

    Yeah, thermodynamics are a thing. I’m not claiming some free-energy system that could power the whole data center; but if you could re-capture some of the waste heat and convert it back into electricity, putting that energy to work instead of just venting it to the atmosphere, it could potentially help offset some of the raw electrical needs. An efficiency improvement, that’s all.

    • SaltSong@startrek.website · 2 days ago

      I’m an actual engineer with a degree and everything. Although this is not my area of expertise, it’s one I’m familiar with.

      They could do something like you suggest, but every step becomes more expensive and less effective. The exhaust from a coal-fired power plant is still plenty hot, and more energy could be extracted from it. But it takes more and more to get less and less.
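      The "more and more for less and less" has a hard physical ceiling: the Carnot limit. Server exhaust is low-grade heat, so even a perfect engine could only reclaim a small slice of it. A rough sketch with illustrative guessed temperatures (not plant measurements):

```python
# Carnot limit: the maximum fraction of heat convertible to work between
# a hot source and a cold sink. Temperatures here are illustrative guesses.
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Theoretical ceiling on heat-to-work conversion (temperatures in kelvin)."""
    return 1.0 - t_cold_k / t_hot_k

server_exhaust = carnot_efficiency(330.0, 295.0)  # ~57 C exhaust vs ~22 C ambient
boiler_steam = carnot_efficiency(800.0, 295.0)    # ~530 C steam, for contrast

print(f"server exhaust ceiling: {server_exhaust:.0%}")  # ~11%
print(f"boiler steam ceiling:   {boiler_steam:.0%}")    # ~63%
```

      And that 11% is the theoretical best case before any real-world losses, which is why the recovery hardware rarely pays for itself.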

      The curse of every engineer is to see a way to turn every waste stream into a useful product, but not being able to do so profitably. (Which means no one will approve the project.)

    • dubyakay@lemmy.ca · 3 days ago

      Yeah, take heat pumps for example, or even CPU water coolers: the heat is carried away from where it’s hot to somewhere it can be radiated off, until the coolant and its surroundings reach equilibrium.
      You can bet your ass that these US data centers are just brute-forcing heat exchange via evaporation instead, to make the initial investment cheaper. It’s the equivalent of burning coal instead of going straight for the renewable but initially more costly option in energy production.
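      The brute-force appeal is easy to see in the numbers: evaporating water soaks up vastly more heat per kilogram than warming air does, so a cooling tower can be far smaller and cheaper than the dry heat exchangers a closed loop would need. A rough comparison using textbook property values (the 10 K air temperature rise is an assumed figure):

```python
# Compare heat rejected per kg of coolant "spent": evaporating water vs.
# warming exhaust air. Property values are textbook approximations.
AIR_CP = 1005.0        # J/(kg*K), specific heat of air
DELTA_T = 10.0         # K, assumed temperature rise of the exhaust air
WATER_LATENT = 2.26e6  # J/kg, heat absorbed by evaporating water

heat_per_kg_air = AIR_CP * DELTA_T  # ~10 kJ per kg of air moved
air_equivalent = WATER_LATENT / heat_per_kg_air

print(f"evaporating 1 kg of water rejects as much heat as "
      f"warming {air_equivalent:.0f} kg of air by {DELTA_T:.0f} K")
# -> roughly 225 kg of air per kg of water
```

      Moving and heat-exchanging hundreds of kilograms of air per liter of water saved means big fans, big radiators, and big capital costs, which is exactly the trade-off being dodged.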