Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the cooling water stays in the loop, except in exceptional circumstances.
Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.
How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
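Back-of-the-envelope in Python, using the 1.9 L/kWh figure from above plus an assumed 400 W system draw and a 4-hour session ("hundreds of watts for several hours" needs concrete numbers, so those two are made up):

```python
# Rough math on the water a gaming session "should" use at data-center rates.
# The 1.9 L/kWh is the quoted figure; power draw and session length are assumptions.
WATER_PER_KWH_L = 1.9   # liters of water per kWh, as quoted
SYSTEM_POWER_W = 400    # assumed whole-system draw (GPU plus the rest)
SESSION_HOURS = 4       # assumed session length

energy_kwh = SYSTEM_POWER_W / 1000 * SESSION_HOURS   # 1.6 kWh
water_l = energy_kwh * WATER_PER_KWH_L               # ~3.0 liters

print(f"Energy used: {energy_kwh:.1f} kWh")
print(f"Water at data-center rates: {water_l:.1f} L")
```

So at data-center rates my PC "owes" about three liters per session, and yet the loop stays full.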
Collect the hot water vapor, concentrate the heat until you've got steam, then run it through a steam turbine, recapturing that energy as electricity.
I’m sure there are difficulties and nuances I’m not seeing right away, but it would be nice to see some sort of system like this. Most power plants generate heat and then turn that into electricity. Data centers take electricity and turn it back into heat. There’s gotta be a way to combine the two concepts.
The difficulty, to put it in very simple terms, is that physics doesn’t allow that. The less simple explanation is a thermodynamics textbook, and trust me, you don’t want that.
Everything generates heat. Everything. Anything that seems to generate “cold” is generating more heat somewhere else.
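To put a number on “physics doesn’t allow that”: even a perfect heat engine is capped by the Carnot efficiency, 1 − T_cold/T_hot (temperatures in kelvin). A minimal sketch, assuming coolant around 60 °C and ambient around 25 °C (both assumed, not measured):

```python
# Carnot limit on converting low-grade waste heat back into electricity.
# Both temperatures are illustrative assumptions.
T_HOT_K = 60 + 273.15    # assumed waste-heat temperature (liquid-cooling loop)
T_COLD_K = 25 + 273.15   # assumed ambient temperature

carnot_max = 1 - T_COLD_K / T_HOT_K
print(f"Carnot limit: {carnot_max:.1%}")   # ~10.5%, and that is the theoretical ceiling
```

Real machinery (an organic Rankine cycle, say) captures only a fraction of that ceiling, and data center waste heat is about as low-grade as heat gets.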
Yeah, thermodynamics are a thing. I’m not claiming some free-energy system that powers the whole data center; but if you could recapture some of the waste heat and convert it back into electricity, putting that energy to work instead of just venting it to the atmosphere, it could help offset some of the raw electrical needs. An efficiency improvement, that’s all.
I’m an actual engineer with a degree and everything; this isn’t my area of expertise, but it’s one I’m familiar with.
They could do something like you suggest, but every step becomes more expensive and less effective. The exhaust from a coal-fired power plant is still plenty hot, and more energy could be extracted from it. But it requires more and more to make less and less.
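A toy sweep makes the diminishing returns concrete (the source temperatures are purely illustrative):

```python
# Each successive recovery stage works with a cooler source, so the Carnot
# ceiling on what it can extract keeps shrinking. Temperatures are illustrative.
T_COLD_K = 25 + 273.15   # assumed ambient sink

for t_hot_c in (500, 150, 60, 40):
    t_hot_k = t_hot_c + 273.15
    ceiling = 1 - T_COLD_K / t_hot_k
    print(f"source at {t_hot_c:>3} C -> Carnot ceiling {ceiling:.1%}")
# 500 C -> ~61%, 150 C -> ~30%, 60 C -> ~10%, 40 C -> ~5%
```

Each extra stage needs its own hardware, but the ceiling on what it can recover keeps collapsing.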
The curse of every engineer is seeing a way to turn every waste stream into a useful product, but not being able to do so profitably. (Which means no one will approve the project.)
Yeah, take heat pumps for example, or even CPU water coolers: the heat is carried away from where it’s hot to somewhere it can be radiated off, until the heat-conducting material reaches equilibrium with its surroundings.
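For scale, moving heat in a closed loop takes surprisingly little flow. A minimal sketch using m_dot = Q / (c_p · ΔT), assuming a 1 kW load and a 10 K coolant rise (both numbers assumed):

```python
# Coolant mass flow needed to carry a heat load: m_dot = Q / (c_p * dT).
# The load and temperature rise are assumptions for illustration.
Q_W = 1000          # assumed heat load, watts
CP_WATER = 4186     # specific heat of water, J/(kg*K)
DT_K = 10           # assumed temperature rise across the hot side, kelvin

m_dot = Q_W / (CP_WATER * DT_K)   # kg/s
print(f"Flow needed: {m_dot*1000:.0f} g/s (~{m_dot*60:.2f} L/min of water)")
# ~24 g/s, about 1.4 L/min
```

A small pump handles that indefinitely, and none of the water leaves the loop.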
You can bet your ass that these US data centers are just brute-forcing heat exchange via evaporation to make the initial investment cheaper. It’s the equivalent of burning coal instead of going straight for the renewable but initially more costly option when it comes to energy production.
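The physics backs this up: evaporating water soaks up its latent heat of vaporization, about 2.26 MJ/kg, and 1 kWh is 3.6 MJ, so rejecting 1 kWh of heat by evaporation alone takes roughly 1.6 L. Add cooling-tower overhead (blowdown, drift) and you land right around the ~1.9 L/kWh from the top of the thread:

```python
# Why evaporative cooling "drinks" water: each kg evaporated carries away
# its latent heat of vaporization (1 kg of water is about 1 L).
LATENT_HEAT_J_PER_KG = 2.26e6   # latent heat of vaporization of water
KWH_J = 3.6e6                   # joules per kWh

water_per_kwh = KWH_J / LATENT_HEAT_J_PER_KG
print(f"Minimum evaporation: {water_per_kwh:.2f} L per kWh of heat rejected")
# ~1.59 L/kWh before blowdown/drift losses, consistent with the quoted ~1.9 L/kWh
```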