Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the cooling water stays inside the loop, except in exceptional circumstances.

Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.

How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
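
Rough numbers, to show the scale I’m confused about (a minimal back-of-the-envelope sketch, assuming the 1.9 L/kWh figure and a ~350 W gaming rig, both ballpark guesses on my part):

```python
# Back-of-the-envelope: what the data-center water rate would imply for
# one gaming session. The 1.9 L/kWh figure and the 350 W system draw are
# rough assumptions, not measured values.

water_per_kwh_l = 1.9  # reported data-center consumption, litres per kWh
rig_power_w = 350      # whole-system draw while gaming (assumed)
session_hours = 4

energy_kwh = rig_power_w / 1000 * session_hours   # 1.4 kWh
implied_water_l = energy_kwh * water_per_kwh_l    # ~2.7 litres

print(f"Energy used: {energy_kwh:.2f} kWh")
print(f"Water a data center would evaporate for that: {implied_water_l:.1f} L")
# My PC evaporates roughly none of that: the loop (air or AIO water) is
# closed, and the heat just gets dumped into room air.
```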

  • Fiery@lemmy.dbzer0.com · 5 days ago

    What I don’t get is how the water is “consumed”; it’s not like it’s gone, right? It evaporates and then just comes back down as rain, surely?

    Same with water consumption of a sweater or a steak.

    There probably is some good reason for measuring it like that but conceptually I don’t get it.

    • cRazi_man@europe.pub · 5 days ago

      Even though there is loads and loads of water on the planet, the amount of fresh/drinkable/usable/accessible water is tiny. This water evaporates and rains back down, but this will most likely fall over the ocean, or on land and go into the ground, or into some other unusable area/form.

      Water suitable for human use is a scarce commodity and needs to be preserved. Of the water lost to the atmosphere from server cooling systems, almost none can be recaptured.

          • Darkassassin07@lemmy.ca · 5 days ago

            Collect and condense the hot water vapor, concentrate the heat until you’ve got steam; then pump it through a steam turbine recapturing that energy as electricity.

            I’m sure there’s some difficulties and nuances I’m not seeing right away, but it would be nice to see some sort of system like this. Most power plants generate heat, then turn that into electricity. Data centers take electricity and turn it back into heat. There’s gotta be a way to combine the two concepts.

            • SaltSong@startrek.website · 4 days ago

              The difficulty, to put it in very simple terms, is that physics doesn’t allow that. The less simple explanation is a thermodynamics textbook, and trust me, you don’t want that.

              Everything generates heat. Everything. Everything. Anything that seems to generate “cold” is generating more heat somewhere else.

              • Darkassassin07@lemmy.ca · 4 days ago

                Yeah, thermodynamics is a thing. I’m not trying to claim some free-energy system that could power the whole data center; but if you could recapture some of the waste heat and convert it back into electricity, putting that energy to work instead of just venting it to the atmosphere, it could potentially offset some of the raw electrical needs. An efficiency improvement, that’s all.

                • SaltSong@startrek.website · 3 days ago

                  I’m an actual engineer with a degree and everything. Although this is not my area of expertise, it’s one I’m familiar with.

                  They could do something like you suggest, but every step becomes more expensive and less effective. The exhaust from a coal fired power plant is still plenty hot, and more energy could be extracted from it. But it requires more and more to make less and less.

                  The curse of every engineer is to see a way to turn every waste stream into a useful product, but not be able to do so profitably. (Which means no one will approve the project.)
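
                  To put a rough, idealized number on the “less and less”: even a perfect Carnot engine running off server waste heat at ~45 °C against a ~25 °C ambient could only ever turn a few percent of that heat back into work, and real low-temperature machinery gets nowhere near the Carnot limit. A minimal sketch, with both temperatures assumed:

                  ```python
                  # Idealized Carnot limit for recovering work from low-grade waste heat.
                  # Both temperatures are assumed/typical, not from any real facility.

                  T_hot = 45 + 273.15   # warm coolant leaving the servers, kelvin (assumed)
                  T_cold = 25 + 273.15  # ambient heat sink, kelvin (assumed)

                  carnot_efficiency = 1 - T_cold / T_hot  # ~0.063, i.e. about 6 %

                  waste_heat_kw = 1000  # 1 MW of IT load ends up as ~1 MW of heat
                  max_recovery_kw = waste_heat_kw * carnot_efficiency

                  print(f"Carnot limit: {carnot_efficiency:.1%}")                  # ~6.3%
                  print(f"Best case from 1 MW of heat: {max_recovery_kw:.0f} kW")  # ~63 kW
                  # A real low-temperature heat engine (e.g. an organic Rankine cycle
                  # unit) recovers only a fraction of even this, which is why the
                  # extra hardware rarely pays for itself.
                  ```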

                • dubyakay@lemmy.ca · 4 days ago

                  Yeah, take heat pumps for example, or even CPU water coolers: the heat is carried away from where it’s hot to somewhere it can be radiated off, until the heat-conducting material and its surroundings reach equilibrium.
                  You can bet your ass that these US data centers are just brute-forcing heat exchange via evaporation instead, to make the initial investment cheaper. It’s the equivalent of burning coal instead of going straight for the renewable but initially more costly option when it comes to energy production.
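
                  The latent heat of vaporization actually lands pretty close to the headline figure. A minimal sketch using textbook constants only; whether the reported ~1.9 L/kWh also counts cooling-tower blowdown and upstream power-plant water is my assumption:

                  ```python
                  # Why evaporative cooling "consumes" water: every kWh of heat rejected
                  # by evaporation boils off roughly 1.5 L. Textbook constants only.

                  latent_heat_mj_per_kg = 2.45  # heat of vaporization near room temp, MJ/kg
                  kwh_in_mj = 3.6               # 1 kWh = 3.6 MJ

                  litres_per_kwh = kwh_in_mj / latent_heat_mj_per_kg  # ~1.5 kg, i.e. ~1.5 L

                  print(f"Water evaporated per kWh of heat rejected: {litres_per_kwh:.1f} L")
                  # A closed loop (home PC, dry cooler, heat pump) moves the same heat
                  # into air without evaporating anything; it just needs bigger radiators
                  # and more fan power up front.
                  ```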

                • Goodeye8@piefed.social · 5 days ago

                  And what happens to the heat? Heat can’t just magically disappear, which means the water can’t cool unless the heat can dissipate somewhere. So it would have to dissipate into the dome. And what happens to the dome if you keep pumping hot vapor into it? It heats up. If it heats up, the water vapor stops cooling and the entire cooling system stops working.

                  I’m not saying it couldn’t work in theory, I’m saying it doesn’t work in practice, because the dome would have to be insanely big, maybe the size of a small nation.
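
                  Just to illustrate how fast it heats up: a toy energy balance for a made-up sealed dome (1 km across) over a made-up 100 MW data center, with no heat escaping through the shell:

                  ```python
                  # Toy energy balance: how fast the air in a sealed dome warms if a data
                  # center vents all its waste heat inside. Dome size and load are invented.
                  import math

                  dome_radius_m = 500   # 1 km diameter hemisphere (assumed)
                  heat_load_w = 100e6   # 100 MW data center (assumed)

                  air_volume_m3 = (2 / 3) * math.pi * dome_radius_m ** 3  # ~2.6e8 m^3
                  heat_capacity_j_per_k = air_volume_m3 * 1.2 * 1005      # rho*cp for air

                  warming_k_per_day = heat_load_w / heat_capacity_j_per_k * 86400
                  print(f"Air temperature rise: ~{warming_k_per_day:.0f} K per day")  # ~27
                  # Unless the dome shell itself sheds that heat to the outside, the
                  # "cold side" of the cooling loop is gone within a day or two.
                  ```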

      • Melvin_Ferd@lemmy.world · 5 days ago

        Don’t Hadley cells bring that moisture inland, where it condenses and rains, flows back towards the ocean, evaporates again, travels inland, rains, and goes back to the ocean?

        • 4am@lemmy.zip · 5 days ago

          I mean eventually yeah, but not fast enough for you to keep using it that way.

          Especially now that air holds more moisture since rising temperatures keep the atmosphere warmer and rain is less frequent.

    • Ziggurat@jlai.lu · 5 days ago

      This is the complicated part of water consumption: saving water in the Netherlands won’t make rain in Morocco.

      However, there is only so much rainwater stored in the ground at a given time and brought by the rivers. This water needs to go mostly to agriculture, then human consumption, and finally industry. Once it’s back in the clouds, we don’t fully know where it will fall again, let alone whether it will come back polluted.

      Sure, it’s a renewable resource; the problem starts when you use the water faster than the rate at which it renews, especially during summer. In Europe the problem will get even worse with global warming: the Alpine glaciers are disappearing, meaning we’ll lose a major water reserve for the summer.