The vast data centers that power artificial intelligence are so energy hungry that they’re heating up their surroundings, according to new research. It’s an alarming finding given the number of data centers is predicted to explode over the next few years.

  • Appoxo@lemmy.dbzer0.com · 16 hours ago

    But in a radius of 6 mi? That sounds a bit high.
    Sounds more like a city with lots of concrete storing the heat.

    • AHemlocksLie@lemmy.zip · 15 hours ago (edited)

      Large data centers can consume over 100 MW of power, and almost ALL the energy a computer consumes is turned into heat, well over 90%. A home heating unit pulls a little under 1 kW, about the same as an AC unit, so 100 MW is equivalent to heating over 100,000 homes at once, except those homes will eventually get warm and stop running the heat. The data center churns all day, every day. Given that, it may be closer to the heat output of 250,000 homes. Data centers produce an ABSURD amount of heat.

      Edit: and keep in mind, that’s HOMES, not people. The average US household has 2.5 people, so that’s heating for over 600,000 people.
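
The rough math in the comment above can be sketched in a few lines. All inputs are the commenter's round approximations; the 40% heater duty cycle is a hypothetical guess chosen to reproduce the 250,000-home figure, not a sourced number.

```python
# Back-of-envelope math from the comment above. All inputs are rough
# forum approximations; duty_cycle is a hypothetical assumption.
datacenter_mw = 100        # large data center draw, MW
home_heater_kw = 1.0       # rough draw of one home heating unit, kW
people_per_home = 2.5      # average US household size
duty_cycle = 0.4           # assumed fraction of time a home heater runs

# Homes heated if every heater ran continuously
homes_continuous = datacenter_mw * 1000 / home_heater_kw
# Homes matched once heaters cycle off, unlike the always-on data center
equivalent_homes = homes_continuous / duty_cycle
people_equivalent = equivalent_homes * people_per_home
print(round(homes_continuous), round(equivalent_homes), round(people_equivalent))
```

With these assumptions the continuous-heating equivalent is 100,000 homes, rising to about 250,000 homes (roughly 625,000 people) once home heaters' off-time is factored in.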

      • filcuk@lemmy.zip · 12 hours ago

        Sorry to nitpick, but doesn’t 100% of it end up as heat? Vibration, light, sound, radio waves: they’re each a tiny fraction of the power, and they’re all eventually absorbed by the environment as heat too.
        That was my understanding, at least.

          • AHemlocksLie@lemmy.zip · 10 hours ago (edited)

            No, it all turns to heat inside the data center. The problem with that is computers hate all that heat, so they pipe it away and dump it outside to the best of their ability. The data center may not be 6 miles wide, but then the wind starts blowing the heat around. Hell, even on a perfectly still day, heat would radiate out. They’re making enough heat to keep every single home in a city of 500,000+ people comfortable in winter, so it’s either that or the data center turns into the world’s largest oven.

        • AHemlocksLie@lemmy.zip · 10 hours ago

          My understanding is that some tiny portion, like 1–2%, is actually used in a meaningful way to do the calculations you want, but that could be incorrect. Or it may be that that tiny portion still inevitably turns to heat, just indirectly somehow. I’m not sure, though; you could be right.

          • black0ut@pawb.social · 9 hours ago

            All of the energy that does calculations gets turned into heat. The only energy that doesn’t get directly turned into heat is the mechanical energy produced by the fans (which ends up turning into heat), and the electromagnetic radiation (which also ends up turning into heat).

            If the calculations didn’t convert energy into heat, a computer would essentially use no power. You can think of a computer like a really complex wire. The power consumption you see is actually the heat loss of that wire. The less heat you lose, the more efficient the wire is.
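
For a sense of how little of the power the computation itself "needs" in theory, one can compare a chip's draw against Landauer's limit (k·T·ln 2 per bit erased). The chip figures below are illustrative assumptions, not measurements from the thread:

```python
import math

k_B = 1.380649e-23                    # Boltzmann constant, J/K
T = 300.0                             # room temperature, K
landauer_j = k_B * T * math.log(2)    # min energy to erase one bit, ~2.9e-21 J

chip_power_w = 300.0          # hypothetical server chip draw, W
ops_per_s = 1e12              # hypothetical operations per second
bit_erasures_per_op = 1000    # hypothetical bit erasures per operation

# Theoretical minimum power for the logic itself
theoretical_w = ops_per_s * bit_erasures_per_op * landauer_j
fraction = theoretical_w / chip_power_w
print(f"{theoretical_w:.2e} W, fraction {fraction:.1e}")
```

Even with generous assumptions, the theoretical minimum comes out to microwatts against hundreds of watts of draw, which is why treating 100% of the power as heat is a safe approximation.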

          • ayyy@sh.itjust.works · 8 hours ago

            How did you go from 10% to 1-2%? Please don’t use such precise figures when the source is clearly your ass.

            • AHemlocksLie@lemmy.zip · 5 hours ago

              I said over 90% because I couldn’t remember the correct figure. I wanted to be as accurate as I could be with full confidence. If you think something I said was inaccurate, feel free to correct me, but so far, it looks like I was right but could have been more precise if I’d wanted to spend even more time fact checking.

              • ayyy@sh.itjust.works · 4 hours ago

                No, you are wrong. Energy is always conserved; computers turn everything that goes into them into heat.

                • AHemlocksLie@lemmy.zip · 4 hours ago

                  I would think there’d be some very minor bleed at a minimum. Like, the fan churns the air, and that definitely turns a lot of its energy into heat, but some of that energy is spent on actual movement, not simply heating air particles. But without more precise figures for that, “well over 90%”, or whatever my exact wording was, is true and precise enough to make my point. I could have looked up a more precise figure, but it wouldn’t have significantly changed the very rough math, which was only meant to approximate the truth well enough to illustrate the point.

        • Bane_Killgrind@lemmy.dbzer0.com · 4 hours ago

          Read those numbers again. A quarter million homes’ worth of heat, so like 100 homes stacked into the same square footage? Or at least 10; I’m not checking my math.

        • AHemlocksLie@lemmy.zip · 4 hours ago

          Yeah, a much smaller heat source will produce a much smaller heat bubble. 100 MW is an amount of power that’s difficult to comprehend. A home in the US consumes an average of ~11 MWh in an entire year. Every single hour that a 100 MW data center operates, it consumes enough energy for a little over 9 homes to run ALL YEAR. Every single day, enough energy for almost 220 homes to run for a YEAR. The heat output of a data center is orders of magnitude higher than a parking lot’s.
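
The hourly and daily figures above follow from dividing the data center's energy use by per-home annual consumption, using the commenter's round numbers:

```python
# Energy math from the comment above, using the commenter's round numbers.
datacenter_mw = 100       # continuous draw, MW (so 100 MWh consumed per hour)
home_mwh_per_year = 11    # assumed average US home usage, MWh/year

home_years_per_hour = datacenter_mw / home_mwh_per_year   # ~9.1 home-years
home_years_per_day = home_years_per_hour * 24             # ~218 home-years
print(round(home_years_per_hour, 1), round(home_years_per_day))
```

So each hour of operation covers roughly 9 homes' annual usage, and each day roughly 218 homes' worth.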

        • Soup@lemmy.world · 6 hours ago

          A large parking lot takes in heat from the sun and then releases it when the surrounding air becomes cooler. It heats the air, the air rises, new air comes in. A data centre produces heat all on its own, all the time, without ever stopping. It’s the difference between putting a cast-iron pan in the sun and straight-up lighting a fire and keeping it burning day and night.