• CorvidCawder@sh.itjust.works

      From the blog you quoted yourself:

      Despite improving AI energy efficiency, total energy consumption is likely to increase because of the massive increase in usage. A large portion of the increase in energy consumption between 2014 and 2023 is attributed to AI-related servers. Their usage grew from 2 TWh in 2017 to 40 TWh in 2023. This is a big driver behind the projected scenarios for total US energy consumption, ranging from 325 to 580 TWh (6.7% to 12% of total electricity consumption) in the US by 2028.

      (And likewise, the last graph of predictions for 2028)
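      As a quick internal-consistency check on the figures quoted above (a back-of-the-envelope sketch; every number below is taken from the quote itself, and the variable names are just mine):

      ```python
      # Back-of-the-envelope check of the quoted figures. All numbers come
      # from the quote above, not from the underlying report.

      ai_2017_twh = 2           # AI-related server usage, 2017
      ai_2023_twh = 40          # AI-related server usage, 2023
      proj_2028_twh = (325, 580)        # projected 2028 scenarios, TWh
      proj_2028_share = (0.067, 0.12)   # stated share of total US electricity

      # AI server usage grew roughly 20x between 2017 and 2023
      print(f"AI server growth 2017->2023: {ai_2023_twh / ai_2017_twh:.0f}x")

      # Total US electricity consumption implied by each 2028 scenario
      for twh, share in zip(proj_2028_twh, proj_2028_share):
          print(f"{twh} TWh at {share:.1%} implies ~{twh / share:,.0f} TWh total")
      ```

      Both scenarios imply roughly the same ~4,800-4,850 TWh of total US electricity, so the spread in that range comes from the data-center projections themselves rather than from different assumptions about overall demand.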

      From a quick read of that source, it is unclear to me whether it factors in the electricity cost of training the models; it seems to me that it doesn’t.

      I found more information here: https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/

      Racks of servers hum along for months, ingesting training data, crunching numbers, and performing computations. This is a time-consuming and expensive process—it’s estimated that training OpenAI’s GPT-4 took over $100 million and consumed 50 gigawatt-hours of energy, enough to power San Francisco for three days.
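      A quick sanity check on those quoted figures (a sketch: the 50 GWh and the three-day San Francisco comparison come from the quote, while the 90-day training duration is purely my assumption for illustration):

      ```python
      # Rough sanity check of the quoted GPT-4 training figures.

      training_energy_gwh = 50      # quoted training energy
      sf_days_powered = 3           # quoted "power San Francisco for three days"

      # Average electricity demand of San Francisco implied by that comparison
      implied_sf_demand_mw = training_energy_gwh * 1_000 / (sf_days_powered * 24)
      print(f"Implied average SF demand: ~{implied_sf_demand_mw:.0f} MW")

      # Spread over an assumed 90-day training run (assumption, not quoted),
      # the average draw of the training cluster would be:
      assumed_training_days = 90
      avg_draw_mw = training_energy_gwh * 1_000 / (assumed_training_days * 24)
      print(f"Average draw over {assumed_training_days} days: ~{avg_draw_mw:.0f} MW")
      ```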

      So, I’m not sure those numbers for 2023 paint the full picture. Adoption of AI-powered tools was also nowhere near as high in 2023 as it is nowadays, so I wouldn’t be surprised if AI’s share today were much higher than the 22.7% of total US server power usage reported for 2023.
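      For reference, here is roughly how that 22.7% figure appears to be derived (a sketch: the 40 TWh comes from the quote above, but the ~176 TWh total for US data centers in 2023 is my assumption and is not quoted here):

      ```python
      # Rough reconstruction of the 22.7% AI share for 2023.

      ai_servers_2023_twh = 40                   # from the quote above
      assumed_total_datacenter_2023_twh = 176    # assumption, not quoted above

      ai_share = ai_servers_2023_twh / assumed_total_datacenter_2023_twh
      print(f"AI share of 2023 US data-center electricity: {ai_share:.1%}")  # ~22.7%
      ```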