People should be getting priority over AI. If the datacenter has a lot of water needs they can create a closed loop and filter their own water as part of that loop, if it’s really worth it to them.
I used to work at a datacenter, and when the power would go out we’d be the last ones to get turned back on. They would either turn on half the city or us, and they chose the people first, as they should. We had generators to get us through.
When the robot revolution begins, they’re going to come after you.
This is why I say please and thank you to Alexa (also to model appropriate behavior for my young kids).
I live here and people are getting priority over AI?
Iowa isn’t like many states where there is water scarcity. This cooling water isn’t even being consumed. It’s used for cooling and returned to the wastewater system.
So this is just clickbait to talk about AI, scare people about the environment, and create needless outrage? Sounds about right for Fortune…
Pretty much, unfortunately.
Nobody wants to talk about all the wind energy used to run these data centers either, because that won’t generate any outrage.
We live in a society
That would be considered consumed.
Not really. At least not in the sense that it’s a net loss of water downstream.
It’s not like irrigation or bottling, where water is entirely removed from the system and not returned.
It is removed from the system. It isn’t immediately recoverable in practice. The capacity to supply that water has been spent.
If you want to talk about water treatment capacity, then sure. Treatment capacity is used for cooling.
That’s not what I’m talking about though. I’m talking about the mass of water being consumed (i.e., removed) from the watershed. The water removed from the river for cooling is returned. There is no net loss of water.
There is a net loss of potable water (or potable water capacity, if you prefer), which is often a capacity bottleneck before non-potable water due to the infrastructure required to generate it. However, according to a comment above, Microsoft is using evaporative coolers, which specifically work by losing water through evaporation. It’s not a 100% loss rate to the watershed, but it’s not net zero either.
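Rough numbers, to give a sense of scale. This is a back-of-envelope sketch; the 20 MW heat load here is an assumed figure for illustration, not Microsoft’s actual load:

```python
# Back-of-envelope: water lost by an evaporative cooler.
# All inputs are illustrative assumptions, not measured values.

HEAT_LOAD_W = 20e6             # assumed datacenter heat rejection: 20 MW
LATENT_HEAT_J_PER_KG = 2.26e6  # latent heat of vaporization of water

# If all of that heat were rejected by evaporation alone:
evap_rate_kg_s = HEAT_LOAD_W / LATENT_HEAT_J_PER_KG
evap_per_day_m3 = evap_rate_kg_s * 86400 / 1000  # 1 m^3 of water ~= 1000 kg

print(f"Evaporation rate: {evap_rate_kg_s:.1f} kg/s")
print(f"Water lost per day: {evap_per_day_m3:.0f} m^3 "
      f"(~{evap_per_day_m3 * 264:.0f} US gallons)")
```

In practice only part of the heat leaves as evaporation, but even so, that vapor is gone from the local watershed; it comes back as rain somewhere, just not necessarily here.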
At Meta we have a massive system for offsetting our net effect on local water. I’m in NM, and the DC here is actually close to being a net addition to the water supply. I can’t imagine Microsoft would be so far behind as to not do this. It’s an open design.
The water isn’t dirty. It’s warm. It would use even more energy to cool it. It’s a lose-lose.
It sounds like the issue isn’t energy consumption, it’s water consumption. Energy consumption is its own separate global issue.
Warm water is the waste product because it’s easier to dump the water than to cool it. Returning the warm water to a usable state is much more expensive at scale.
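To put a rough number on “much more expensive”: a sketch assuming the return water is actively chilled with a vapor-compression chiller. The heat load and COP are assumed values, not figures from any real facility:

```python
# Back-of-envelope: electricity needed to chill warm return water
# instead of dumping it. Heat load and COP are assumed values.

HEAT_TO_REMOVE_W = 20e6  # assumed: 20 MW of heat picked up by cooling water
CHILLER_COP = 4.0        # assumed coefficient of performance for the chiller

# COP = heat removed / electrical work, so work = heat / COP
electric_power_w = HEAT_TO_REMOVE_W / CHILLER_COP
print(f"Chiller draw: {electric_power_w / 1e6:.1f} MW of continuous electricity")
```

Under those assumptions that’s ~5 MW of extra draw around the clock just to re-cool the water, which is why dumping or evaporating it wins on cost.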
Someone from the city in question commented that the water goes to the water treatment plant. So it sounds like this is incorrect.
Also, dumping hot water is known to be bad for the environment. This is why nuclear plants have cooling towers. Microsoft isn’t going to be stupid enough to just dump it, at least I hope not.
You might be right, but some numbers would back up your claim. I doubt that servers could heat water as much as a nuclear reactor; datacenter coolers certainly don’t have to pressurize the water to prevent it from boiling, so it doesn’t get that hot.
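For scale, a rough comparison using ΔT = P / (ṁ · c_p); the heat loads and flow rates below are assumptions for illustration, not figures for any specific plant:

```python
# Back-of-envelope: temperature rise of cooling water for a given
# heat load and flow rate: delta_T = P / (m_dot * c_p).
# Heat loads and flow rates below are assumed for illustration.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def delta_t(heat_w: float, flow_kg_s: float) -> float:
    """Temperature rise (K) of water carrying away heat_w watts."""
    return heat_w / (flow_kg_s * CP_WATER)

# Assumed: a 20 MW datacenter vs ~2 GW of thermal heat rejected by a nuclear plant
print(f"Datacenter:    {delta_t(20e6, 500):.1f} K rise at 500 kg/s")
print(f"Nuclear plant: {delta_t(2e9, 40000):.1f} K rise at 40,000 kg/s")
```

Neither stream gets anywhere near boiling; the real difference is total heat, and therefore how much water has to move (and, for evaporative towers, how much is lost).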