AI is eating the planet. Enormous electrical and cooling demands.
Last edited Tue Sep 24, 2024, 10:39 AM
https://futurism.com/the-byte/environment-openai-chatgpt
The Environmental Toll of a Single ChatGPT Query Is Absolutely Wild
Say one out of every ten working Americans used ChatGPT just once a week to write an email. By Ren's estimate, over a one-year period ChatGPT would guzzle 435 million liters of water and burn 121,517 megawatt-hours of electricity. That's as much water as every household in Rhode Island drinks in a day and a half, and enough electricity to light every household in Washington, DC for 20 days.
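For a rough sanity check, the article's aggregate figures can be reduced to per-email costs. This is only a sketch: the workforce size (roughly 160 million working Americans) is an assumption I'm supplying for illustration, while the water and energy totals come straight from the article's citation of Ren's estimate.

```python
# Back-of-envelope: per-email water and energy cost implied by the
# article's totals. Workforce size is an ASSUMPTION, not from the article.
US_WORKFORCE = 160_000_000          # assumed, approximate 2024 figure
users = US_WORKFORCE // 10          # one in ten working Americans
emails_per_year = users * 52        # one email per user per week

total_water_liters = 435_000_000    # from the article (Ren's estimate)
total_energy_mwh = 121_517          # from the article (Ren's estimate)

liters_per_email = total_water_liters / emails_per_year
kwh_per_email = total_energy_mwh * 1_000 / emails_per_year

print(f"~{liters_per_email:.2f} L of water and ~{kwh_per_email:.2f} kWh per email")
```

Under that assumed workforce, the totals work out to roughly half a liter of water and about 0.15 kWh per email, which is what makes the aggregate numbers feel so large: the per-query cost is small, but it is multiplied by hundreds of millions of queries.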
And that's just today's usage. With big tech so confident in AI's explosive potential that Microsoft is looking to bring an entire nuclear plant back online to power its AI data centers, those figures could come to look laughably low.
Thirst Traps
ChatGPT consumes so much water because AI data centers generate enormous heat when running calculations, and cooling those servers requires a tremendous amount of water. Where electricity is cheap or water is scarce, data centers instead burn electricity running air conditioners to cool their servers.
That can be a burden on infrastructure. Places like Arizona and Iowa are already feeling the tension between serving the needs of the public and feeding the insatiable water and power appetite of AI data centers, even as those centers bring tax revenue and jobs to their locales.
Edited to focus only on the problem, not the problems with solutions to the problem.
Quite simply, the gigantic waste of energy spent replacing human intelligence with machinery is entirely avoidable. Meanwhile, the grids are buckling under the load of air conditioning, and people are dying from the heat.