AI programs consume large volumes of scarce water

UCR study finds that keeping servers powered & cool at cloud data processing centers has high water costs

Every time you run a ChatGPT artificial intelligence query, you use up a little bit of an increasingly scarce resource: fresh water. Run some 20 to 50 queries, and roughly half a liter (around 17 ounces) of fresh water from our overtaxed reservoirs is lost in the form of steam emissions.

Such are the findings of a University of California, Riverside, study that for the first time estimated the water footprint from running artificial intelligence, or AI, queries that rely on the cloud computations done in racks of servers in warehouse-sized data processing centers.
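As a rough sanity check on the headline figure of roughly half a liter per 20 to 50 queries, the sketch below simply spreads that half liter across the query range. The per-query values are implied by the article's numbers rather than reported by the study itself.

```python
# Back-of-the-envelope check: the article attributes roughly 0.5 L of
# evaporated fresh water to every 20-50 ChatGPT queries. Dividing the
# total by the query range gives the implied per-query footprint.

WATER_PER_BATCH_L = 0.5          # ~half a liter (about 17 fl oz)
QUERY_RANGE = (20, 50)           # number of queries in that batch

low = WATER_PER_BATCH_L / QUERY_RANGE[1]   # best case: 50 queries share it
high = WATER_PER_BATCH_L / QUERY_RANGE[0]  # worst case: only 20 queries

print(f"Implied water per query: {low * 1000:.0f}-{high * 1000:.0f} mL")
# -> Implied water per query: 10-25 mL
```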

Data processing centers consume water by using electricity from steam-generating power plants and by using on-site chillers to keep their servers cool. (Graphic by Evan Fields/UCR)

Google’s data centers in the U.S. alone consumed an estimated 12.7 billion liters of fresh water in 2021 to keep their servers cool, at a time when climate change is exacerbating droughts, Bourns College of Engineering researchers reported in the study, published online as a preprint on arXiv and awaiting peer review.

20 to 50 ChatGPT queries cost roughly a half liter of fresh water from overtaxed reservoirs — in the form of steam emissions.

Shaolei Ren, an associate professor of electrical and computer engineering and the corresponding author of the study, explained that data processing centers consume great volumes of water in two ways.

First, these […]
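Those two pathways, evaporation from on-site chillers and water consumed by the power plants supplying the electricity, suggest a simple additive estimate. The sketch below is a minimal illustration of that reasoning; the energy and water-intensity figures are placeholder assumptions, not values from the UCR study.

```python
# Hypothetical sketch: combine the two water pathways described above.
# All numeric values are illustrative placeholders, not study results.

ENERGY_PER_QUERY_KWH = 0.003     # assumed energy per AI query
ONSITE_WATER_L_PER_KWH = 1.8     # assumed on-site chiller/evaporation intensity
OFFSITE_WATER_L_PER_KWH = 3.1    # assumed power-plant water intensity

def water_per_query(energy_kwh: float,
                    onsite_l_per_kwh: float,
                    offsite_l_per_kwh: float) -> float:
    """Total fresh water consumed per query, in liters."""
    onsite = energy_kwh * onsite_l_per_kwh    # cooling the servers on site
    offsite = energy_kwh * offsite_l_per_kwh  # generating the electricity
    return onsite + offsite

total_l = water_per_query(ENERGY_PER_QUERY_KWH,
                          ONSITE_WATER_L_PER_KWH,
                          OFFSITE_WATER_L_PER_KWH)
print(f"Estimated water per query: {total_l * 1000:.1f} mL")
# With these placeholder inputs the result lands around 15 mL per query,
# within the 10-25 mL range implied by the article's headline figure.
```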

Full article: news.ucr.edu
