Data centers are draining resources in water-stressed communities

Eric Olson, Anne Grau, and Taylor Tipton write:

The rapid growth of the technology industry and the increasing reliance on cloud computing and artificial intelligence have led to a boom in the construction of data centers across the United States. Electric vehicles, wind and solar energy, and the smart grid are particularly reliant on data centers to optimize energy utilization. These facilities house thousands of servers that require constant cooling to prevent overheating and ensure optimal performance.

Unfortunately, many data centers rely on water-intensive cooling systems that consume millions of gallons of potable (drinking) water annually. A single data center can consume up to 5 million gallons of drinking water per day, enough to supply thousands of households or farms; at a typical U.S. household's roughly 300 gallons per day, that works out to more than 16,000 homes.

The increasing use and training of AI models have further exacerbated the water consumption challenges that data centers face.

Machine learning, particularly deep learning, requires significant computational power, which generates substantial heat. As a result, data centers housing machine learning servers need even more cooling to maintain performance and prevent overheating. Graphics processing units (GPUs), which are commonly used to accelerate machine learning workloads, are known for their high energy consumption and heat output.

As the demand for machine learning applications grows across various industries, the need for data centers equipped to handle these workloads will continue to rise, putting additional pressure on local water resources. According to a report by McKinsey & Company, data center power demand in the United States is expected to grow from 17 gigawatts (GW) in 2022 to 35 GW by 2030, more than doubling. [Continue reading…]
