AI Data Centers’ Water Thirst Raises Global Scarcity Alarms


When AI supercenters rise, their water demand can drain communities dry.

While artificial intelligence (AI) reshapes the way America does business, the race to build data centers to meet its expanding computational demands has kicked off a construction boom.

Millions of gallons of water are needed for cooling these new data centers, a demand that has risen in lockstep with the expansion of AI support facilities.

The volume of water needed to sustain the data center building bonanza has triggered concerns about water supplies and groundwater safety in the arid and water-stressed cities where many of the complexes are being built.

Sergio Toro, CEO of market intelligence group Aterio, shared research with The Epoch Times that shows there are 1,827 active data centers in the United States, with another 1,726 announced and 419 currently under construction.

Hundreds of the new centers are being planned or built in areas suffering from water scarcity or prolonged drought, prompting alarm from those working in sustainable urban development and environmentalism.

Based on the findings Toro shared, 1,082 data centers are being planned or built across 10 states that are experiencing some degree of water stress.

In states grappling with acute water stress, such as Nevada, Arizona, Texas, Utah, California, and Colorado, 437 data centers are planned or are currently under construction.

The amount of water used in data centers depends on the facility type, which generally falls under one of two categories.

Hyperscale data centers are large facilities used by cloud service providers and internet companies, demanding huge amounts of electricity and sometimes spanning millions of square feet.

Non-hyperscale data centers—also known as co-location data centers—are facilities where equipment, space, and bandwidth are rented to either wholesale or retail customers.

On average, non-hyperscale facilities use roughly 6.57 million gallons of water per year. By comparison, hyperscale centers—the kind required to power AI—use an estimated 200 million gallons per year.

However, it’s not just the volume of water that’s causing concern but also the risk of contamination from cooling system additives leaching into groundwater, Steve Rosas, president and project director at Omega Environmental Services, told The Epoch Times.

“We’ve remediated sites where industrial cooling operations contaminated soil and groundwater with biocides, corrosion inhibitors, and scale preventers, which are chemicals that persist in the environment long after facilities close,” he said.

Quenching the Thirst

Rosas said that data centers need to undergo comprehensive environmental impact assessments before construction rather than rely on "reactive remediation," which he said can cost millions of dollars and take decades to complete.

He said that contamination from per- and polyfluoroalkyl substances, or PFAS—better known as "forever chemicals" because they persist in the environment—is at the top of his list of concerns with data center expansion.

“Many cooling system components and fire suppression foams contain PFAS chemicals that we’re now investigating at more than 600 California water sites,” Rosas said. “These ‘forever chemicals’ bioaccumulate and cause hormonal effects even at extremely low concentrations, making remediation extremely expensive and technically challenging.”

Rosas said regulatory compliance often gets overlooked until it’s too late.

Because of their non-flammability and their ability to be used across a wide temperature spectrum, forever chemicals are added to certain types of cooling systems as refrigerants. Most two-phase cooling systems used in data centers contain PFAS, according to Data Center Frontier.

Steffen Lehmann, a professor of architecture and urbanism at the University of Nevada–Las Vegas, shares Rosas’s concern about the growing number of data centers and their potential impact on local water supplies.

“In areas like the outskirts of Las Vegas, where land is relatively affordable, several large data centers are planned or under construction,” Lehmann said.

“These facilities will require substantial energy and water for cooling. Conventional large-scale data centers can consume up to 5 million gallons of water per day, equivalent to the daily water use of a town with 20,000 to 50,000 residents.”
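As a rough check on that comparison, the arithmetic can be sketched in a few lines of Python. The per-capita figure below is an assumption for illustration (roughly 100 to 250 gallons of total municipal water use per person per day) and is not taken from Lehmann's remarks.

```python
# Back-of-the-envelope check of the town-size comparison above.
# Assumption (not from the article): total municipal water use of
# roughly 100-250 gallons per person per day.
DATA_CENTER_GALLONS_PER_DAY = 5_000_000  # "up to 5 million gallons of water per day"

for per_capita in (100, 250):  # gallons per person per day (assumed)
    residents = DATA_CENTER_GALLONS_PER_DAY / per_capita
    print(f"At {per_capita} gal/person/day, that equals a town of ~{residents:,.0f} residents")

# Prints towns of roughly 50,000 and 20,000 residents, matching the
# 20,000-to-50,000 range quoted above.
```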

Even relatively simple AI workloads require a lot of water for cooling. For example, AI platform ChatGPT consumes the equivalent of a 500 ml bottle of water for every 10 to 50 medium-length responses, according to joint research from the University of California–Riverside and the University of Texas at Arlington.

ChatGPT is visited approximately 5.24 billion times each month, according to May data from Semrush.
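Those two figures can be combined into a very rough monthly estimate. The sketch below assumes, purely for illustration, one medium-length response per visit and the midpoint of the 10-to-50-response range; neither assumption comes from the cited research.

```python
# Illustrative combination of the per-response and traffic figures above.
# Assumptions NOT taken from the cited research: one medium-length
# response per visit, and the midpoint of the "10 to 50 responses per
# 500 ml bottle" range.
MONTHLY_VISITS = 5.24e9       # Semrush, May data
RESPONSES_PER_BOTTLE = 30     # assumed midpoint of 10-50
LITERS_PER_BOTTLE = 0.5       # 500 ml

liters_per_month = MONTHLY_VISITS / RESPONSES_PER_BOTTLE * LITERS_PER_BOTTLE
print(f"Roughly {liters_per_month / 1e6:.0f} million liters per month")
# Roughly 87 million liters per month under these illustrative assumptions.
```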

Data complexes consume water directly in their cooling systems and indirectly through non-renewable electricity generation, which itself draws on water.

A 2021 report on data center water usage noted that potable water—rather than recycled or reclaimed water—accounted for as much as 57 percent of the water consumed for cooling in some instances.

It’s difficult to pinpoint exactly how many data centers are being built exclusively to support AI’s growing demands.

A 2024 McKinsey & Co. analysis predicted that demand for AI-capable data center capacity will increase at a rate of 33 percent per year until 2030.
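To put that rate in perspective, a sustained 33 percent annual increase compounds quickly. The sketch below assumes annual compounding from a 2024 baseline, which is an illustrative reading of the forecast rather than a detail stated in the analysis.

```python
# What a sustained 33 percent annual growth rate implies, assuming
# annual compounding from a 2024 baseline (an illustrative assumption,
# not a detail from the McKinsey analysis cited above).
GROWTH_RATE = 0.33

demand_index = 1.0  # index base-year demand to 1.0
for year in range(2024, 2031):
    print(f"{year}: {demand_index:.2f}x base-year demand")
    demand_index *= 1 + GROWTH_RATE
# By 2030 the index reaches roughly 5.5x its 2024 level.
```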

That projected growth is a key driver of concern for people such as Lehmann, who is watching data center expansion unfold in a highly water-stressed portion of southern Nevada. Despite being America's driest state, the Silver State is experiencing massive growth in data centers.

Toro’s research indicates there are 44 new facilities in Nevada that have been announced or are under construction.

"The rapid expansion of data centers has created a competitive tension between population growth, suburban development, and the construction of these energy- and water-intensive facilities," Lehmann said.

He believes that transparency from the developers building the facilities and the companies that own them is crucial to address mounting concerns over water usage.

Some tech insiders share his concerns.

“One issue people don’t always think about is the impact on local infrastructure. Pulling large volumes of water can lower groundwater levels, harm local wildlife, and even cause competition with agriculture,” Arnold Pinkhasov, a software engineer at tech accelerator OSLabs, told The Epoch Times.

Pinkhasov said increased water usage by data centers can also accelerate wear and tear on municipal water systems that weren't designed for such high-volume industrial usage.

“Another overlooked issue is thermal pollution, where water used for cooling is returned to the environment at a higher temperature, which can affect ecosystems in rivers and lakes,” he said.

Regulation

Many working on the front lines of data center expansion are trying to address the intersection of growth and water usage before it becomes critical.

“Local factors, including water availability, humidity, and climate, are key considerations in the cooling systems and strategies that data centers employ to maximize efficiencies and minimize their water footprint,” Jon Hukill, communications director at the Data Center Coalition, told The Epoch Times.

The coalition is the trade association for the data center sector.

Hukill said that overall, the industry is committed to responsible water use in a legislative environment where regulations vary widely from state to state.

In Virginia—home of the world's largest data center market—there is currently no statewide regulation on water usage at data centers. The state, which is home to more than 150 data centers in Northern Virginia alone, illustrates the tension between the need for regulation and the push for progress.

Localities are making their own decisions to determine what, if any, rules or inspections will apply to data center water usage.

Virginia House Bill 1601 would have mandated environmental impact assessments on surface and groundwater at proposed data center facilities, but the measure was vetoed by Virginia Gov. Glenn Youngkin in May.

Youngkin said the legislation would limit local discretion and create “unnecessary red tape” for new data centers.

“While well-intentioned, the legislation imposes a one-size-fits-all approach on communities that are best positioned to make their own decisions,” Youngkin said when he vetoed it.

Hukill said data centers are actively investing and deploying technologies to help reduce their water footprint. Some of those innovations include waterless cooling, closed-loop systems, and using recycled or reclaimed water.

“Many Data Center Coalition members are adopting water-positive commitments alongside these significant investments,” he said.

“In fact, 83 percent of data centers in Virginia use the same amount of water or less than the average larger office building.”

Preserving Resources

Amazon Web Services (AWS) stated that it’s “doubling down on preserving freshwater resources,” as it works to reduce data center water consumption.

AWS is the biggest name in cloud computing, with a vast global network of data centers. The retail and tech giant also has extensive facilities and support dedicated to AI and machine learning workloads. The company has committed to the goal of being “water positive” by 2030 in its data centers, meaning that it intends to return more water to communities than it uses in its direct operations.

“AWS is focusing on using more sustainable sources of water, such as water recycling or rainwater harvesting, wherever possible,” a spokesperson for AWS told The Epoch Times.

AWS has previously invested in water recycling infrastructure in the western United States, including California, and is expanding those water recycling efforts, according to the spokesperson.

“Moving forward, we’ll continue to pursue new opportunities … to enable recycled water use for our data centers where feasible,” the spokesperson said.

While the implementation of these methods depends on local infrastructure availability, regulations, and water quality standards, the AWS spokesperson said the cloud giant’s teams conduct thorough assessments at each location to determine sustainable water management strategies.

“These assessments consider, among other things, source water conditions, existing infrastructure capacity, and projected community needs,” the spokesperson said.

Other big tech players have announced efforts to reduce water consumption at their sprawling data center complexes. Google uses reclaimed or non-potable water at more than 25 percent of its data centers. Last August, Microsoft launched a new design that doesn’t require water for cooling AI-related workloads at its facilities.

When asked about progress toward the company’s goal of being water positive by 2030, the AWS spokesperson said the company’s data centers were 53 percent of the way there, up from 41 percent in 2023.

“In the Americas, our current fleet of data centers use no water for cooling for 90 percent of the year,” the spokesperson said. “In addition, our water replenishment projects across the U.S. and globally are expected to help replenish more than 9 billion liters of water each year to the environment and our communities once every project is completed.”

In June, AWS announced plans to expand water recycling efforts at data centers in more than 120 of its locations throughout the United States.

https://www.theepochtimes.com/article/is-there-enough-water-to-quench-the-thirst-of-ai-super-data-centers-5881066

