When we talk about the environmental cost of digital technology, carbon emissions get most of the attention. But there is another resource being consumed at an alarming scale, one that is far harder to replace: fresh water.
Data centres are thirsty
Data centres generate enormous amounts of heat from thousands of servers running continuously. The most common way to cool them is evaporative cooling, which absorbs heat by evaporating water. It is effective, but it consumes vast quantities of fresh water.
US data centres collectively consumed 449 million gallons of freshwater per day as of 2021, and that figure has grown since. Roughly 80% of that water evaporates and is permanently lost. It is not discharged or recycled. It is gone. A single large data centre can consume 3-5 million gallons per day, equivalent to the daily water usage of a small city.
It is drinking water
Data centres primarily draw from freshwater sources: surface water, groundwater, and piped municipal water. Only about 3% of Earth's water is freshwater, and just 0.5% of the total is accessible and safe for human consumption. Salt water corrodes cooling equipment, so it is almost never used. Some operators are experimenting with recycled water, but most still rely on potable municipal supplies.
The competition with human needs is already tangible. In some regions, data centre water consumption has strained local supplies, drawing criticism from conservation groups who argue it prioritises tech infrastructure over community needs.
The location problem
Many data centre clusters are being built in water-scarce regions. Arizona, Nevada, and parts of Texas offer cheap land and low energy costs, but they are also among the most water-stressed areas in the US. Data centres in Texas alone are projected to use 49 billion gallons of water in 2025, potentially rising to 399 billion gallons by 2030, equivalent to drawing down Lake Mead by more than 16 feet in a single year.
The hidden water usage of your electricity
Direct cooling is only part of the story. The largest water cost associated with data centres is actually electricity generation itself.
Fossil fuel and nuclear power plants also rely on freshwater: they boil it into the steam that drives their turbines and, more significantly, evaporate large volumes of cooling water to condense that steam back again. Coal plants require approximately 19,185 gallons of freshwater per MWh. Natural gas plants need about 2,800 gallons per MWh. Even hydroelectric power involves freshwater loss from reservoir evaporation.
A data centre powered by fossil fuels has a massive hidden water footprint upstream at the power plant, on top of its own cooling needs. Renewables like solar and wind use essentially zero water in generation, which is another reason the transition to renewables matters far beyond carbon.
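To get a feel for the scale, here is a rough back-of-envelope calculation using the per-MWh figures above. The 30 MW average load is a hypothetical assumption for illustration, not a figure from any real facility.

```python
# Back-of-envelope: upstream water footprint of a data centre's electricity.
# Per-MWh water figures are from the text; the 30 MW average load is a
# hypothetical assumption for illustration.

WATER_GAL_PER_MWH = {
    "coal": 19_185,
    "natural_gas": 2_800,
    "wind_or_solar": 0,  # essentially zero water in generation
}

def upstream_water_gal_per_day(avg_load_mw: float, source: str) -> float:
    """Daily upstream freshwater use at the power plant, in gallons."""
    mwh_per_day = avg_load_mw * 24
    return mwh_per_day * WATER_GAL_PER_MWH[source]

load_mw = 30  # hypothetical mid-size facility
for source in WATER_GAL_PER_MWH:
    gal = upstream_water_gal_per_day(load_mw, source)
    print(f"{source}: {gal:,.0f} gallons/day")
```

On these assumptions, a coal-powered facility pulls nearly 14 million gallons a day upstream before a drop of cooling water is counted, while the same load on wind or solar pulls essentially none.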
The hidden water usage of your chips
Every chip inside a server, laptop, or phone consumed thousands of gallons of water during fabrication. A typical semiconductor plant uses about 10 million gallons of ultrapure water per day, comparable to 33,000 US households. And producing one gallon of ultrapure water requires about 1.5 gallons of regular tap water, so the conversion itself is wasteful.
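The conversion overhead compounds the fab's headline number. A quick sketch using only the figures above:

```python
# Tap water behind a fab's ultrapure water (UPW) demand, using the
# figures in the text: ~10 million gallons of UPW per day, and ~1.5
# gallons of tap water per gallon of UPW produced.

UPW_GAL_PER_DAY = 10_000_000
TAP_PER_UPW = 1.5

tap_gal_per_day = UPW_GAL_PER_DAY * TAP_PER_UPW
print(f"Tap water input: {tap_gal_per_day:,.0f} gallons/day")  # 15,000,000
```

In other words, a plant quoted at 10 million gallons of ultrapure water per day is actually drawing roughly 15 million gallons from the municipal supply.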
Water and carbon: the trade-off
Water usage has its own carbon footprint. Municipal systems require energy to extract, treat, pump, and process water, emitting roughly 0.3-0.5 kg of CO2 per cubic metre supplied.
But the bigger story is the trade-off. Water-cooled data centres use about 10% less energy and emit roughly 10% less carbon than air-cooled alternatives. More water means less electricity. The optimal choice depends on the local grid's carbon intensity and the local water stress. There is no universal right answer.
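The trade-off can be sketched numerically. The 10% energy saving and the 0.3-0.5 kg CO2 per cubic metre of water are from the text; the facility's annual energy use, its water draw, and the grid's carbon intensity are all hypothetical assumptions chosen for illustration.

```python
# Sketch of the water-versus-carbon trade-off described above.
# The 10% energy saving and ~0.4 kg CO2 per m3 of water are from the
# text; energy use, water use, and grid intensity are hypothetical.

def annual_co2_tonnes(energy_mwh, grid_kg_per_mwh, water_m3, water_kg_per_m3=0.4):
    """Total annual CO2 (tonnes) from electricity plus municipal water supply."""
    return (energy_mwh * grid_kg_per_mwh + water_m3 * water_kg_per_m3) / 1000

# Hypothetical air-cooled baseline: 100,000 MWh/year, no cooling water.
air_cooled = annual_co2_tonnes(100_000, grid_kg_per_mwh=400, water_m3=0)

# Water cooling: ~10% less energy, but a large (hypothetical) water draw.
water_cooled = annual_co2_tonnes(90_000, grid_kg_per_mwh=400, water_m3=500_000)

print(f"Air-cooled:   {air_cooled:,.0f} t CO2/year")
print(f"Water-cooled: {water_cooled:,.0f} t CO2/year")
```

Under these assumptions, water cooling wins on carbon because the avoided electricity far outweighs the water supply's emissions. On a low-carbon grid the electricity saving shrinks while the water cost stays, which is exactly why the right answer depends on local conditions.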
Water-positive pledges
Microsoft, Google, and Amazon have all pledged to become water positive by 2030, meaning they aim to replenish more water than they consume. However, replenishment projects are often located far from where water is actually extracted. The local aquifer being drained does not benefit from a restoration project in another region.
What you can do
The water footprint of digital technology is harder to influence individually than the carbon footprint. But the principles are the same. Reducing digital waste means fewer servers running, less cooling, and less water consumed. Supporting renewable energy reduces the upstream water footprint of electricity generation. And where your cloud data is stored matters: a data centre in a water-rich region with clean energy has a fundamentally different footprint than one in an arid region running on fossil fuels.
Every file you delete, every duplicate you remove, every unnecessary process you shut down takes pressure off the systems that consume these resources. It is not just about carbon. It is about water, minerals, land, and the full environmental cost of keeping the digital world running.