At dusk, the buildings of Loudoun County, Virginia, reveal themselves as you drive by: enormous, windowless, lined with chain-link fencing and substations that look almost industrial-gothic in the orange light, hum-quiet from the road but loud up close.
Locals call this area “Data Center Alley.” A few years ago it seemed like a curiosity. Now it looks like a frontier town that grew too fast for its own water table.
| Category | Detail |
|---|---|
| Primary Issue | Energy and water consumption by hyperscale AI data centers |
| Geographic Hotspots | Northern Virginia, Phoenix, Dallas, central Washington, rural Iowa |
| U.S. Data Center Share of Electricity (Projected, 2026) | Roughly 6% of national consumption |
| Major Operators | Amazon Web Services, Microsoft, Google, Meta, Oracle |
| Average Water Use per Large Facility | Hundreds of millions of gallons annually for cooling |
| Single LLM Training Footprint | Thousands of MWh, hundreds of tons of CO₂ |
| Most Strained Resource Locally | Freshwater in arid states like Arizona and Nevada |
| Common Power Source | Mix of natural gas, coal (in some regions), and renewables |
| Regulatory Status | Largely fragmented; few federal disclosure rules |
| Emerging Solutions | Liquid cooling, off-peak training, on-site solar, small modular reactors |
There is a body to the American AI boom. It doesn’t actually reside in the cloud. It resides in places like this, and in the suburbs of Phoenix and the renovated warehouses outside Dallas, where the air conditioning demands more than the servers do. Investors appear to believe the buildout is still in its early stages; judging by their grid forecasts, the utilities agree. What comes up remarkably rarely is what all of this actually costs: not in dollars, but in kilowatt-hours, gallons, and the air quality of communities that never asked to host the future.
Once you sit with the numbers, they become unsettling. Training a single large language model can consume thousands of megawatt-hours and release hundreds of tons of carbon dioxide, roughly the yearly emissions of several hundred American homes combined.
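That equivalence can be sanity-checked with a back-of-envelope calculation. Every constant below is an illustrative assumption (a plausible training run, a rough U.S. grid emission factor, a rough per-home annual figure), not a measured value:

```python
# Back-of-envelope check on the training-run figures above.
# All constants are illustrative assumptions, not measured values.

TRAINING_MWH = 2_000          # assumed "thousands of MWh" for one large run
GRID_T_CO2_PER_MWH = 0.4      # rough U.S. average grid emission factor
HOME_T_CO2_PER_YEAR = 4.0     # rough annual CO2 from one U.S. home's electricity use

training_tons = TRAINING_MWH * GRID_T_CO2_PER_MWH
homes_equivalent = training_tons / HOME_T_CO2_PER_YEAR

print(f"~{training_tons:,.0f} t CO2, roughly {homes_equivalent:,.0f} homes for a year")
# prints "~800 t CO2, roughly 200 homes for a year"
```

With these assumptions the run lands in the hundreds of tons and a few hundred home-years, consistent with the scale the article describes; swapping in a dirtier grid or a larger run moves both numbers up proportionally.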

Inference, the daily work of answering a billion prompts, has quietly become the larger expense. Data centers in the United States are projected to use nearly 6% of the country’s electricity by 2026. That is not a rounding error. That is a small European country plugged into the wall.
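The country-sized comparison can be made concrete with the same kind of rough arithmetic; the total U.S. consumption figure below is an assumption on the order of recent annual totals, not an official statistic:

```python
# Rough scale check for the 6% figure above; both constants are assumptions.
US_ANNUAL_TWH = 4_000          # assumed total U.S. electricity consumption per year
DATA_CENTER_SHARE = 0.06       # projected data-center share by 2026

data_center_twh = US_ANNUAL_TWH * DATA_CENTER_SHARE
print(f"~{data_center_twh:.0f} TWh/year")
# prints "~240 TWh/year" -- broadly comparable to the annual
# electricity consumption of a whole European country
```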
The part that doesn’t make news, but probably ought to, is the water. To keep the chips from melting, hyperscale facilities use evaporative cooling, in which freshwater literally rises into the atmosphere. In Arizona, where decades of drought have already pushed cities into awkward rationing conversations, communities watch new server farms negotiate water deals while residents are asked to let their lawns die. Talk to people who live near these sites and you get the sense that something in the social compact has broken. The benefits flow somewhere else. The thirst stays local.
The industry may solve this on its own schedule. Google and Microsoft both publish sustainability reports. Some operators have aggressively adopted liquid cooling, which uses significantly less water. A few are signing contracts for nuclear power, including small modular reactors that could change the calculus entirely if they come to pass. Amazon has bought solar at unprecedented scale. All of this is real. None of it, on its current course, is moving fast enough.
What might actually solve this is less glamorous than a new reactor. It starts with disclosure: real, audited, facility-level reporting on power and water consumption, the kind of environmental data the EPA already mandates for other heavy industries. It continues with smarter siting rules that stop concentrating loads in watersheds already under stress. It includes time-of-day training, in which models are trained when the grid is cleanest. And the federal government will likely need to treat AI infrastructure as a public-interest project with public-interest safeguards, as it did with the interstate highway system.
Watching this play out feels like watching a long-standing American pattern: build first, measure later, apologize eventually. The semiconductor industry did something similar in the 1980s. So did fracking. The AI boom is bigger and faster, and it is arriving on a grid that was never built for it. The technology may well be worth the cost. But the cost should at least be visible, and it should not fall hardest on the places that benefit least.
