Imagine this: you flip on Netflix, ask your phone a question, or fire up the latest AI app, and somewhere in a field in Virginia or Texas, thousands of servers instantly wake up and start guzzling electricity like there’s no tomorrow.
That “somewhere” is about to get a lot bigger. A lot bigger.
Recent forecasts predict that by 2035 the United States could need an astonishing 106 gigawatts (GW) of power capacity just for data centers. To put that in perspective, the entire country had only around 25 GW of operating data center capacity last year. We’re talking about more than quadrupling demand in little more than a decade.
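A quick back-of-envelope check of what those two numbers imply (the 11-year window is an assumption based on "last year" through 2035; the growth rate is derived, not from the forecast itself):

```python
# Sanity check on the 25 GW -> 106 GW forecast above.
# Assumption: roughly an 11-year window (last year through 2035).
base_gw = 25.0
forecast_gw = 106.0
years = 11

multiple = forecast_gw / base_gw                    # total growth multiple
cagr = (forecast_gw / base_gw) ** (1 / years) - 1   # implied compound annual growth

print(f"Growth multiple: {multiple:.1f}x")           # ~4.2x
print(f"Implied annual growth: {cagr:.1%}")          # ~14% per year, sustained
```

Sustaining roughly 14% compound growth for over a decade is the kind of curve utilities almost never plan for, which is why these forecasts are causing so much alarm.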
The AI Boom Is Eating Electricity for Breakfast
Everyone saw the AI hype coming, but almost nobody saw the power bill coming. Training and running large language models isn’t like running a few laptops. A single modern AI training cluster can pull hundreds of megawatts – enough to power a medium-sized city – and once the model is live, the inference queries keep the meters spinning 24/7.
Add cloud computing growth, crypto mining (yes, it’s still around), streaming, and good old-fashioned enterprise IT, and you start to understand why the new 106 GW forecast is actually 36% higher than the same analysts predicted only eight months ago. The projects keep getting announced bigger and faster than anyone expected.
Where Is All This New Capacity Going?
For years the data center world revolved around a handful of hubs: Northern Virginia (the king), Dallas, Silicon Valley, Atlanta, Chicago. Nice big fiber pipes, cheap land nearby, relatively cooperative regulators.
That map is exploding outward. Developers are now chasing power wherever they can find it. Rural counties in South Carolina, exurban Pennsylvania, the Texas Hill Country, even the Upper Peninsula of Michigan are suddenly on the shortlist because they sit near existing transmission lines and, crucially, near power plants that can actually deliver hundreds of megawatts without collapsing the local grid. The build-out is radiating along a few corridors:
- South through Virginia into the Carolinas
- North and west from Chicago along Lake Michigan
- Deep into central Ohio and Indiana
- Central and West Texas (ERCOT territory)
- Gulf Coast from Mississippi to Louisiana
The three grid operators likely to feel the most pain in the next five years are PJM (Mid-Atlantic), MISO (Midwest), and ERCOT (Texas). One forecast suggests PJM alone could see 31 GW of new data center load by 2030 – roughly the same amount of new generation the region expects to bring online in the same period. You don’t need a PhD in electrical engineering to see the potential problem there.
Are These Forecasts Overblown?
Here’s where things get interesting. Not everyone buys the 106 GW story.
Some analysts warn that speculative projects and duplicate permit filings are inflating the numbers. Developers routinely file for the same campus in three neighboring counties hoping one says yes. Chip shortages, cooling equipment delays, and financing hurdles could kill dozens of announced projects before they ever pour concrete.
I’ve watched this movie before. Remember 1999 when every telecom carrier was burying fiber like mad and analysts predicted endless bandwidth demand? A lot of that capacity still sits dark today. Could the same thing happen with hyperscale data centers?
Maybe. But there are two big differences this time:
- Actual customer contracts are already signed for a huge chunk of the announced capacity (think Microsoft, Google, Amazon, Meta).
- AI model size and usage keep growing faster than almost anyone predicted even 18 months ago.
In other words, the demand isn’t theoretical anymore.
The Nuclear Comeback Nobody Saw Coming
Here’s perhaps the most fascinating subplot: tech giants are starting to act like utilities.
Microsoft has signed a deal to restart a reactor at the Three Mile Island plant (yes, really). Amazon is buying an entire data center campus next door to the Susquehanna nuclear station in Pennsylvania. Google is exploring small modular reactors. When the biggest companies in the world start bringing mothballed nuclear units back online, you know the power situation is serious.
Renewables will play a role – lots of solar farms are being paired with battery storage next to new data centers – but the math is brutal. Solar produces only when the sun shines and wind only when it blows, yet AI queries keep arriving at 3 a.m. Wind and solar simply can’t match that always-on profile without heroic amounts of storage we don’t yet have.
What Happens If the Grid Can’t Keep Up?
Reliability councils have already elevated risk warnings for summer shortfalls in PJM, MISO, and ERCOT. Translation: on the hottest days when everyone cranks the AC, there might not be enough juice left for both your house and the data center next door.
Possible outcomes nobody wants to talk about yet:
- Rolling brownouts or blackouts in affected regions
- Sky-high wholesale power prices (we’ve already seen $9,000/MWh spikes in Texas)
- States competing with tax breaks and relaxed environmental rules to lure data centers – and their jobs
- Regulatory backlash that slows the whole AI build-out
Or, best case, the industry innovates faster than expected: efficiency gains in chips, liquid cooling, maybe even off-peak training schedules that shift load like electric vehicles do today.
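For a sense of what those wholesale price spikes mean in dollars, here is a rough sketch. The 300 MW campus size and the $40/MWh "normal" price are illustrative assumptions, not figures from the article; the spike price is the one cited above:

```python
# What a $9,000/MWh spike costs a large data center buying at wholesale.
campus_mw = 300        # hypothetical large AI campus (assumption)
normal_price = 40      # $/MWh, an illustrative everyday wholesale price (assumption)
spike_price = 9_000    # $/MWh, the Texas spike price cited above

normal_cost_per_hour = campus_mw * normal_price
spike_cost_per_hour = campus_mw * spike_price

print(f"Normal hour: ${normal_cost_per_hour:,.0f}")  # $12,000
print(f"Spike hour:  ${spike_cost_per_hour:,.0f}")   # $2,700,000
```

Under these assumptions, a single spike hour costs more than two hundred normal hours – one reason operators are suddenly very interested in on-site generation and flexible scheduling.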
The Bottom Line (For Now)
We’re standing at an inflection point. For decades, the United States added roughly 6-8 GW of new electricity demand per year. Now, data centers alone could add that much every single year.
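The comparison above can be checked directly from the earlier numbers (again assuming an 11-year window from last year's 25 GW to 106 GW in 2035):

```python
# Average annual data center additions implied by the forecast,
# versus the 6-8 GW/year the whole grid historically added.
new_capacity_gw = 106 - 25   # 81 GW of new data center load
years = 11                   # assumed window, last year through 2035
avg_annual_gw = new_capacity_gw / years

print(f"Average data center additions: {avg_annual_gw:.1f} GW/year")
# ~7.4 GW/year from data centers alone -- squarely inside the 6-8 GW
# range that historically covered every sector combined.
```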
Whether the final number lands at 106 GW, 80 GW, or somewhere in between, one thing feels certain: the era of taking cheap, abundant American electricity for granted is ending. The servers that power our digital lives are about to become some of the biggest customers on the grid.
And if we get this wrong, the lights might not stay on for any of us.