Why Most Data Centers Never Get Built

May 16, 2025



Have you ever wondered how the tech giants powering our digital lives plan for the future? Picture this: a utility company sifting through a towering stack of proposals for massive data centers, each promising to gobble up enough electricity to light a small city. The catch? Most of these projects will never see the light of day. It’s a phenomenon that’s throwing a wrench into the plans of utilities and grid operators across the U.S., and it’s got me thinking about how we balance ambition with reality in the race to fuel the AI boom.

The Phantom Data Center Problem

The U.S. electrical grid is drowning in a sea of data center proposals, but only a fraction will ever break ground. Industry analysts estimate that utilities field five to ten interconnection requests for every data center that actually gets built. These speculative bids, often dubbed phantom loads, clog the system and make it tough for utilities to predict future energy needs. It's like trying to plan a dinner party when half the RSVPs are from people who might not show up.

The flood of speculative requests is a real headache for utilities trying to keep the lights on.

– Energy industry expert
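To see why this matters for planning, here's a back-of-the-envelope sketch of how a utility might discount a queue of requests using that five-to-ten-times ratio. All figures are made up for illustration, not drawn from any real queue.

```python
# Back-of-the-envelope sketch: discount a queue of interconnection
# requests by a historical completion rate. All figures are illustrative.

queued_requests_mw = [500, 1200, 300, 800, 2000, 650]  # hypothetical queue

total_queued = sum(queued_requests_mw)  # 5450 MW of nameplate requests

# If only one in five to one in ten requests is ever built, the load
# actually hiding in the queue is a fraction of the nameplate total.
for completion_rate in (1 / 5, 1 / 10):
    expected_mw = total_queued * completion_rate
    print(f"Queue: {total_queued} MW -> expected at "
          f"{completion_rate:.0%} completion: {expected_mw:.0f} MW")
```

The gap between the nameplate total and the discounted figure is exactly the uncertainty utilities are wrestling with.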

Why does this happen? Data center developers, from tech titans to scrappy startups, often propose multiple projects to hedge their bets. They’re unsure which sites will secure permits, power, or community approval, so they cast a wide net. But this scattershot approach creates chaos for utilities, who must allocate precious resources to evaluate each request, even the ones destined to fizzle out.

Wildly Divergent Forecasts

Predicting how much power data centers will need in the coming years is like trying to guess the weather a decade from now. Some forecasts are downright apocalyptic. One study projected a jaw-dropping 347 gigawatts of AI-driven power consumption by 2030. Others, more grounded, peg the number closer to 100 gigawatts or even as low as 16.5 gigawatts under sustainable scenarios. That’s a massive gap, and it leaves utilities scratching their heads.

These discrepancies aren’t just academic. They shape how utilities plan for new power plants, transmission lines, and renewable energy projects. Overestimate, and you risk building costly infrastructure that sits idle. Underestimate, and you could face blackouts when demand spikes. It’s a high-stakes guessing game, and phantom loads make it even harder to get right.

Why Developers Overpromise

Data center developers aren’t just throwing darts blindfolded. There’s a method to their madness. Secrecy is a big driver—many hide their plans behind shell companies and nondisclosure agreements to avoid tipping off competitors or sparking local opposition. I get it; nobody wants a NIMBY uprising derailing their multi-billion-dollar project. But this cloak-and-dagger approach means utilities often don’t know which proposals are serious until it’s too late.

Then there’s the issue of queue gaming. Developers flood interconnection queues with multiple proposals, knowing they’ll only build a few. It’s a numbers game: the more spots you grab, the better your chances of securing a prime location. One energy consultant put it bluntly:

When queue positions are cheap, developers will buy them like candy.

– Energy policy analyst

This strategy works for developers but leaves utilities in the lurch, burning through time and money to process requests that go nowhere.

The Utility Dilemma

Utilities aren’t just passive victims in this mess. They’re fighting back with creative solutions, but it’s an uphill battle. Some are rolling out standardized interconnection processes to streamline evaluations. Others are hitting developers with bigger upfront financial commitments—think hefty deposits to prove they’re serious. In a few cases, utilities are even turning to state lawmakers for backup, asking for rules to curb speculative requests.

Take Virginia, the data center capital of the U.S. Three major utilities there recently proposed new rate classes for large loads like data centers. These rules would require developers to pay for a chunk of their contracted demand—up to 80% in some cases—whether they use it or not. It’s a bold move to protect existing customers from footing the bill for infrastructure that might never be needed.

| Utility   | Proposed Rule              | Impact                     |
|-----------|----------------------------|----------------------------|
| Utility A | 60% minimum demand payment | Reduces speculative bids   |
| Utility B | 80% minimum demand payment | Protects ratepayers        |
| Utility C | Collateral for upgrades    | Ensures project viability  |
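To make the billing math concrete, here's a minimal sketch of how a minimum demand payment might work, using the 80% floor mentioned above. The demand rate and contract figures are hypothetical, not taken from any actual tariff.

```python
# Hypothetical sketch of a minimum-demand ("take-or-pay") charge.
# The 80% floor mirrors the Virginia proposals discussed above; the
# demand rate and contract sizes are made up for illustration.

def monthly_demand_charge(contracted_mw: float,
                          actual_peak_mw: float,
                          rate_per_mw: float = 10_000.0,
                          minimum_fraction: float = 0.80) -> float:
    """Bill the greater of actual peak demand or the contracted floor."""
    billable_mw = max(actual_peak_mw, contracted_mw * minimum_fraction)
    return billable_mw * rate_per_mw

# A developer contracts 300 MW but only ever draws 90 MW:
print(monthly_demand_charge(300, 90))   # pays for 240 MW, not 90
# A fully utilized site pays for what it actually uses:
print(monthly_demand_charge(300, 290))  # pays for 290 MW
```

The design logic is simple: if walking away from contracted capacity costs real money, speculative requests get a lot less attractive.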

These measures are a start, but they’re not foolproof. Developers are crafty, and some are already finding workarounds, like building their own behind-the-meter power generation to bypass grid delays.

The Rise of Off-Grid Power

Here’s where things get really interesting. As grid interconnection queues grow longer, some data center operators are saying, “Forget it, we’ll make our own power.” Think gas turbines humming away behind the scenes, feeding server racks without ever touching the public grid. It’s a trend that’s picking up steam, and it’s got big implications for the future of energy planning.

For example, one major AI hub reportedly runs on 35 gas turbines, completely off-grid. Another company is eyeing a 1-gigawatt gas-fired power park to serve data centers and other industrial clients. These projects sidestep utility bottlenecks but raise new questions about environmental impact and long-term grid stability. Are we just trading one problem for another?

How Utilities Are Adapting

Despite the chaos, utilities aren’t sitting idle. Many are getting smarter about how they handle data center requests. Some use derating techniques, assuming proposed projects will use less power than claimed or ramp up slowly over time. Others lean on benchmarks like land purchases or permitting progress to gauge a project’s likelihood of moving forward.
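Here's a rough sketch of what milestone-based derating could look like in practice. The stages and derate factors are illustrative assumptions, not any utility's actual methodology.

```python
# Illustrative sketch of derating a queue based on project milestones.
# Stages and factors are assumptions chosen for the example.

DERATE_FACTORS = {
    "request_only": 0.10,    # interconnection request, nothing else
    "land_purchased": 0.40,  # site control suggests real intent
    "permits_filed": 0.70,   # permitting progress is a strong signal
    "construction": 0.95,    # shovels in the ground
}

projects = [
    {"name": "Site A", "mw": 400, "stage": "request_only"},
    {"name": "Site B", "mw": 250, "stage": "land_purchased"},
    {"name": "Site C", "mw": 600, "stage": "permits_filed"},
]

nameplate = sum(p["mw"] for p in projects)
derated = sum(p["mw"] * DERATE_FACTORS[p["stage"]] for p in projects)
print(f"Nameplate queue: {nameplate} MW, derated forecast: {derated:.0f} MW")
```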

One utility planner I came across shared a practical approach:

We bill developers for the staff time spent on their requests. It’s only fair that our members aren’t stuck with the tab.

– Utility resource planner

This kind of cost-sharing is gaining traction, along with service agreements that lock in minimum load commitments. It’s a way to ensure data centers pay their fair share for the grid upgrades they trigger.

A Call for Standardization

Some experts argue the real fix lies in standardizing the interconnection process across regions. Imagine a world where utilities share anonymized data on queued projects, developers face clear commercial readiness tests, and nonviable proposals get booted faster. It’s a pipe dream for now, but it could cut down on phantom loads and speed up the queue.

I’m all for this idea, but I wonder if it’s enough to tame the wild west of data center development. Developers are savvy, and they’ll always find ways to game the system. Maybe the answer lies in a reverse auction model, where utilities prioritize projects that need the least support to connect. It’s a radical idea, but it could shake things up.
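As a thought experiment, here's a toy sketch of that reverse-auction idea: projects are ranked by the grid support they need per megawatt, and the cheapest-to-connect clear the queue first. All numbers are hypothetical.

```python
# Toy sketch of a reverse auction for interconnection: projects bid the
# grid upgrades they need, and the queue clears cheapest-first until
# available headroom runs out. All figures are hypothetical.

bids = [
    {"name": "Project X", "mw": 200, "upgrade_cost_musd": 15},
    {"name": "Project Y", "mw": 500, "upgrade_cost_musd": 120},
    {"name": "Project Z", "mw": 150, "upgrade_cost_musd": 4},
]

headroom_mw = 400  # hypothetical spare grid capacity

# Prioritize by upgrade cost per MW: least support needed connects first.
for bid in sorted(bids, key=lambda b: b["upgrade_cost_musd"] / b["mw"]):
    if bid["mw"] <= headroom_mw:
        headroom_mw -= bid["mw"]
        print(f"Accept {bid['name']} ({bid['mw']} MW)")
    else:
        print(f"Defer {bid['name']} (needs {bid['mw']} MW, "
              f"only {headroom_mw} MW left)")
```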

What’s at Stake

The stakes here are huge. Misjudge data center growth, and you could end up with an overbuilt grid that jacks up rates for everyone. Or worse, underestimate demand and watch the lights flicker when AI workloads surge. Utilities are walking a tightrope, and phantom loads are making it wobble.

But there’s a silver lining. The pressure from data centers is pushing utilities to innovate. They’re investing in renewable energy, streamlining processes, and rethinking how they serve big customers. One cooperative, for instance, snagged an $812 million federal grant to bring nearly 1.3 gigawatts of clean power online. That’s the kind of forward-thinking we need.

Looking Ahead

So, where do we go from here? The data center boom isn’t slowing down, and neither is the AI revolution driving it. Utilities will need to keep adapting, balancing the needs of tech giants with those of everyday ratepayers. It’s a messy, complicated challenge, but it’s also an opportunity to build a smarter, more resilient grid.

In my view, the key is transparency. If developers and utilities can find a way to share more data without compromising competitive edges, we might see fewer phantom loads gumming up the works. Until then, it’s a bit like navigating a foggy road—you’ve got to proceed with caution, but you can’t stop moving forward.


What do you think? Are utilities up to the task, or are we headed for a gridlock that could stall the tech revolution? One thing’s for sure: the future of our digital world depends on getting this right.
