A recent Wall Street Journal article entitled “Data Centers That Don’t Exist Yet Are Already Haunting the Grid” perfectly captured a topic I’ve been thinking a lot about lately: how will the world generate enough electricity to feed the ever-growing hunger of artificial intelligence (AI)? It appears to me that conventional energy policies designed to address climate change and stimulate innovation have inadvertently created energy constraints that now hold back AI’s growth.
AI – especially generative AI, large language models (LLMs), and massive data centers – is no longer a fringe concern. It’s rapidly becoming core infrastructure for many sectors. But with that comes a steep rise in demand for electricity, raw materials, transmission, and generation capacity. Without major investment, many regions – including the U.S. – risk hitting serious bottlenecks.
Below, I’ll lay out what’s happening, where the pressure points are, and what short-term solutions exist to bridge the widening gap between new demand and the current supply of power. I focus on the U.S., but the rest of the world will face the same pressures as AI data centers continue to scale globally.
First, some data on the scale of AI’s energy demand:
- According to the International Energy Agency (IEA), global electricity demand from data centers is projected to more than double by 2030, to around 945 terawatt-hours (TWh). AI-optimized data centers (i.e., those built or adapted specifically for AI workloads) are expected to more than quadruple their electricity consumption in that timeframe.
- For the U.S., data centers already consumed ~4.4% of total U.S. electricity use in 2023. That share is expected to rise to somewhere between 6.7% and 12% by 2028, depending on how fast AI and data center builds grow.
- BloombergNEF (BNEF) projects that U.S. data‐center power demand will more than double between 2024 and 2035, rising from ~35 gigawatts (GW) of power demand in 2024 to ~78 GW in 2035.
- Another estimate (from Columbia University’s work on LLMs in the U.S.) suggests AI data centers could require ~14 GW of additional capacity by 2030.
- Globally, a RAND study projects that AI data centers could need 68 GW by 2027 under certain compute-growth scenarios, and much more by 2030 if exponential trends continue. (A quick unit conversion below puts these figures side by side.)
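These projections mix units: the IEA speaks in terawatt-hours of annual energy, while BNEF and RAND speak in gigawatts of continuous demand. Here is a minimal Python sketch that converts between the two, reusing the figures above with an assumed ~85% load factor (my assumption, not from the sources):

```python
# Back-of-envelope check: converting continuous power demand (GW)
# into annual energy use (TWh). The GW figures are the projections
# cited above; the load factor is an illustrative assumption.

HOURS_PER_YEAR = 8_760

def gw_to_twh_per_year(gw: float, load_factor: float = 1.0) -> float:
    """Annual energy (TWh) for a load of `gw` gigawatts at `load_factor`."""
    return gw * HOURS_PER_YEAR * load_factor / 1_000

# BNEF's U.S. projections, assuming data centers run near-continuously
# (~0.85 load factor -- an assumption, not a BNEF figure)
for year, gw in [(2024, 35), (2035, 78)]:
    print(f"{year}: ~{gw} GW ≈ {gw_to_twh_per_year(gw, 0.85):.0f} TWh/yr")

# Prints roughly: 2024: ~35 GW ≈ 261 TWh/yr, 2035: ~78 GW ≈ 581 TWh/yr
```

On these assumptions, U.S. data centers alone would consume hundreds of TWh per year by 2035, a meaningful slice of the IEA’s ~945 TWh global 2030 projection.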
Given the above, it’s clear that the AI revolution is accelerating fast and that meeting its energy demands is a foundational requirement. So, simple to fix, right?
Not so fast.
Why Existing & Planned Power Generation May Not Be Enough…
There are several reasons why new electricity generation, transmission, and the regulatory structures that govern them may struggle to keep pace:
Generation capacity constraints
Building new power plants takes time: planning, permitting, financing, and connecting to the grid. There are limits to how fast new capacity can come online. If AI data center demand surges faster than new capacity can be built, shortages and higher costs will follow.
Grid infrastructure & transmission
Even if you have generation, getting that electricity reliably to data centers (which are often very large loads, concentrated in certain geographies) requires transmission lines, substations, and grid upgrades. Local grids are already constrained, leading to reliability risks.
Regulatory and permitting bottlenecks
In many parts of the world, including the U.S., getting permits for power plants, transmission lines, and large data centers is slow and fraught with legal and environmental hurdles. These can delay infrastructure buildout by years.
Energy mix and environmental impact
Much of U.S. electricity still comes from fossil fuels. As data centers grow, so do their CO₂ emissions unless the marginal generation is clean. There are also water-usage and land-use impacts. The U.S. is seeing that environmental burden already: by some recent assessments, more than half of U.S. data center electricity comes from fossil sources.
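To put rough numbers on that burden, here is a back-of-envelope sketch using the ~4.4% share cited earlier, plus assumed values for total U.S. generation and average grid carbon intensity (both assumptions, not sourced figures):

```python
# Illustrative CO2 estimate for U.S. data-center load. All inputs
# except the 2023 usage share are assumptions for this sketch.

US_TOTAL_TWH = 4_000                # rough U.S. annual generation (assumption)
DC_SHARE = 0.044                    # ~4.4% data-center share cited above
GRID_INTENSITY_KG_PER_KWH = 0.37    # assumed average U.S. grid intensity

dc_twh = US_TOTAL_TWH * DC_SHARE
# kWh * kg/kWh -> kg, then kg -> megatonnes
co2_megatonnes = dc_twh * 1e9 * GRID_INTENSITY_KG_PER_KWH / 1e9

print(f"Data-center load: ~{dc_twh:.0f} TWh/yr")
print(f"Implied emissions: ~{co2_megatonnes:.0f} Mt CO2/yr at the assumed intensity")
# Prints roughly: ~176 TWh/yr and ~65 Mt CO2/yr
```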
Efficiency and waste
Some studies estimate that up to ~30% of the power used in AI training and inference is “wasted” on inefficiencies: grid constraints, suboptimal hardware utilization, cooling overhead, and the like. Reducing that waste is by far the highest-return first step.
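A quick illustration of why efficiency is the highest-return lever, applying the ~30% upper-bound waste estimate to BNEF’s 2024 U.S. demand figure (treating the waste fraction as an assumption for this sketch):

```python
# If up to ~30% of AI data-center power is lost to grid constraints,
# suboptimal hardware use, and cooling, recovering it is equivalent to
# building new plants. Reuses the BNEF 2024 estimate cited above; the
# waste fraction is the upper bound, applied here as an assumption.

HOURS_PER_YEAR = 8_760

dc_demand_gw = 35       # ~U.S. data-center demand, 2024 (BNEF)
waste_fraction = 0.30   # upper-bound waste estimate cited above

recoverable_gw = dc_demand_gw * waste_fraction
recoverable_twh = recoverable_gw * HOURS_PER_YEAR / 1_000

print(f"Recoverable demand: ~{recoverable_gw:.1f} GW "
      f"(~{recoverable_twh:.0f} TWh/yr if recovered around the clock)")
# ~10.5 GW -- on the order of ten large power plants that need not be built.
```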
Coupling the new AI demand for electricity with the challenges of adding new generation, the most logical near-term approach is to put today’s wasted energy to work, for such “wasted electrons” are abundant.
The Most Cost-Effective & Immediate Solution
Enter Soluna (Nasdaq: SLNH).
At Soluna, we focus on powering clean data centers with wasted electricity – derived almost entirely from intermittent renewable energy sources – thereby both adding a “new source” of electricity for AI demand and increasing the efficiency of the overall power network. We recently surpassed 1 GW of renewable computing projects with a groundbreaking at Project Kati in Texas. Now, we’re scaling even further with a $100M credit facility from Generate Capital to finance our pipeline of clean data centers.
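For readers who want the intuition behind curtailment capture, here is a minimal, self-contained sketch: a flexible compute load co-located with a wind farm absorbs generation the grid cannot take. Every number (farm size, grid limit, data center load, the toy wind profile) is hypothetical, for illustration only; this is not a model of any Soluna project.

```python
# Toy model of curtailment capture: when a wind farm produces more than
# the grid can accept, a co-located flexible data center absorbs the
# excess instead of letting it be curtailed. All numbers hypothetical.

import random

random.seed(42)

FARM_CAPACITY_MW = 150   # hypothetical wind farm size
GRID_LIMIT_MW = 100      # hypothetical transmission/offtake limit
DATACENTER_MW = 30       # hypothetical flexible compute load

curtailed_mwh = 0.0
captured_mwh = 0.0

for hour in range(8_760):                                   # one simulated year
    wind_mw = FARM_CAPACITY_MW * random.betavariate(2, 3)   # toy wind profile
    excess = max(0.0, wind_mw - GRID_LIMIT_MW)              # power the grid can't take
    captured = min(excess, DATACENTER_MW)                   # what the data center absorbs
    captured_mwh += captured
    curtailed_mwh += excess - captured                      # what is still wasted

print(f"Captured by flexible load: ~{captured_mwh:,.0f} MWh/yr")
print(f"Still curtailed:           ~{curtailed_mwh:,.0f} MWh/yr")
```

The design point is that the compute load is interruptible: it ramps with the excess rather than demanding firm supply, which is what lets it monetize electrons that would otherwise be thrown away.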
The future of computing is renewable. And we’re building it.
Soluna believes that solving the energy demands of the AI revolution is key to its ability to deliver for humanity. Many challenges remain, for sure, but there is a more efficient way to deliver immediate societal wins: something we call the Soluna way, in which we capture and use wasted electricity to deliver cost-effective AI computing today.
The future is bright. The future is Soluna.
To learn more about Soluna, visit → https://soluna.newbird.co/