AI’s energy wake-up call

Hari Subramanian

Opinion

Feb 17, 2026 | 7 mins

Why AI’s insatiable demand for power is a constraint that should matter to enterprise AI strategy.

For the last couple of years, enterprises have doubled and tripled down on investments in AI-related initiatives. The board-level leadership at the Global 1000 enterprises has high expectations of the value that can be unleashed through the power of AI in growing the business as well as driving efficiencies. CIO and CAIO organizations have spent a lot of time and energy on addressing constraints such as data availability, governance and change management to enable value realization through AI initiatives. For the most part, enterprises have assumed unconstrained access to AI infrastructure through hyper-scalers. The exponential reduction in token costs has enabled broader adoption of AI use cases within enterprises.

As generative AI, agentic systems and real-time inference move from pilots into production, CIOs and enterprise AI leaders would be well advised to be cognizant of another constraint: energy availability and its impact on AI cost, scale and risk. Power, cooling and physical infrastructure have not been commonly discussed within CIO organizations as a potential constraint.

This article outlines AI’s energy wake-up call and why CIOs should pay attention to the impact of energy on AI while building their longer-term AI strategy.

AI’s demand for power

According to a study by Bain & Company, AI data centers and their commensurate needs for power are growing exponentially, pushing toward 100 gigawatts of new power demand in the US by 2030. For reference, data center power demand in 2025 is about 30 gigawatts, so this addition represents roughly a 35% CAGR over the next five years.
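The growth rate implied by those two figures can be sanity-checked with simple arithmetic. The 30 GW base and 100 GW of new demand are the figures cited above; the calculation itself is just an illustration.

```python
# Sanity check of the cited figures: ~30 GW of data center power
# demand in 2025 plus ~100 GW of new demand by 2030 implies a
# compound annual growth rate in the mid-30s.
base_2025_gw = 30      # estimated 2025 data center demand
new_demand_gw = 100    # projected new demand added by 2030
years = 5

total_2030_gw = base_2025_gw + new_demand_gw
cagr = (total_2030_gw / base_2025_gw) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 34%
```

The result lands close to the 35% figure the study cites, which suggests the CAGR is measured against total demand, not the new capacity alone.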

Hyper-scalers are investing billions of dollars in a race to build data centers. Investment in data centers is expected to reach $1 trillion within three years, according to a study by Deloitte.

However, they face several challenges, such as:

  • Grid stress. AI data centers create large, concentrated clusters of 24/7 power demand, stressing the grid. Building new transmission lines can take between seven and 10 years.
  • Sources of power are constrained. Currently, most of the demand is fueled by natural gas. There are significant supply challenges with other traditional sources, such as coal, and with renewables, such as wind and solar. Nuclear capacity has not been developed in the US for several decades, adding to the potential energy shortage.
  • Cooling requirements. Data centers use several cooling techniques, such as liquid cooling and air cooling. The massive cooling requirement places enormous stress on water supplies and creates contention with municipal and agricultural water use; in some parts of the country, this has already triggered a political backlash.
  • Regulatory constraints. In some states and municipalities, the permitting process can be lengthy and uncertain.
  • Equipment shortage. Data centers and their power infrastructure depend on many components, from steel and copper to switchgear and transformers. In many instances, demand for these goods far exceeds supply.
  • Talent shortage. The data center industry needs more trade skills, such as electricians and construction workers, and there is a big gap between supply and demand.

There is also a concern about the ROI and economics of the aggressive data center and AI infrastructure build-out. Most of the hyper-scaler and AI infrastructure companies, such as Google and Microsoft, are moving from being asset-light to asset-heavy companies, raising concerns about their long-term valuations.

Dual requirements between training and inference

Data centers and AI infrastructure will also have to deal with two distinct requirements: power-hungry training versus latency-critical inference. Training workloads need high power densities and advanced liquid cooling techniques. Inference workloads need lower power densities, but low latency is critical, which means co-locating with application and data workloads. In addition, inference workloads create a high-volume, “always-on” background load.

According to a McKinsey study, AI compute is moving towards an inference-heavy future with training and inference needs being approximately 50-50 by 2030.

Data centers will have to align with this reality and combine both training and inference capabilities within the same campus.

Mitigation initiatives

1. Government and industry partnership

The industry has worked aggressively with the Federal Government to heighten the awareness of the energy needs for AI and is working on several strategies to mitigate this constraint. Power generation is one area where the US is playing catch-up with China, and there is a concern that this might make all the difference when it comes to winning the AI race.

2. Innovation and investments

  • Chip design. Several architectural techniques, such as in-memory, wafer-scale and photonic, are being developed to drive down energy consumption.
  • Software techniques. Several software techniques can dramatically reduce energy consumption during AI model training without requiring new hardware infrastructure. One example of such a technique is “Hyperparameter optimization and early stopping.”
  • Advanced cooling solutions. These include “Direct to chip liquid cooling” and “immersion cooling,” which have the potential to reduce energy consumption significantly.
  • Investments. Electric and gas utility capex is expected to surpass US $1 trillion cumulatively within the next five years (2025–2029), and most of this is driven by AI demand.
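Among the software techniques mentioned above, early stopping is a simple one to illustrate: halt training once validation loss stops improving, saving the energy the remaining epochs would have burned. The sketch below is framework-agnostic and the loss values are synthetic, chosen only to show the mechanism.

```python
# Minimal early-stopping sketch: stop training when validation
# loss has not improved for `patience` consecutive epochs.
def train_with_early_stopping(val_losses, patience=3):
    """Return the number of epochs actually run before stopping."""
    best = float("inf")
    epochs_without_improvement = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best = loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            return epoch  # stop early; remaining epochs never run
    return len(val_losses)

# Validation loss plateaus after epoch 4, so training halts at
# epoch 7 instead of running all 12 scheduled epochs.
losses = [1.0, 0.7, 0.5, 0.4, 0.41, 0.42, 0.43, 0.44, 0.45, 0.46, 0.5, 0.6]
print(train_with_early_stopping(losses, patience=3))  # 7
```

In this toy run, five of twelve scheduled epochs are skipped, and every skipped epoch is GPU-hours (and kilowatt-hours) not consumed.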

What CIOs should take into consideration

CIOs need to be aware of the correlation described above between energy constraints and the future availability of AI capacity, and take it into account as part of their longer-term AI roadmap. Here are some practical steps CIOs should consider:

Incorporate the impact of energy into their AI ROI models

Power and cooling costs can materially change the economics of AI initiatives over time.

In traditional enterprise IT, power costs were a background consideration. In the AI world, the impact of power and cooling is in the critical path.
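One way to put power and cooling in the critical path of an ROI model is to price them into the cost of a GPU-hour. The sketch below is illustrative only: the accelerator draw, PUE (power usage effectiveness) and electricity tariff are assumed values, not figures from this article.

```python
# Illustrative only: how power and cooling enter an AI cost model.
# gpu_kw, pue and usd_per_kwh are assumptions for the sketch.
def energy_cost_per_gpu_hour(gpu_kw, pue, usd_per_kwh):
    """Facility energy cost of one GPU-hour; PUE captures the
    cooling and power-distribution overhead on top of IT load."""
    return gpu_kw * pue * usd_per_kwh

# Example: a 0.7 kW accelerator in a facility with PUE 1.3
# at an industrial tariff of $0.10/kWh.
cost = energy_cost_per_gpu_hour(gpu_kw=0.7, pue=1.3, usd_per_kwh=0.10)
print(f"${cost:.3f} per GPU-hour")  # $0.091
```

Multiplied across thousands of accelerators running around the clock, even a few cents per GPU-hour of energy cost becomes a first-order line item in the business case, and a higher PUE or tariff moves it directly.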

Demand energy transparency from vendors

CIOs should ask cloud and colocation partners about regional constraints, power sourcing and expansion timelines. Data centers designed before large-scale GPU adoption may have scalability limitations.

Coordinate beyond IT

CIOs should involve facilities, procurement, finance and sustainability teams; their input can materially affect AI success.

Plan for resilience, not just scale

If all AI workloads live in one environment, risk is concentrated. Hybrid models become more relevant for energy predictability and risk management.

Mission-critical AI systems may justify on-site generation or microgrid investments.

As enterprises move toward adopting AI at scale, CIOs, CAIOs and other leaders should pay close attention to data center economics, which are driven significantly by energy considerations. This will let them tune their AI strategy for more predictable costs and deliver the ROI the business expects.

This article is published as part of the Foundry Expert Contributor Network.

Hari Subramanian

Hari Subramanian is the co-founder and CEO of Jeeno Technologies, an AI services and platform company that focuses on building responsible AI systems and business process automation. Previously, he led engineering for the artificial intelligence center of enablement at a Fortune 15 global healthcare organization in the US. In this role, he leveraged generative AI to deliver business value at scale to drive business growth and operational efficiencies. As an early adopter of AI governance practices, Hari led the building of a technical platform for AI governance across the enterprise.

During his career at this enterprise, he has led data integration, data engineering and claim platform engineering functions to drive digital transformation. Hari also brings a wealth of technology consulting experience across healthcare, life sciences and financial services domains. He has advised several start-up and scale-up enterprises on strategy and operations. Hari holds a bachelor’s degree in engineering from Anna University, India. In addition, he has certifications in executive leadership from Cornell University and in application of generative AI from MIT.
