
AI’s Power Problem: Energy Is the Next Big Bottleneck

Author: PPD Team | Date: July 29, 2025

[Featured image: A robotic hand reaching towards a digital network of glowing data nodes, symbolizing artificial intelligence and its energy-intensive infrastructure.]

For a long time, people speculated about how much energy a single ChatGPT query uses. In June, OpenAI CEO Sam Altman finally provided an answer. In his blog post The Gentle Singularity, he wrote that the average query consumes about 0.34 watt-hours, roughly what an oven uses in a little over one second, or what an efficient lightbulb uses in a few minutes. Each query also uses about 0.000085 gallons of water, about one-fifteenth of a teaspoon.

That number may seem small, but ChatGPT handles hundreds of millions of queries every day, so the cumulative demand is substantial. Altman calls this a “flywheel of compounding infrastructure buildout.” As more people use artificial intelligence (AI), companies build more data centres. As those systems grow more powerful, they require more electricity, more water, and more hardware, fuelling further demand.
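To see how quickly the small per-query figures compound, here is a rough back-of-envelope sketch. Altman’s 0.34 Wh figure is from his post; the one-billion-queries-per-day volume is an assumed round number for illustration, not an OpenAI disclosure.

```python
# Back-of-envelope: scaling Altman's 0.34 Wh/query figure to a daily total.
# The per-query figure is from "The Gentle Singularity"; the query volume
# is an assumed round number for illustration, not an OpenAI disclosure.
WH_PER_QUERY = 0.34
QUERIES_PER_DAY = 1_000_000_000  # assumed: roughly one billion queries/day

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh
avg_load_mw = daily_mwh / 24                      # continuous equivalent draw

print(f"{daily_mwh:,.0f} MWh per day")        # ~340 MWh/day
print(f"~{avg_load_mw:.0f} MW average load")  # ~14 MW running around the clock
```

Under these assumptions, a single consumer chatbot draws the continuous output of a small power plant, before counting model training at all.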

In May 2025, the global venture capital firm BOND released a report titled Trends – Artificial Intelligence. According to the report, AI is reshaping global economies, labour markets, and infrastructure. It is expected to contribute $15.7 trillion to global GDP by 2030.

AI’s power problem

Former Google CEO Eric Schmidt recently warned that electricity, not semiconductors, will be the key constraint on AI development. He said the United States alone may require an additional 92 gigawatts of electricity to support AI growth, roughly the output of ninety large one-gigawatt nuclear reactors.
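A quick sketch of what 92 GW means in reactor terms, assuming a typical large reactor supplies about one gigawatt (real reactors range from roughly 0.5 to 1.6 GW):

```python
# Rough conversion of Schmidt's 92 GW figure into reactor equivalents.
# Assumes a typical large reactor delivers about 1 GW of capacity;
# this is an illustrative assumption, not a quoted specification.
ADDITIONAL_DEMAND_GW = 92
GW_PER_REACTOR = 1.0  # assumed typical output of one large reactor

reactors_needed = ADDITIONAL_DEMAND_GW / GW_PER_REACTOR
print(f"~{reactors_needed:.0f} one-gigawatt reactors")  # ~92 reactors
```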

Sam Altman shares that view. As data centre production becomes more automated, he writes, the cost of intelligence will eventually approach the cost of electricity. Energy availability, not algorithmic capability, will determine who gets access to advanced AI and how widely it can be deployed.

This is already shaping corporate strategy. Microsoft, for example, is looking to restart decommissioned facilities, including the Three Mile Island nuclear plant, which it aims to relaunch by 2028. Schmidt also pointed to environmental impacts: Microsoft’s water use rose 34 per cent in a year, a trend researchers link directly to increasing AI workloads.

Altman has invested in Helion, a nuclear fusion startup, and companies like AMD and Microsoft are lobbying US policymakers to fast-track energy permits. The energy arms race has begun.

Environmental consequences

Environmental groups, including Greenpeace, warn that unchecked AI expansion could undermine global climate goals. AI is often promoted as a tool to accelerate green innovation. But unless its energy footprint is considered, it risks doing the opposite.

The long-term vision Altman describes is one where intelligence and energy are both abundant. Robots build robots, data centres replicate themselves, and innovation compounds. But this vision requires massive upfront investment in power infrastructure, and it carries significant environmental trade-offs.

Some argue that renewable energy can meet these needs. But scaling solar, wind, and storage systems at this pace is not simple. All have manufacturing-related carbon footprints and require large land areas and rare minerals. The world is not just adding AI to existing energy demand; it is adding the equivalent of several new countries’ worth of electricity and water consumption.

Who pays for the infrastructure?

The challenge is not only how much energy AI needs, but where it comes from, how it is delivered, and who pays for it.

Scaling energy supply requires land, raw materials, and grid infrastructure. Transmission lines need to be extended to reach remote data centres. These upgrades are expensive.

The placement of AI infrastructure is strategic. Data centres do not need to be near cities or people. They can be located wherever land and electricity are cheapest. Because electricity prices vary by region, companies often build in export-constrained transmission areas, regions with high power generation and low local demand.

In contrast, import-constrained areas, such as large cities, face high demand and limited supply, which raises prices. This creates a strong incentive for companies to site facilities in rural or industrial zones where land and energy are cheaper, and public resistance is lower.

Even in low-cost areas, data centres compete with existing users for water and grid capacity. Cooling AI systems requires large amounts of fresh water. Microsoft’s increase in water consumption is not an isolated case. And as workloads grow, so will the pressure on shared environmental resources.

Will efficiency solve the problem?

Some believe energy use will level off as the AI market matures. Once a few dominant models emerge, companies will build dedicated hardware optimised for those models. This could sharply reduce the energy required for each query.

But there is a catch. The field is still moving too quickly. Building custom chips at scale is risky when the underlying software may soon change. A new algorithm or architecture could leave expensive hardware obsolete. For now, most companies rely on general-purpose GPUs, which are far less efficient.

Even with better hardware, total electricity use is expected to rise.

A global benchmark

According to the International Energy Agency (IEA), data centres consumed about 415 terawatt-hours (TWh) of electricity in 2024. That was already about 1.5 per cent of global electricity demand. In its July 2025 report, the IEA projected that this figure could more than double by 2030, reaching 945 TWh, roughly equivalent to Japan’s current electricity consumption.
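The IEA figures imply a steep compound growth rate. A short sketch of the arithmetic, taking both endpoints from the report and assuming smooth year-on-year compounding:

```python
# Implied annual growth rate behind the IEA projection: data centre
# electricity demand rising from 415 TWh in 2024 to 945 TWh in 2030.
# The smooth compounding path between the two endpoints is an assumption.
twh_2024, twh_2030 = 415, 945
years = 2030 - 2024

cagr = (twh_2030 / twh_2024) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # ~14.7% per year
```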

AI inference and training workloads are driving this growth. Unlike traditional IT services, which have relatively stable demand, AI energy use scales with user growth and model complexity. Even if per-query consumption falls, the overall trajectory points upward.

Water demand is following a similar curve. A few drops per query can add up quickly when scaled across billions of queries per day. The largest data centres require millions of litres of clean water for cooling every day.
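To put “a few drops per query” in perspective, a hedged sketch using Altman’s per-query water figure and the same assumed billion queries a day as above:

```python
# Water at query scale: Altman's 0.000085 gallons/query multiplied
# across an assumed one billion queries a day. The query volume is
# illustrative, not a company disclosure.
GALLONS_PER_QUERY = 0.000085
LITRES_PER_GALLON = 3.785
QUERIES_PER_DAY = 1_000_000_000  # assumed

daily_litres = GALLONS_PER_QUERY * QUERIES_PER_DAY * LITRES_PER_GALLON
print(f"~{daily_litres:,.0f} litres per day")  # ~322,000 litres/day
```

Even at these assumed volumes, a single chatbot’s cooling footprint runs to hundreds of thousands of litres of fresh water a day.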

India’s AI push and the infrastructure it will demand

The IndiaAI Mission, approved in 2024 with an allocation of Rs 10,300 crore, aims to create one of the world’s largest public AI compute infrastructures. Its first phase has already made 10,000 GPUs available, with plans to expand to nearly 19,000, close to two-thirds of what is estimated to support ChatGPT today. It will underpin the development of indigenous language models and AI tools tailored to Indian contexts.

A key focus is affordable access. Through a new national compute facility, startups and researchers will be able to access GPU time at just Rs 100 per hour, far lower than global market rates. India has also launched an open GPU marketplace and initiated plans to build its own GPUs within five years. Meanwhile, five semiconductor fabs are under construction to reduce long-term dependence on imports.
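For a sense of scale on the subsidy, a rough comparison. Both the exchange rate and the global hourly rate below are assumptions for illustration; actual GPU prices vary widely by chip and provider.

```python
# Rough comparison of the IndiaAI subsidised rate with a ballpark global
# on-demand GPU price. Both the exchange rate and the global hourly rate
# are assumptions, not quoted market prices.
INR_PER_HOUR = 100
INR_PER_USD = 86            # assumed exchange rate
GLOBAL_USD_PER_HOUR = 2.50  # assumed ballpark for on-demand data centre GPUs

india_usd_per_hour = INR_PER_HOUR / INR_PER_USD
print(f"IndiaAI: ${india_usd_per_hour:.2f}/hr vs ~${GLOBAL_USD_PER_HOUR:.2f}/hr globally")
# ~$1.16/hr, roughly half or less of a typical market rate
```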

This compute buildout, however, will place new demands on India’s power and water infrastructure. Running and cooling these data centres will require reliable electricity and fresh water in large volumes. The BOND report estimates that global data centre capital expenditure reached $455 billion in 2024, yet less than 10 per cent of global data centre capacity sits in developing countries like India. While the IndiaAI Mission focuses on access and affordability, there is still little clarity on how its energy needs will be met, particularly if India aims to keep these compute platforms carbon-neutral or water-efficient.

Beyond infrastructure, India is investing in foundation models, including large language models (LLMs) and multimodal systems, along with AI tools that support dozens of Indian languages.

India’s open data policies, centres of excellence, and digital public infrastructure model have drawn global attention. At the same time, its AI-skilled workforce is growing rapidly. According to Stanford’s AI Index, India ranks first in global AI skill penetration and has seen a 263 per cent increase in AI talent since 2016.  

India’s vision is clear: to become a global AI powerhouse. But meeting that goal will require not just GPUs and models, but a parallel expansion of power generation, water management, and grid capacity to support this new layer of digital infrastructure. 

The real cost of intelligence

The promise of cheap intelligence comes with infrastructure and environmental trade-offs that are anything but cheap. The cost will not be measured only in dollars or power bills. It will show up in land use, water stress, and public subsidies.

If energy becomes the main constraint on AI, the question is not just how to supply more of it. It is who gets to use it, and on what terms. 

AI’s rapid scaling also raises geopolitical concerns. The United States currently leads in proprietary models and chips, while China dominates robotics and open-source efforts. India is positioning itself as an AI shaper, focusing on norms, innovation, and multilingual tools.

The featured photograph is for representation only.


