According to a report from the MIT Energy Initiative’s annual research symposium, U.S. electricity demand had been flat until recently, but the growing AI-ification of nearly everything is driving it sharply upward. The report says that computing centers currently use about four percent of the country’s electric supply, and that share may rise to 12 to 15 percent within the next few years. “Vijay Gadepally, Senior Scientist at MIT’s Lincoln Lab, emphasized the scale of AI’s consumption: ‘The power required for sustaining some of these large models is doubling almost every three months,’ he noted. ‘A single ChatGPT conversation uses as much electricity as charging your phone, and generating an image consumes about a bottle of water for cooling.’”
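To put the quoted doubling rate in perspective: a quantity that doubles every three months grows sixteenfold in a year. Here is a quick sanity check of that arithmetic — only the three-month doubling period comes from the quote above; the one-year horizon is chosen purely for illustration.

```python
# Compound growth implied by "doubling almost every three months".
# Only the three-month doubling period comes from the quoted claim;
# the twelve-month horizon is an arbitrary choice for illustration.

doubling_period_months = 3
horizon_months = 12

doublings = horizon_months / doubling_period_months  # 4 doublings in a year
annual_multiplier = 2 ** doublings

print(f"Growth over {horizon_months} months: {annual_multiplier:.0f}x")  # 16x
```

Even if the real doubling period were twice as long, the annualized growth would still be fourfold — which is the crux of the supply problem discussed below.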
“A ChatGPT query, for example, uses 10 times more energy than a standard Google query, said David Porter, a vice president at the Electric Power Research Institute (EPRI).” [Quoted in Time]
In my opinion, there is no doubt there will be a huge increase in demand — I don’t think there’s anything we can do about that in the near future, despite ongoing efforts to make data centers less power-hungry. For example, Rice University researchers have proposed a method of capturing the waste heat produced by data centers and converting it back to electricity. But for now, I think that’s a losing battle: demand for energy will grow faster than energy-saving measures can offset it.
And paradoxically, Sasha Luccioni, an AI researcher and the climate lead at the AI platform Hugging Face, believes that any hardware efficiency gains may be offset by the Jevons Paradox, named after a 19th-century British economist who noticed that as steam engines became more efficient, Britain’s appetite for coal actually increased. “The more a resource becomes more efficient, the more people will use it,” she said.
In the meantime, our most immediate challenge is supplying the power — we have to focus on the supply side rather than the demand side, because there’s no stopping the growing demand. According to the Time article, there isn’t even enough power generation or transmission capacity to fuel the data centers already in the pipeline. We can’t solve the problem by lowering the demand for power; we will have to figure out ways to supply it.
But there is no consensus about how to go about doing that. Each of the proposed solutions faces many obstacles. At this point, I can’t see how any of them can be worked out in enough time to keep up with the need. In my opinion, although there are plenty of worthwhile avenues to pursue, the rate at which data-hungry AI is growing is much faster than the time it will take to implement any of the plausible solutions.
One of the serious problems is driven by the economics of building and operating data centers. As an EPRI white paper explains, “While the national-level growth estimates are significant, it is even more striking to consider the geographic concentration of the industry and the local challenges this growth can create. Today, 15 states account for 80 percent of the national data center load.” As the Time article points out, it is cheaper for companies to build data centers in places with robust power sources and existing infrastructure, so many of them cluster together. But that leads to all sorts of problems, such as rolling brownouts in already heavily loaded neighborhoods, drained water resources for cooling, and rising utility rates.
After decades of work as an EE, SAE Media Group’s Ed Brown is well into his second career: Tech Editor.
“I realized, looking back to my engineering days and watching all of the latest and greatest as an editor, I have a lot of thoughts about what’s happening now in light of my engineering experiences, and I’d like to share some of them now.”
According to the Time article, “Data centers are now the ‘number one issue we hear about from our constituents,’ said Ian Lovejoy, a Republican state delegate in Virginia. Aside from quality-of-life concerns, he says that local politicians and residents are worried about data centers threatening electricity and water access, as well as the idea that taxpayers may have to foot the bill for future power lines.”
Microgrids
One proposed solution is to use microgrids for powering data centers. And I think that’s a good idea — it won’t stress the grid. When designing a new data center, the feasibility of a localized independent source of energy should be considered as an integral part of the design spec.
An online white paper makes that case quite well: “Microgrids offer on-site generation that integrates renewable energy sources and lowers a facility’s carbon footprint. They can also optimize energy use while enhancing power stability, reducing reliance on the grid during peak demand when costs are highest.
“For data centers, controlling their energy supply is not just about cost but a strategic necessity. Microgrids, especially ones that include storage, can help data centers meet their commercial and climate objectives in three ways:
- Microgrids allow data centers to operate independently of the main grid during outages or disruptions.
- Energy costs are a significant operational expense for data centers. Storing energy during off-peak hours and using it during peak demand periods can save money and improve efficiency.
- Many data center operators have committed to ambitious climate goals, including achieving carbon neutrality or running entirely on renewable energy. Microgrids can replace traditional diesel backup systems with cleaner sources such as natural gas. They also enable data centers to integrate renewable sources, such as solar or wind power, to offset their carbon footprint.”
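The second point in the list above — storing energy off-peak and discharging it at peak — can be made concrete with a back-of-the-envelope calculation. Every number below (tariffs, battery capacity, round-trip efficiency) is hypothetical, chosen only to illustrate the mechanism, not taken from the white paper.

```python
# Back-of-the-envelope: daily savings from charging a battery during
# off-peak hours and discharging it during peak hours.
# All figures are hypothetical, for illustration only.

PEAK_RATE = 0.18       # $/kWh, hypothetical peak tariff
OFF_PEAK_RATE = 0.07   # $/kWh, hypothetical off-peak tariff
BATTERY_KWH = 10_000   # usable storage capacity, kWh (hypothetical)
ROUND_TRIP_EFF = 0.88  # fraction of stored energy actually recovered

def daily_arbitrage_savings(capacity_kwh, peak, off_peak, efficiency):
    """Savings = peak-rate cost avoided minus off-peak cost of charging."""
    energy_delivered = capacity_kwh * efficiency  # kWh served during peak
    cost_to_charge = capacity_kwh * off_peak      # pay off-peak rate to fill up
    peak_cost_avoided = energy_delivered * peak   # grid cost that was displaced
    return peak_cost_avoided - cost_to_charge

savings = daily_arbitrage_savings(BATTERY_KWH, PEAK_RATE, OFF_PEAK_RATE,
                                  ROUND_TRIP_EFF)
print(f"Hypothetical daily savings: ${savings:,.2f}")
```

The point of the sketch is that the arbitrage margin depends on the peak/off-peak spread and on round-trip efficiency — if efficiency drops or the spread narrows, the savings can evaporate, which is why storage economics are so site-specific.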
I agree with all of that in theory, but actually doing it will require a lot of up-front investment. And if renewables are used, storage at a large enough scale to keep a data center running is, to say the least, a major problem.
The Nuclear Option
One much-discussed option is bringing nuclear reactors back online. A U.S. Department of Energy article from April of this year outlines “some hurdles to clear.” They point out that new reactors will take time to build. And “Most of the costs of nuclear plants are up-front capital construction costs. Initial deployments can carry a high price tag, posing a potential barrier to nuclear energy ramping up to power the data center revolution.” Not to mention the problem of dealing with spent nuclear fuel.
And I know that experts in the field maintain that current reactor designs are many times safer than the old ones. OK, I’ll give them that; so maybe there’s 99.9999 percent certainty that no bad accident will happen. But that means that there is a 0.0001 percent chance that it will. With a normal power plant, I’d say those are great odds, but with a nuclear plant, the consequences of an accident are frightening.
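My worry here is really about how small per-plant odds compound across a whole fleet over decades. A minimal sketch of that compounding, assuming a hypothetical per-reactor, per-year accident probability matching the “99.9999 percent” figure above, independence between reactors, and fleet size and time horizon chosen only for illustration:

```python
# Probability of at least one serious accident across a fleet of reactors
# over many years, assuming independent, identical per-reactor annual risk.
# All input values are hypothetical, for illustration only.

def prob_at_least_one(p_annual, n_reactors, years):
    """P(at least one event) = 1 - P(no event at any reactor in any year)."""
    p_none = (1.0 - p_annual) ** (n_reactors * years)
    return 1.0 - p_none

p = 1e-6      # hypothetical per-reactor, per-year accident probability
fleet = 300   # hypothetical number of reactors
horizon = 40  # hypothetical years of operation

print(f"Fleet-wide probability over {horizon} years: "
      f"{prob_at_least_one(p, fleet, horizon):.4%}")
```

With those made-up inputs, a one-in-a-million annual risk per reactor compounds to roughly a one-percent chance of at least one event fleet-wide over the horizon — still small, but not at all the same number as the per-plant figure.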
Last Thoughts
It’s a recurring theme with me that solving technical problems is not enough. Actually implementing the technical solutions is a major hurdle. For powering AI, however, the problems are multiplied many times over. Not only are the non-technical hurdles huge, but even the technical solutions are not so clear. And given the multiple problems, it seems to me that there’s a great need for planning and coordination, which I don’t see happening at this moment.
Finally, I tell myself that there could be one bright side to this — it just might motivate the build-out of our renewable infrastructure.

