MIT Technology Review’s new analysis provides a comprehensive look at the energy consumption of artificial intelligence systems, revealing significant environmental impacts that are often overlooked. The research, part of the “Power Hungry: AI and our energy future” series, examined AI’s energy demands down to individual queries and traced the industry’s expanding carbon footprint.
According to the analysis, data centers doubled their electricity consumption between 2017 and 2023, with AI hardware driving much of this increase. Data centers currently consume 4.4% of all US electricity, and projections from Lawrence Berkeley National Laboratory suggest that by 2028, AI alone could use as much electricity as 22% of all US households.
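To put the 22%-of-households projection in rough perspective, a back-of-envelope calculation is possible. The household count (~132 million) and average annual household use (~10,600 kWh) below are illustrative assumptions drawn from typical US figures, not numbers from the article itself:

```python
# Back-of-envelope scale check for the "22% of US households" projection.
# US_HOUSEHOLDS and KWH_PER_HOUSEHOLD_YEAR are illustrative assumptions,
# not figures reported in the article.
US_HOUSEHOLDS = 132_000_000
KWH_PER_HOUSEHOLD_YEAR = 10_600
AI_SHARE = 0.22  # projected AI share, per the Lawrence Berkeley figure

# Convert kWh to terawatt-hours (1 TWh = 1e9 kWh).
ai_twh = US_HOUSEHOLDS * KWH_PER_HOUSEHOLD_YEAR * AI_SHARE / 1e9
print(f"Implied AI electricity demand: ~{ai_twh:.0f} TWh/year")
```

Under those assumed inputs, the projection implies on the order of 300 TWh of annual AI electricity demand; the point of the sketch is the order of magnitude, not the precise value.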
The study found that AI data centers tend to use electricity with 48% higher carbon intensity than the US average, partly because they’re often located in regions with coal-heavy power grids. While tech companies like Meta, Amazon, and Google have pledged to increase nuclear power use, fossil fuels still dominate the energy mix powering AI infrastructure.
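The 48% figure translates directly into an implied carbon intensity for AI data-center electricity. The US-average intensity used below (~0.37 kg CO2 per kWh) is an illustrative assumption, not a value from the article:

```python
# Implied carbon intensity of AI data-center electricity, using the
# article's "48% higher than the US average" finding.
# US_AVG_KG_CO2_PER_KWH is an illustrative assumption, not from the article.
US_AVG_KG_CO2_PER_KWH = 0.37

ai_intensity = US_AVG_KG_CO2_PER_KWH * 1.48  # 48% above the US average
print(f"Implied AI data-center intensity: ~{ai_intensity:.2f} kg CO2/kWh")
```

The same multiplier applies whatever baseline one assumes, which is why siting data centers on coal-heavy grids compounds emissions even when total energy use is unchanged.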
Researchers warn that AI’s energy demands will grow substantially as the technology evolves beyond simple queries to more complex applications such as AI agents, voice interactions, and reasoning models. According to AI researcher Sasha Luccioni, “The precious few numbers that we have may shed a tiny sliver of light on where we stand right now, but all bets are off in the coming years.”
As AI becomes integrated into more aspects of daily life, from customer service to healthcare, transparency about its energy requirements and emissions becomes increasingly critical for planning sustainable AI development.