A comprehensive analysis published by Andy Masley argues that individual use of ChatGPT and other large language models (LLMs) has a minimal environmental impact. The analysis estimates that a single ChatGPT query consumes approximately 3 watt-hours of energy, roughly equivalent to watching 10 seconds of streaming video or running a space heater for 2.5 seconds.
The analysis contends that concerns about AI’s environmental footprint have been largely overstated. While ChatGPT’s total operation uses energy equivalent to that of roughly 20,000 American households, this figure must be set against its 300 million daily users. For comparison, video streaming services such as Netflix consume energy equivalent to about 800,000 households.
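A back-of-envelope calculation makes the per-user share concrete. This sketch uses the two figures quoted above plus one outside assumption not taken from the article: that an average American household uses roughly 29 kWh of electricity per day.

```python
# Per-user energy share of ChatGPT's total operation,
# from the figures quoted above.
HOUSEHOLD_KWH_PER_DAY = 29        # assumption: avg. US household electricity use
EQUIVALENT_HOUSEHOLDS = 20_000    # from the article
DAILY_USERS = 300_000_000         # from the article

total_wh_per_day = EQUIVALENT_HOUSEHOLDS * HOUSEHOLD_KWH_PER_DAY * 1_000
wh_per_user_per_day = total_wh_per_day / DAILY_USERS
print(f"{wh_per_user_per_day:.2f} Wh per user per day")  # → 1.93 Wh per user per day
```

Under that assumption, each daily user's share of ChatGPT's total operation comes to under 2 watt-hours per day, which is consistent with the article's framing that individual use is a rounding error in a household's energy budget.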
Regarding water usage, each ChatGPT query requires about 30 milliliters of water, primarily for data-center cooling. This is far less than many everyday activities: producing a single hamburger requires as much water as roughly 20,000 ChatGPT queries.
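The hamburger comparison can be checked directly from the two numbers given above; this sketch simply multiplies them out to show the implied water footprint per hamburger.

```python
# Water implied per hamburger by the article's two figures.
ML_PER_QUERY = 30            # from the article
QUERIES_PER_BURGER = 20_000  # from the article

liters_per_burger = ML_PER_QUERY * QUERIES_PER_BURGER / 1_000  # mL → L
print(f"{liters_per_burger:.0f} liters per hamburger")  # → 600 liters per hamburger
```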
The analysis also addresses the environmental cost of training AI models. GPT-4’s initial training consumed energy equivalent to 200 flights between New York and San Francisco, but this one-time cost is amortized across billions of subsequent queries. Calculated per query, training adds only about 0.3 watt-hours.
Masley argues that focusing on individual AI use distracts from more significant climate challenges. He emphasizes that major environmental improvements will come from systematic changes in energy infrastructure rather than limiting personal use of digital tools.