Google Says AI Prompts Use Less Energy Than Watching 9 Seconds of TV
New Google study shows Gemini AI queries consume minimal energy, far lower than earlier environmental impact estimates.
Google said Friday that a typical query to its Gemini artificial intelligence assistant uses less electricity than keeping a television on for nine seconds, releasing new research that shows the technology’s environmental footprint is far lower than many previous estimates.
The company’s new technical paper estimates that the median Gemini text prompt consumes 0.24 watt-hours of energy, emits 0.03 grams of carbon dioxide equivalent and uses 0.26 milliliters of water, or about five drops.
A narrower calculation that counts only the active chips puts the figures at 0.10 Wh of energy, 0.02 gCO2e and 0.12 mL of water.
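The headline comparisons are straightforward to sanity-check. A rough back-of-the-envelope version, assuming a television drawing about 100 watts and a drop volume of roughly 0.05 mL (neither figure is stated in the paper):

```python
# Rough arithmetic behind the comparisons above; not taken from the paper itself.
PROMPT_WH = 0.24   # median Gemini text prompt, comprehensive estimate (Wh)
TV_WATTS = 100     # assumed power draw of a typical flat-panel TV (W)

tv_seconds = PROMPT_WH / TV_WATTS * 3600
print(f"One prompt ~= {tv_seconds:.1f} s of TV at {TV_WATTS} W")  # ~8.6 seconds

DROP_ML = 0.05     # assumed volume of a single drop of water (mL)
print(f"0.26 mL ~= {0.26 / DROP_ML:.0f} drops")                   # ~5 drops
```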
“Measuring AI’s real impact means looking beyond just the active machines,” the study said. “Idle capacity, CPUs, memory and cooling systems all play a critical role in true operating efficiency.”
Efficiency Gains Over Time
The report highlights steep reductions in Gemini’s footprint over the past year. Between May 2024 and May 2025, the energy required per prompt fell 33-fold and the total carbon footprint dropped 44-fold, even as the system produced faster and higher-quality responses.
Google attributed the gains to advances in software, hardware and operations. Speculative decoding lets a smaller model draft initial token predictions that a larger model then verifies, reducing the number of expensive processing steps.
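The general idea can be illustrated with a toy greedy version of the technique; the sketch below is not Gemini’s implementation, and the `draft_next` and `target_next` functions are stand-ins for any cheap/expensive model pair:

```python
# Toy sketch of speculative decoding (greedy variant), with stand-in "models".

def draft_next(tokens):
    """Cheap draft model: quickly guesses the next token."""
    return (tokens[-1] + 1) % 50                     # toy rule

def target_next(tokens):
    """Expensive target model: the authoritative prediction."""
    return (tokens[-1] + 1) % 50 if tokens[-1] % 7 else 0   # occasionally disagrees

def speculative_decode(prompt, k=4, max_new=16):
    tokens = list(prompt)
    while len(tokens) < len(prompt) + max_new:
        # 1. The draft model cheaply proposes k tokens in sequence.
        ctx, proposal = list(tokens), []
        for _ in range(k):
            t = draft_next(ctx)
            proposal.append(t)
            ctx.append(t)
        # 2. The target model verifies the proposals. In a real system this is
        #    one batched forward pass rather than k sequential ones, which is
        #    where the savings come from.
        ctx, accepted = list(tokens), []
        for t in proposal:
            expected = target_next(ctx)
            if t == expected:
                accepted.append(t)
                ctx.append(t)
            else:
                accepted.append(expected)  # target's own token replaces the miss
                break
        tokens.extend(accepted)
    return tokens[:len(prompt) + max_new]

print(speculative_decode([3]))
```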
Distillation transfers the capabilities of larger models into smaller, optimized ones, such as Gemini Flash and Flash-Lite, which are less resource-intensive to run.
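In general terms, distillation trains a small “student” model to match a larger “teacher’s” output distribution. A minimal sketch of the standard soft-label objective (not Google’s training code; the logits and temperature are illustrative):

```python
# Minimal sketch of knowledge distillation's soft-label loss, in pure Python.
import math

def softmax(logits, temperature=1.0):
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    return [e / sum(exps) for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the softened teacher and student distributions."""
    p = softmax(teacher_logits, temperature)   # teacher: the large model
    q = softmax(student_logits, temperature)   # student: the small model
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy example: a student that roughly mimics its teacher yields a small loss.
teacher = [2.0, 1.0, 0.1]
student = [1.8, 1.1, 0.2]
print(f"distillation loss = {distillation_loss(teacher, student):.4f}")
```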
The company has also invested in custom-built Tensor Processing Units (TPUs), which it says deliver much higher performance per watt than general-purpose chips. Its newest version, Ironwood, is 30 times more energy-efficient than the first publicly available TPU.
Challenging Industry Estimates
Public estimates of AI’s footprint have varied widely, with some studies suggesting that a single prompt could consume as much energy as boiling a kettle.
Google said many such calculations overlook critical factors, including the energy used by idle machines provisioned to ensure reliability and the overhead from data center cooling and power distribution.
By including those factors, the company said its methodology provides a more realistic picture of AI’s operational footprint.
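The gap between the two published figures gives a sense of how much that overhead matters. Using only the numbers reported above (the paper’s detailed breakdown across idle capacity, host hardware and facility overhead is not itemized here):

```python
# Implied overhead multiplier, using only the two figures Google published.
ACTIVE_CHIP_WH   = 0.10   # narrow estimate: active AI accelerator chips only
COMPREHENSIVE_WH = 0.24   # full estimate: adds idle capacity, CPUs, memory, cooling

multiplier = COMPREHENSIVE_WH / ACTIVE_CHIP_WH
print(f"Comprehensive estimate is {multiplier:.1f}x the chip-only figure")  # 2.4x
```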
“We believe this is the most complete view of AI’s overall footprint,” the paper said, adding that it hoped other companies would adopt similar standards to allow for fair comparisons.
Push for Responsible AI
The findings come as AI use surges worldwide, raising concerns about its environmental impact alongside its economic potential.
Google’s data centers now operate at a fleet-wide average power usage effectiveness of 1.09, among the best in the industry.
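Power usage effectiveness is the ratio of a facility’s total energy draw to the energy delivered to its computing equipment, so a PUE of 1.09 implies very little goes to overhead. A quick illustration (the 1 MWh figure is arbitrary):

```python
# Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
PUE = 1.09

it_energy_mwh = 1.0                      # arbitrary example: 1 MWh of computing
total_energy_mwh = it_energy_mwh * PUE   # what the facility draws overall
overhead_mwh = total_energy_mwh - it_energy_mwh

print(f"Overhead per MWh of compute: {overhead_mwh:.2f} MWh "
      f"({overhead_mwh / total_energy_mwh:.1%} of the total)")  # 0.09 MWh, ~8.3%
```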
The company is also pursuing 24/7 carbon-free energy across its operations and aims to replenish 120 percent of the freshwater it consumes. Google said that by reducing the energy intensity of its AI systems, it has also cut the amount of water used for cooling.
“AI demand is growing, and with it the responsibility to reduce its power and water needs,” the study said. “We aim to drive industry-wide progress toward more efficient AI.”