UNESCO Warns Generative AI May Drain Global Energy, Water Resources
UNESCO urges urgent reforms to curb AI’s soaring energy and water use, highlighting risks to global sustainability.
Generative artificial intelligence is rapidly reshaping global digital infrastructure, but it is also accelerating environmental degradation through massive energy and water consumption, UNESCO said in a new report calling for urgent reforms and a “clean by design” approach.
The report, titled “Smarter, Smaller, Stronger: Resource-Efficient Generative AI & the Future of Digital Transformation,” highlights how training and operating large language models like ChatGPT could rival the energy usage of some nations while placing increasing strain on critical natural resources.
AI Boom Driving Energy Surge
The energy needed to train state-of-the-art generative models now reaches about 50 gigawatt-hours per model, roughly equal to the annual electricity use of several developing countries. However, the greater concern is not training, but inference — the phase in which users interact with models.
As of June 2025, ChatGPT receives about 1 billion user prompts daily, each consuming approximately 0.34 watt-hours, according to OpenAI CEO Sam Altman. That translates to an estimated 124 GWh annually, equivalent to the yearly electricity consumption of roughly 1.3 million people in Ethiopia.
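The annual total follows from straightforward arithmetic. The sketch below reproduces it using only the figures cited above; the per-person value it prints is simply what the 1.3 million comparison implies, not an independently sourced statistic.

```python
# Back-of-the-envelope check of the inference-energy figures cited above.
# Inputs come directly from the article; the per-person number is the value
# implied by the article's own comparison, not an external statistic.

PROMPTS_PER_DAY = 1_000_000_000   # ~1 billion ChatGPT prompts per day
WH_PER_PROMPT = 0.34              # ~0.34 Wh per prompt (Altman's figure)

daily_wh = PROMPTS_PER_DAY * WH_PER_PROMPT   # watt-hours per day
annual_gwh = daily_wh * 365 / 1e9            # Wh -> GWh over a year

print(f"Annual inference energy: {annual_gwh:.0f} GWh")   # ~124 GWh

# Per-person consumption implied by the "1.3 million people" comparison
implied_kwh = annual_gwh * 1e6 / 1_300_000   # GWh -> kWh
print(f"Implied consumption: {implied_kwh:.0f} kWh per person per year")
```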
AI compute demand is doubling every 100 days, the report noted, while data center energy use has risen 12 percent annually since 2017, four times faster than overall electricity consumption.
Hidden Water Footprint
Data centers supporting AI also depend heavily on water for cooling, much of it potable. Projections indicate that global water usage from companies like Google, Microsoft and Meta could reach 6.6 billion cubic meters by 2027, exceeding Denmark’s annual consumption.
“AI’s expansion is not just an energy issue—it’s a water, resource and equity issue,” the report warned.
Risks for the Global South
The benefits of generative AI remain concentrated in well-resourced nations. As of 2024, some 2.6 billion people — 32 percent of the global population — remain offline.
In Africa, only 5 percent of AI researchers have sufficient computing access, and the continent hosts less than 1.5 percent of global data center capacity.
The proliferation of energy-intensive AI in low-resource areas also raises risks of deepening social and environmental inequalities.
Rethinking the AI Lifecycle
The report dissects the full AI lifecycle — spanning inception, development, deployment, operation and retirement — and identifies inference as the most critical energy choke point.
“Training is energy-intensive, but inference occurs billions of times per day. Its cumulative impact is far larger,” the report stated.
To address this, UNESCO proposes embedding efficiency directly into AI architecture — from prompt length to model selection — prioritizing small, optimized models over massive general-purpose ones.
Experiments Point to Major Savings
UNESCO’s research team conducted real-world experiments using Meta’s LLaMA 3.1 8B model and open-source fine-tuned models. The findings showed:
- Model compression via quantization cut energy use by up to 44 percent without compromising accuracy.
- Shorter responses yielded a 54 percent reduction in energy use compared to longer outputs.
- Task-specific small models saved up to 90 percent in energy while maintaining or even improving performance.
For example, cutting a model’s average response length from 300 to 150 words would save enough energy each day to power roughly 20,000 U.K. households. Using quantized or smaller models pushes that saving to more than 30,000 homes.
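The household comparison can be roughly reconstructed from the numbers already cited. The sketch below is an illustrative estimate rather than the report’s own calculation: it reuses the 1 billion daily prompts and 0.34 Wh per prompt from above, applies the 54 percent shorter-response saving, and assumes an average U.K. household uses about 7.5 kWh of electricity per day (roughly 2,700 kWh per year), an assumption added here for illustration.

```python
# Rough reconstruction of the household-equivalent claim, under stated assumptions.
# The 54% saving comes from the shorter-response experiment above; the UK
# household figure is an assumed average, not a number from the report.

PROMPTS_PER_DAY = 1_000_000_000
WH_PER_PROMPT = 0.34
SHORT_RESPONSE_SAVING = 0.54       # 54% less energy when responses are halved
UK_HOUSEHOLD_KWH_PER_DAY = 7.5     # assumed ~2,700 kWh per year per household

saved_kwh_per_day = PROMPTS_PER_DAY * WH_PER_PROMPT * SHORT_RESPONSE_SAVING / 1000
households = saved_kwh_per_day / UK_HOUSEHOLD_KWH_PER_DAY

print(f"Energy saved: {saved_kwh_per_day / 1000:,.0f} MWh per day")
print(f"Household equivalent: {households:,.0f}")   # on the order of 20,000+
```

Under these assumptions the saving lands in the low tens of thousands of households, consistent with the report’s headline figure.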
Architectural Overhaul on the Horizon
Emerging strategies promise even greater efficiencies. These include:
- Mixture of Experts: Activating only a few “expert” sub-models per query (illustrated in the sketch below).
- Sparse computation: Engaging only necessary model layers for a given task.
- Retrieval-Augmented Generation: Combining lightweight models with external data sources.
- Neurosymbolic systems: Pairing neural networks with symbolic reasoning, loosely inspired by the brain’s energy-efficient division of labor.
These designs could dramatically cut energy costs, but remain in early stages of development.
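To make the first of these strategies concrete, the toy sketch below (written for this article, not taken from the report or any production system) shows the core idea of Mixture-of-Experts routing: a gating network scores every expert, but only the top few actually run for a given query, so most of the model’s parameters sit idle.

```python
# Toy Mixture-of-Experts routing, using only NumPy. Each "expert" here is a
# random linear layer standing in for a full sub-model.
import numpy as np

rng = np.random.default_rng(0)
DIM, NUM_EXPERTS, TOP_K = 16, 8, 2

experts = [rng.standard_normal((DIM, DIM)) for _ in range(NUM_EXPERTS)]
gate = rng.standard_normal((DIM, NUM_EXPERTS))   # gating-network weights


def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route input x to the top-k experts and mix their outputs."""
    scores = x @ gate                      # one routing score per expert
    top = np.argsort(scores)[-TOP_K:]      # indices of the k best experts
    weights = np.exp(scores[top])
    weights /= weights.sum()               # softmax over the selected experts

    # Only the selected experts compute; the rest do no work for this query.
    out = sum(w * (x @ experts[i]) for w, i in zip(weights, top))
    print(f"activated experts {sorted(top.tolist())} "
          f"({TOP_K} of {NUM_EXPERTS} ran)")
    return out


moe_forward(rng.standard_normal(DIM))
```

In real systems the experts are full neural sub-networks and routing typically happens per token, but the saving rests on the same principle: compute scales with the experts activated, not with the experts available.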
UNESCO’s Three-Point Plan
The report outlines three core recommendations to curb AI’s growing environmental toll:
1. Mobilize investment for clean AI
Governments and the private sector must invest in energy-efficient, “clean by design” systems. Public procurement should prioritize low-carbon AI models.
2. Incentivize transparency and accountability
Introduce sustainability labels, mandatory reporting on energy and water use, and independent audits. Encourage eco-conscious innovation through public policy.
3. Boost AI literacy and public awareness
Equip users, policymakers, and developers with the tools to understand and minimize AI’s environmental costs. Encourage responsible usage patterns.
A Sustainable AI Future
“Tackling AI’s environmental toll is not just about tech—it’s about fairness, equity, and the planet’s future,” the report concludes. UNESCO is committed to advancing small-scale, frugal AI systems and providing technical guidance to low-resource countries.
As AI continues to reshape economies and societies, the organization urged a shift in how the world builds and uses these tools, prioritizing sustainability at every level.