Tracking the estimated water and electricity consumption of large language models in real time.
Live counters, updated automatically, convert the running totals into equivalents: homes powered for a day and standard water bottles.
| AI Model | Est. Daily Users | Est. Queries (24h) | Est. Energy (MWh) | Est. Water (L) |
|---|---|---|---|---|
| | ~200M | 500M+ | 1,475 | 735,000 |
| | ~160M | 410M+ | 1,150 | 605,000 |
| | ~75M | 190M+ | 680 | 320,000 |
| | ~60M | 150M+ | 490 | 215,000 |
| | ~15M | 40M+ | 120 | 58,000 |
* Usage stats are aggregate estimates based on monthly active user reports and datacenter thermal design power (TDP) averages.
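To make the arithmetic behind figures like these reproducible, here is a minimal sketch of how a daily-user count can be turned into energy and water estimates. The coefficient values (queries per user, watt-hours per query, liters per kWh) are illustrative assumptions, not the project's calibrated numbers.

```typescript
// Illustrative estimator: converts daily active users into rough energy and
// water figures using assumed per-query coefficients.

interface UsageAssumptions {
  queriesPerUserPerDay: number; // assumed average queries per daily user
  whPerQuery: number;           // assumed energy per query, in watt-hours
  litersPerKwh: number;         // assumed cooling water per kWh of energy
}

interface DailyEstimate {
  queries: number;
  energyMwh: number;
  waterLiters: number;
}

function estimateDailyImpact(dailyUsers: number, a: UsageAssumptions): DailyEstimate {
  const queries = dailyUsers * a.queriesPerUserPerDay;
  const energyMwh = (queries * a.whPerQuery) / 1_000_000; // Wh -> MWh
  const waterLiters = energyMwh * 1_000 * a.litersPerKwh; // MWh -> kWh -> liters
  return { queries, energyMwh, waterLiters };
}

// Example: 200M daily users with placeholder coefficients.
const example = estimateDailyImpact(200_000_000, {
  queriesPerUserPerDay: 2.5,
  whPerQuery: 3,
  litersPerKwh: 0.5,
});
console.log(example); // { queries: 500000000, energyMwh: 1500, waterLiters: 750000 }
```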
Training a model like GPT-4 evaporates roughly 700,000 liters of water for cooling.
One AI-generated image uses as much energy as fully charging your smartphone.
If the AI industry were a country, it would already rank among national-scale energy consumers, and it is climbing that ranking rapidly.
We dive deeper into the code, the ethics, and the sustainability of AI every week.
This project is an open-source initiative to visualize the environmental impact of Large Language Models. Because real-time datacenter metrics are proprietary, the "Live Tracking" visualization is a simulation based on the following research-backed coefficients:
The goal is awareness, not exact scientific measurement. We invite data scientists to contribute better models via GitHub.
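As a starting point for contributors, here is a minimal sketch of how a coefficient-driven live counter could work: it simply integrates an assumed query rate over elapsed time and converts the result into the equivalents shown on the page. Every constant below is an illustrative placeholder, not one of the project's actual coefficients.

```typescript
// Minimal sketch of the "Live Tracking" simulation: because real-time
// datacenter metrics are proprietary, the counters integrate assumed
// coefficients over elapsed time. All constants are placeholders.

const QUERIES_PER_SECOND = 15_000;  // assumed global query rate
const WH_PER_QUERY = 3;             // assumed energy per query (Wh)
const LITERS_PER_KWH = 0.5;         // assumed cooling water per kWh
const KWH_PER_HOME_PER_DAY = 30;    // assumed household daily consumption
const LITERS_PER_BOTTLE = 0.5;      // standard water bottle size

function simulate(elapsedSeconds: number) {
  const queries = QUERIES_PER_SECOND * elapsedSeconds;
  const kwh = (queries * WH_PER_QUERY) / 1_000;
  const liters = kwh * LITERS_PER_KWH;
  return {
    homesPoweredForADay: kwh / KWH_PER_HOME_PER_DAY,
    waterBottles: liters / LITERS_PER_BOTTLE,
  };
}

// Update the counters once per second from page load.
const startedAt = Date.now();
setInterval(() => {
  const elapsedSeconds = (Date.now() - startedAt) / 1_000;
  console.log(simulate(elapsedSeconds));
}, 1_000);
```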