KEY LEARNINGS
- By some estimates, training a single large AI model can generate carbon emissions comparable to driving a car more than 100 times around the Earth.
- Inference—the ongoing use of the model—often consumes significantly more energy over the system's lifetime than the initial training phase.
- Data center cooling requires massive amounts of water; some estimates suggest a short chatbot conversation (on the order of tens of queries) consumes about half a liter.
- The carbon footprint of AI is highly dependent on the energy mix of the power grid where the compute infrastructure is located.
- Sustainable AI governance involves tracking energy metrics and optimizing model size to balance capability with environmental impact.
- 📰 MIT Technology Review: AI Environmental Impact. Analysis of AI's growing environmental footprint.
- 🌐 OECD: Measuring AI Environmental Impacts. International framework for AI sustainability metrics.
- 🌐 Green AI (Allen Institute). Research initiative on efficient AI development.
- Strubell, E., et al. (2019). Energy and Policy Considerations for Deep Learning in NLP.
- Patterson, D., et al. (2021). Carbon Emissions and Large Neural Network Training.
- Luccioni, A., et al. (2022). Estimating the Carbon Footprint of BLOOM.
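The grid-dependence point above reduces to simple arithmetic: operational emissions are energy consumed multiplied by the carbon intensity of the local grid. A minimal sketch, using hypothetical energy and intensity values purely for illustration:

```python
# Illustrative only: operational CO2e scales with the carbon intensity
# of the grid powering the compute. Intensity figures below are rough,
# hypothetical examples in gCO2e per kWh, not measured data.
GRID_INTENSITY_G_PER_KWH = {
    "hydro-heavy grid": 30,
    "mixed grid": 250,
    "coal-heavy grid": 800,
}

def operational_emissions_kg(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """CO2e in kilograms: energy consumed times grid carbon intensity."""
    return energy_kwh * intensity_g_per_kwh / 1000.0

# The same hypothetical 1,000,000 kWh training run on different grids:
for grid, intensity in GRID_INTENSITY_G_PER_KWH.items():
    print(f"{grid}: {operational_emissions_kg(1_000_000, intensity):,.0f} kg CO2e")
```

The spread (30,000 kg vs. 800,000 kg for the same workload) is why siting compute on low-carbon grids is one of the highest-leverage sustainability decisions.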
