KEY LEARNINGS
  • Training a single large AI model can emit as much carbon as driving a car more than 100 times around the Earth.
  • Inference—the ongoing use of the model—often consumes significantly more energy over the system's lifetime than the initial training phase.
  • Data center cooling consumes massive amounts of water, with some estimates suggesting that a single multi-turn chatbot conversation uses roughly half a liter.
  • The carbon footprint of AI depends heavily on the energy mix of the power grid supplying the compute infrastructure (Patterson et al., 2021); the same workload can differ several-fold in emissions between regions, as the sketch after this list illustrates.
  • Sustainable AI governance involves tracking energy and emissions metrics and right-sizing models to balance capability against environmental impact; see the measurement example below.
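
A back-of-envelope sketch in Python of how the points above combine: emissions equal energy used times grid carbon intensity, lifetime inference energy equals per-query energy times query volume, and cooling water scales with energy through a water-usage factor. Every constant is an illustrative assumption (the training figure is merely in the vicinity of published GPT-3 estimates, per Patterson et al., 2021), not a measurement.

    # Back-of-envelope footprint estimates. All constants are illustrative
    # assumptions for a hypothetical large model, not measured values.
    TRAINING_ENERGY_KWH = 1_300_000   # assumed ~1.3 GWh training run
    ENERGY_PER_QUERY_KWH = 0.003      # assumed ~3 Wh per inference request
    QUERIES_PER_DAY = 10_000_000      # assumed deployment load
    LIFETIME_DAYS = 730               # assumed two-year deployment

    # Grid carbon intensity (kg CO2e per kWh) varies several-fold by region:
    # coal-heavy grids approach 1.0; hydro- or nuclear-heavy grids sit far lower.
    GRID_KG_CO2_PER_KWH = 0.4         # assumed mixed grid
    WATER_L_PER_KWH = 1.8             # assumed on-site cooling water per kWh

    training_tonnes = TRAINING_ENERGY_KWH * GRID_KG_CO2_PER_KWH / 1000
    inference_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY * LIFETIME_DAYS
    inference_tonnes = inference_kwh * GRID_KG_CO2_PER_KWH / 1000
    water_ml_per_query = ENERGY_PER_QUERY_KWH * WATER_L_PER_KWH * 1000

    print(f"Training:  {training_tonnes:,.0f} t CO2e")
    print(f"Inference: {inference_tonnes:,.0f} t CO2e over {LIFETIME_DAYS} days")
    print(f"Cooling water: {water_ml_per_query:.1f} mL per query")

Under these assumptions, two years of inference emits roughly 17 times more than training, and a conversation spanning dozens of queries, counted together with off-site water used in electricity generation, approaches the half-liter estimates above.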
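
On the governance point, energy and emissions tracking can be wired directly into training jobs. Below is a minimal sketch assuming the open-source codecarbon package, whose EmissionsTracker samples hardware power draw and converts it to CO2e using local grid intensity; the training function and project name are hypothetical stand-ins.

    # Minimal per-run emissions tracking, assuming the open-source
    # `codecarbon` package (pip install codecarbon).
    from codecarbon import EmissionsTracker

    def train_model() -> None:
        """Hypothetical stand-in for the actual training loop."""

    tracker = EmissionsTracker(project_name="model-training")  # hypothetical name
    tracker.start()                 # begin sampling power draw
    train_model()
    emissions_kg = tracker.stop()   # returns estimated kg CO2e for the run
    print(f"Estimated emissions: {emissions_kg:.2f} kg CO2e")

Logged per run, such figures make the model-size-versus-impact trade-off auditable rather than anecdotal.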

REFERENCES
  • Strubell, E., et al. (2019). Energy and Policy Considerations for Deep Learning in NLP. ACL.
  • Patterson, D., et al. (2021). Carbon Emissions and Large Neural Network Training. arXiv.
  • Luccioni, A., et al. (2022). Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model. arXiv.