KEY LEARNINGS
  • AI relies on specialized hardware, chiefly GPUs, which execute thousands of simple mathematical operations in parallel and reach trillions of operations per second, unlike standard CPUs, which are optimized for sequential work (illustrated in the first sketch after this list).
  • The compute cost of 'training' a frontier model is a massive up-front investment, while 'inference' is the recurring cost incurred every time the model is used (a rough estimate of both is sketched below).
  • NVIDIA currently dominates the AI hardware market, creating a significant concentration risk and supply chain bottleneck.
  • Scaling laws indicate that model performance improves predictably as compute, data, and parameter counts grow, albeit with diminishing returns, driving an expensive arms race for ever-larger training runs (the power-law form is sketched below).
  • Compute governance focuses on tracking hardware usage, managing cloud compute spend, and navigating geopolitical export controls on advanced chips (a minimal cost-tracking sketch closes this section).
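
The parallelism point can be made concrete with a small NumPy comparison: the same matrix multiplication written as a scalar Python loop versus handed to a vectorized routine that runs as an optimized parallel kernel. The matrix size and timings are illustrative only; NumPy dispatches to a CPU BLAS library here, and GPU libraries expose the same call pattern with far more parallelism.

    import time
    import numpy as np

    # Two ways to multiply matrices: a scalar Python loop vs. a vectorized
    # call that an optimized library executes as a parallel kernel.
    n = 128
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    start = time.perf_counter()
    c_loop = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            for k in range(n):
                c_loop[i, j] += a[i, k] * b[k, j]
    loop_s = time.perf_counter() - start

    start = time.perf_counter()
    c_vec = a @ b  # dispatched to an optimized parallel kernel
    vec_s = time.perf_counter() - start

    print(f"naive loop:  {loop_s:.3f} s")
    print(f"vectorized:  {vec_s:.6f} s  ({loop_s / vec_s:.0f}x faster)")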
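A rough way to see the training-versus-inference split is the widely used rule of thumb that training a dense transformer costs roughly 6 x N x D floating-point operations (N parameters, D training tokens), while generating one token at inference costs roughly 2 x N operations. The sketch below applies these approximations; the parameter and token counts are illustrative assumptions, not figures for any real model.

    # Rule-of-thumb compute estimates (approximations, not vendor figures).
    def training_flops(n_params: float, n_tokens: float) -> float:
        """~6 FLOPs per parameter per training token (common approximation)."""
        return 6 * n_params * n_tokens

    def inference_flops_per_token(n_params: float) -> float:
        """~2 FLOPs per parameter per generated token (common approximation)."""
        return 2 * n_params

    N = 70e9    # assumed parameter count (illustrative)
    D = 1.4e12  # assumed training tokens (illustrative)
    print(f"training:  {training_flops(N, D):.2e} FLOPs (paid once)")
    print(f"inference: {inference_flops_per_token(N):.2e} FLOPs per token (paid on every use)")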
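The scaling-law claim is usually expressed as a power law in which predicted loss falls smoothly as parameters and training data grow. The sketch below uses the general Chinchilla-style form L(N, D) = E + A/N^alpha + B/D^beta; the coefficients are placeholder values chosen only to show the shape of the curve, not fitted results.

    # Chinchilla-style scaling law: loss falls as a power law in parameters (N)
    # and training tokens (D). All coefficients below are illustrative placeholders.
    E, A, B = 1.7, 400.0, 400.0   # irreducible loss and scale constants (assumed)
    ALPHA, BETA = 0.34, 0.28      # power-law exponents (assumed)

    def predicted_loss(n_params: float, n_tokens: float) -> float:
        return E + A / n_params**ALPHA + B / n_tokens**BETA

    for n, d in [(1e9, 2e10), (1e10, 2e11), (1e11, 2e12)]:
        print(f"N={n:.0e}, D={d:.0e} -> predicted loss ~ {predicted_loss(n, d):.3f}")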
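On the cost-management side of compute governance, the core bookkeeping is GPU-hours multiplied by an hourly rate, aggregated per team or project. The sketch below assumes a flat hypothetical rate and made-up job records; real cloud billing APIs, reserved-capacity discounts, and chargeback policies add considerable detail on top of this.

    from dataclasses import dataclass

    # Minimal GPU-hour cost ledger (hypothetical rate and job records).
    HOURLY_RATE_PER_GPU = 2.50  # assumed on-demand $/GPU-hour (illustrative)

    @dataclass
    class Job:
        team: str
        gpus: int
        hours: float

    jobs = [
        Job("research", gpus=64, hours=120.0),
        Job("research", gpus=8, hours=40.0),
        Job("product", gpus=16, hours=300.0),
    ]

    costs: dict[str, float] = {}
    for job in jobs:
        costs[job.team] = costs.get(job.team, 0.0) + job.gpus * job.hours * HOURLY_RATE_PER_GPU

    for team, cost in costs.items():
        print(f"{team}: ${cost:,.2f}")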