KEY LEARNINGS
- AI relies on specialized hardware, GPUs, which run simple mathematical operations across thousands of cores in parallel, reaching trillions of operations per second, whereas standard CPUs are optimized for fast sequential work.
- Training a frontier model is a massive, largely one-time investment, while inference is the recurring cost of serving that model to users; see the back-of-envelope cost sketch below.
- NVIDIA currently dominates the AI hardware market, creating a significant concentration risk and supply chain bottleneck.
- Scaling laws indicate that performance improves predictably, roughly as a power law in training compute, which keeps rewarding bigger budgets and drives an expensive arms race; see the scaling sketch below.
- Compute governance focuses on tracking hardware usage, managing cloud costs, and navigating geopolitical export controls; see the usage-tracking sketch below.
- 🌐 NVIDIA Data Center Products – official hardware specifications and pricing.
- 🌐 Epoch AI: Hardware Trends – research on AI compute growth and trends.
- 📄 Stanford HAI AI Index Report 2024 – comprehensive compute cost analysis.
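To make the training-versus-inference split concrete, here is a minimal back-of-envelope sketch in Python. It uses the commonly cited ≈6·N·D approximation for training FLOPs and ≈2·N FLOPs per generated token for inference; the parameter count, token counts, GPU throughput, and hourly price are illustrative assumptions, not figures from the sources above.

```python
# Back-of-envelope training vs. inference cost sketch.
# All constants below are hypothetical, chosen only to illustrate the shape
# of the costs, not sourced figures.

N = 70e9             # model parameters (hypothetical 70B model)
D = 2e12             # training tokens (hypothetical 2T tokens)
GPU_FLOPS = 400e12   # assumed sustained throughput per GPU (FLOP/s)
GPU_HOUR_USD = 2.50  # assumed cloud price per GPU-hour

def gpu_hours(total_flops: float) -> float:
    """Convert a FLOP budget into GPU-hours at the assumed throughput."""
    return total_flops / GPU_FLOPS / 3600

# One-time training cost: ~6 * N * D FLOPs (common rule of thumb).
train_flops = 6 * N * D
train_cost = gpu_hours(train_flops) * GPU_HOUR_USD

# Recurring inference cost: ~2 * N FLOPs per generated token.
# (Real serving is memory-bound, so effective utilization is far lower.)
daily_tokens = 1e9
infer_flops_per_day = 2 * N * daily_tokens
daily_infer_cost = gpu_hours(infer_flops_per_day) * GPU_HOUR_USD

print(f"Training: {train_flops:.2e} FLOPs -> ${train_cost:,.0f} (one-time)")
print(f"Inference: ${daily_infer_cost:,.0f}/day at 1B tokens/day (recurring)")
```

Even under these generous utilization assumptions, the one-time training bill dwarfs any single day of serving, but inference spend compounds with usage and can eventually dominate the budget.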
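The "predictable improvement" behind scaling laws is usually expressed as a power law in compute. The sketch below evaluates a hypothetical fit L(C) = a · C^(−α); the constants a and α are invented for illustration, not taken from any published fit.

```python
# Hypothetical power-law scaling sketch: loss falls predictably but slowly
# with compute, L(C) = a * C**(-alpha). Constants are illustrative only;
# real scaling-law studies fit them empirically.

a = 40.0      # hypothetical scale constant
alpha = 0.05  # hypothetical compute exponent

def predicted_loss(compute_flops: float) -> float:
    """Predicted pretraining loss at a given compute budget (FLOPs)."""
    return a * compute_flops ** (-alpha)

for exp in range(21, 27):  # budgets from 1e21 to 1e26 FLOPs
    c = 10.0 ** exp
    print(f"{c:.0e} FLOPs -> predicted loss {predicted_loss(c):.2f}")

# Each 10x increase in compute multiplies loss by 10**(-alpha) ~ 0.89:
# steady, predictable gains, but every step costs an order of magnitude more.
```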
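Compute governance starts with knowing how much compute each project consumes. Below is a minimal, hypothetical usage-tracking sketch: the job records, the $2.50/GPU-hour rate, the per-GPU throughput, and the 1e26-FLOP reporting threshold are all assumptions for illustration, not values drawn from the resources above.

```python
from dataclasses import dataclass

@dataclass
class TrainingJob:
    project: str
    gpu_count: int
    hours: float
    flops_per_gpu_hour: float  # assumed sustained throughput per GPU

    @property
    def gpu_hours(self) -> float:
        return self.gpu_count * self.hours

    @property
    def total_flops(self) -> float:
        return self.gpu_hours * self.flops_per_gpu_hour

PRICE_PER_GPU_HOUR = 2.50  # assumed cloud rate (USD)
REPORT_THRESHOLD = 1e26    # illustrative governance reporting threshold (FLOPs)

jobs = [
    TrainingJob("chatbot-finetune", gpu_count=64, hours=72, flops_per_gpu_hour=1.4e18),
    TrainingJob("frontier-pretrain", gpu_count=25_000, hours=3_000, flops_per_gpu_hour=1.4e18),
]

for job in jobs:
    cost = job.gpu_hours * PRICE_PER_GPU_HOUR
    flag = " <-- exceeds reporting threshold" if job.total_flops >= REPORT_THRESHOLD else ""
    print(f"{job.project}: {job.gpu_hours:,.0f} GPU-h, ${cost:,.0f}, "
          f"{job.total_flops:.2e} FLOPs{flag}")
```

The same per-job accounting serves all three governance concerns in the bullet above: GPU-hours map to cloud spend, total FLOPs map to reporting thresholds, and the hardware inventory itself is what export controls regulate.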





