GPU Capabilities:
Computational power:
- Roughly doubling every 1-2 years, a pace often described as a modified Moore's Law
- Example: NVIDIA's A100 (2020) to H100 (2022) delivered roughly 3x training throughput, with larger gains at lower precisions
Memory capacity:
- Increasing by about 2x every 2-3 years
- Enables training of larger models and processing of bigger datasets
Energy efficiency:
- Improving by about 1.5x to 2x per generation
- Allows for more sustainable and cost-effective AI training and inference
Specialized AI architectures:
- Development of AI-specific chips and architectures (e.g., NVIDIA Tensor Cores, Google TPUs) optimized for machine learning tasks
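The doubling-period trends above can be turned into rough projections with a simple compound-growth formula. The sketch below is illustrative only: the doubling periods are the approximate trend figures quoted in this outline, not measured values, and `projected_factor` is a hypothetical helper, not part of any library.

```python
# Illustrative sketch: projecting capability growth from a doubling period.
# Doubling periods are the rough trend figures from the text, not measurements.

def projected_factor(years: float, doubling_period_years: float) -> float:
    """Growth factor after `years` if capability doubles every `doubling_period_years` years."""
    return 2.0 ** (years / doubling_period_years)

# Compute: doubling every ~1.5 years -> ~10x over 5 years
compute_growth = projected_factor(5, 1.5)

# Memory: doubling every ~2.5 years -> ~4x over 5 years
memory_growth = projected_factor(5, 2.5)

print(f"Compute over 5 years: ~{compute_growth:.1f}x")
print(f"Memory over 5 years:  ~{memory_growth:.1f}x")
```

Under these assumptions, compute capability grows roughly 2.5x faster than memory capacity over the same window, which is one reason memory bandwidth and capacity often become the practical bottleneck for large-model training.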