Computer hardware requirements for cutting-edge AI systems vary significantly by domain

Data reflects the number of NVIDIA H100s (today's most powerful AI accelerator) required to train a leading AI model in 90 days. Chip requirements are calculated from the amount of compute used to train the current state-of-the-art model for each task, using publicly available training compute data from Epoch, a research institute. Hardware FLOP utilization is assumed to be 34%.
Source: Epoch
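
The chip count implied by this methodology can be reproduced with simple arithmetic: divide a model's total training compute by the effective FLOP one H100 delivers over 90 days at 34% utilization. The sketch below illustrates this, assuming an H100 peak throughput of roughly 989 TFLOP/s (dense BF16, SXM variant); the caption does not specify which peak figure Epoch uses, and the 1e25 FLOP example is purely illustrative.

```python
import math

SECONDS_PER_DAY = 86_400
TRAINING_WINDOW_DAYS = 90

# Assumed peak throughput for one H100 (dense BF16, SXM) -- illustrative,
# not necessarily the figure used in Epoch's estimate.
H100_PEAK_FLOP_PER_SEC = 9.89e14
HARDWARE_FLOP_UTILIZATION = 0.34  # stated in the caption


def h100s_required(training_compute_flop: float) -> int:
    """Chips needed to deliver the given training compute within 90 days."""
    effective_flop_per_chip = (
        H100_PEAK_FLOP_PER_SEC
        * HARDWARE_FLOP_UTILIZATION
        * TRAINING_WINDOW_DAYS
        * SECONDS_PER_DAY
    )
    return math.ceil(training_compute_flop / effective_flop_per_chip)


# Example: a hypothetical model trained with 1e25 FLOP of compute
# would need roughly 3,800 H100s under these assumptions.
print(h100s_required(1e25))
```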