The amount of compute used to train AI systems has been increasing since 1950, and the rate of increase accelerated around 2010.

Amount of compute used to train notable AI models

FLOP (floating-point operations) refers to the total number of computational operations used to train an AI system. Compute is estimated from published results in the AI literature and carries some uncertainty. Epoch expects most of these estimates to be accurate within a factor of 2, and within a factor of 5 for recent models whose relevant figures were not disclosed, such as GPT-4.
Chart: Will Henshall for TIME. Source: Epoch via Our World in Data.