The next generation of AI models could be large enough to fall under the requirements of the Executive Order

Amount of compute used to train notable AI models

FLOP (floating-point operations) refers to the total number of computational operations used to train an AI system. Training compute is estimated from published results in the AI literature and comes with some uncertainty: Epoch expects most of these estimates to be accurate within a factor of 2, and within a factor of 5 for recent models whose relevant figures were not disclosed, such as GPT-4.
Chart: Will Henshall for TIME Source: Epoch