FLOP (Floating Point Operations)
FLOP is a fundamental unit for measuring computational work, denoting the total number of floating-point operations performed; the related rate metric FLOPS (floating-point operations per second) measures how quickly a system can perform them. In artificial intelligence, FLOP counts quantify the compute required to train and run models, with frontier training runs now measured at peta- to exa-scale and beyond. Knowing a model's FLOP requirements is essential for provisioning hardware (GPUs, TPUs) and for optimizing cost-effectiveness. The most advanced AI models are trained with very high FLOP budgets, and empirical scaling laws show a strong correlation between training compute and model capabilities, though capability does not follow from compute alone.
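The training compute described above is often estimated with the widely used heuristic C ≈ 6·N·D FLOP, where N is the parameter count and D is the number of training tokens (roughly 2 FLOP per parameter for the forward pass and 4 for the backward pass). A minimal sketch, using hypothetical model sizes for illustration:

```python
def training_flop(n_params: float, n_tokens: float) -> float:
    """Approximate total training FLOP for a dense transformer
    using the common heuristic C ~= 6 * N * D."""
    return 6.0 * n_params * n_tokens

# Hypothetical example: a 70B-parameter model trained on 1.4T tokens.
c = training_flop(70e9, 1.4e12)
print(f"{c:.2e} FLOP")  # ~5.88e23, i.e. on the order of 10^23 FLOP
```

Dividing such an estimate by a cluster's sustained FLOPS gives a rough lower bound on training time, which is one reason both the count (FLOP) and the rate (FLOPS) matter when planning hardware.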