This item takes a historical look at the training cost and performance of OpenAI's GPT-2 model from 2019, detailing the hardware used, the training duration, an estimated cost of $43K, and the model's CORE score across evaluations. It serves as a reference point for tracking how the expense and capability of large language model development have evolved since.
Source: Simon Willison