Cerebras-GPT: A Family of Open, Compute-efficient, Large Language Models

Cerebras open-sources seven GPT-3 models from 111 million to 13 billion parameters. Trained using the Chinchilla formula, these models set new benchmarks for accuracy and compute efficiency.

Abstract

State-of-the-art language models are extremely challenging to train; they require huge compute budgets, complex distributed compute techniques, and deep ML expertise.
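For context, the "Chinchilla formula" refers to the compute-optimal scaling rule from Hoffmann et al. (2022): train on roughly 20 tokens per model parameter. Below is a minimal sketch of that arithmetic applied to the smallest and largest model sizes mentioned above; the 20-tokens-per-parameter ratio and the common 6·N·D FLOPs approximation are standard rules of thumb, not figures taken from this abstract.

```python
# Chinchilla compute-optimal scaling rule of thumb: roughly 20 training
# tokens per model parameter (Hoffmann et al., 2022). Total training
# FLOPs are commonly approximated as 6 * N * D for N parameters and
# D training tokens.

TOKENS_PER_PARAM = 20  # Chinchilla rule of thumb, assumed here

def chinchilla_tokens(n_params: float) -> float:
    """Compute-optimal number of training tokens for a model of n_params."""
    return TOKENS_PER_PARAM * n_params

def training_flops(n_params: float, n_tokens: float) -> float:
    """Standard 6*N*D approximation of total training compute."""
    return 6 * n_params * n_tokens

# Smallest and largest Cerebras-GPT sizes mentioned in the subtitle.
for n in (111e6, 13e9):
    d = chinchilla_tokens(n)
    print(f"{n / 1e9:6.3f}B params -> {d / 1e9:7.1f}B tokens, "
          f"~{training_flops(n, d):.2e} FLOPs")
```

Under these assumptions, the 111M-parameter model would be trained on roughly 2.2B tokens and the 13B-parameter model on roughly 260B tokens, which is the sense in which the family is "compute-efficient": each model gets the token budget that the Chinchilla analysis predicts is optimal for its size.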