ArchiCraft: Solution Architecture Insights for AI Engineering

#002 - How long to train a 70B LLM on 15T tokens using 1024 H100s?



Estimating Training Time for Large Language Models

This chapter explores the complexities of training a large language model on a dataset of 15 trillion tokens and examines how precision formats such as FP8 and BF16 affect training duration. The discussion also compares different methods for estimating training time and cross-checks the results to confirm their accuracy.
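For context, here is a minimal sketch of this kind of estimate, assuming the common 6·N·D FLOPs approximation, approximate H100 SXM dense Tensor Core datasheet peaks, and an illustrative 40% model FLOPs utilization (MFU); the exact assumptions and figures used in the episode may differ.

```python
# Back-of-the-envelope training-time estimate for a dense transformer,
# using the standard ~6 * N * D total-FLOPs approximation.
# GPU peak numbers and MFU below are illustrative assumptions,
# not figures taken from the episode.

def training_days(params, tokens, num_gpus, peak_flops_per_gpu, mfu):
    """Return estimated wall-clock training time in days."""
    total_flops = 6 * params * tokens                     # forward + backward compute
    cluster_flops = num_gpus * peak_flops_per_gpu * mfu   # sustained cluster throughput
    return total_flops / cluster_flops / 86_400           # seconds -> days

N = 70e9            # 70B parameters
D = 15e12           # 15T training tokens
GPUS = 1024         # number of H100s

# Approximate H100 SXM dense Tensor Core peaks (datasheet values):
BF16_PEAK = 989e12   # ~989 TFLOPS
FP8_PEAK = 1979e12   # ~1979 TFLOPS

MFU = 0.40           # assumed model FLOPs utilization; real runs vary widely

print(f"BF16: ~{training_days(N, D, GPUS, BF16_PEAK, MFU):.0f} days")
print(f"FP8 : ~{training_days(N, D, GPUS, FP8_PEAK, MFU):.0f} days")
```

Under these assumptions the estimate comes out to roughly 180 days in BF16 and 90 days in FP8. In this simplified model FP8 halves the ideal time relative to BF16, though in practice FP8 runs rarely sustain the same MFU.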
