
#90 Shawn "Swyx" Wang: from Dev to AI Founder

The freeCodeCamp Podcast

CHAPTER

The Importance of Pre-Training Language Models

GPT-3 was trained on roughly 300 billion tokens' worth of text. You take hundreds of gigabytes of data scraped from the internet and train the model on it until it has used up some kind of budget that you've predetermined. The goal is just to pass all these tests, right? Because then we have a general intelligence that we can ask to do whatever.
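
As a rough illustration of the loop being described (not anything shown in the episode), here is a minimal Python/PyTorch sketch: next-token prediction over a text corpus, stopping once a predetermined token budget is spent. Everything here is a placeholder assumption, including the toy corpus, the hyperparameters, and the tiny LSTM standing in for a transformer purely to keep the sketch short.

```python
# A minimal sketch of pre-training as a next-token prediction loop
# that stops after a predetermined token budget. All names and sizes
# are toy placeholders, not GPT-3's actual setup.
import torch
import torch.nn as nn

# Toy corpus standing in for "hundreds of gigabytes of scraped text".
corpus = "the quick brown fox jumps over the lazy dog " * 100
vocab = sorted(set(corpus))
stoi = {ch: i for i, ch in enumerate(vocab)}
data = torch.tensor([stoi[ch] for ch in corpus], dtype=torch.long)

block_size = 32          # context window
token_budget = 50_000    # "some kind of budget that you've predetermined"

class TinyLM(nn.Module):
    """A tiny next-token predictor (embedding -> LSTM -> logits)."""
    def __init__(self, vocab_size, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.LSTM(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, idx):
        x = self.embed(idx)
        out, _ = self.rnn(x)
        return self.head(out)

model = TinyLM(len(vocab))
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

tokens_seen = 0
while tokens_seen < token_budget:
    # Sample a random batch of (context, next-token) pairs.
    ix = torch.randint(len(data) - block_size - 1, (16,))
    xb = torch.stack([data[i : i + block_size] for i in ix])
    yb = torch.stack([data[i + 1 : i + block_size + 1] for i in ix])

    logits = model(xb)
    loss = loss_fn(logits.view(-1, len(vocab)), yb.view(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

    tokens_seen += xb.numel()  # count tokens against the budget

print(f"trained on {tokens_seen} tokens, final loss {loss.item():.3f}")
```

The point of the sketch is the stopping condition: nothing about the loop is task-specific, and training simply halts when the token (or compute) budget is exhausted, which is the "predetermined budget" idea from the quote.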
