#90 Shawn "Swyx" Wang: from Dev to AI Founder

The freeCodeCamp Podcast

The Importance of Pre-Training Language Models

GPT-3 was trained on 300 billion tokens' worth of text. They take hundreds of gigabytes of text scraped from the internet and train the model until it hits some compute budget that you've predetermined. The goal is just to pass all these tests, right? Because then we have a general intelligence that we can ask to do whatever.
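
The mechanics Swyx describes reduce to a simple loop: stream text, predict the next token, take a gradient step, and stop once a predetermined token budget is spent. Below is a minimal PyTorch sketch of that idea, not anyone's actual training code. The byte-level "vocabulary", the tiny stand-in model, and the `TOKEN_BUDGET` constant are all illustrative assumptions; a real run would use a Transformer and a budget in the hundreds of billions of tokens.

```python
# Minimal pre-training loop sketch. All names here (TOKEN_BUDGET, the toy
# model, the fake corpus) are illustrative assumptions, not real training code.
import torch
import torch.nn as nn

TOKEN_BUDGET = 10_000          # GPT-3-scale runs use ~3e11 tokens
VOCAB_SIZE, CONTEXT = 256, 32  # toy byte-level vocab and context window

# Toy next-token predictor: embedding -> linear head (stand-in for a Transformer).
model = nn.Sequential(nn.Embedding(VOCAB_SIZE, 64), nn.Linear(64, VOCAB_SIZE))
optim = torch.optim.AdamW(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

def batches(corpus: bytes, ctx: int):
    """Yield (input, target) windows where the target is shifted by one token."""
    data = torch.tensor(list(corpus), dtype=torch.long)
    for i in range(0, len(data) - ctx - 1, ctx):
        yield data[i:i + ctx], data[i + 1:i + ctx + 1]

corpus = b"hello world " * 2_000  # stand-in for web-scraped text
tokens_seen = 0
for x, y in batches(corpus, CONTEXT):
    logits = model(x)                # predict the next token at every position
    loss = loss_fn(logits, y)
    optim.zero_grad()
    loss.backward()
    optim.step()
    tokens_seen += CONTEXT
    if tokens_seen >= TOKEN_BUDGET:  # stop when the budget is spent
        break

print(f"trained on {tokens_seen} tokens, final loss {loss.item():.3f}")
```

The point of the sketch is the stopping condition: nothing in the loop checks for understanding or capability. It simply spends the predetermined budget and keeps whatever weights it ends up with, which is exactly why the resulting model is then evaluated against a battery of tests.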
