3min chapter


#90 Shawn "Swyx" Wang: from Dev to AI Founder

The freeCodeCamp Podcast

CHAPTER

The Importance of Pre-Training Language Models

GPT-3 was trained on 300 billion tokens' worth of text. They take hundreds of gigabytes of data scraped from the internet and train the model on it until it reaches some kind of budget that you've predetermined. The goal is just to pass all these tests, right? Because then we have a general intelligence that we can ask to do whatever.
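In code, that loop is roughly: feed batches of scraped text as token IDs, minimize next-token prediction loss, and stop once the predetermined budget of tokens is consumed. Here is a minimal sketch under those assumptions, in a PyTorch style; the names (`pretrain`, `model`, `data_loader`) and the budget constant are illustrative placeholders, not the actual GPT-3 training pipeline:

```python
import torch.nn.functional as F

TOKEN_BUDGET = 300_000_000_000  # e.g. GPT-3's roughly 300B training tokens

def pretrain(model, data_loader, optimizer, token_budget=TOKEN_BUDGET):
    tokens_seen = 0
    for batch in data_loader:  # batches of token IDs from scraped web text
        # Next-token prediction: each position predicts the token after it.
        inputs, targets = batch[:, :-1], batch[:, 1:]
        logits = model(inputs)  # shape: (batch, seq_len, vocab_size)
        loss = F.cross_entropy(
            logits.reshape(-1, logits.size(-1)),
            targets.reshape(-1),
        )
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        tokens_seen += targets.numel()
        if tokens_seen >= token_budget:  # stop at the predetermined budget
            break
    return model
```

The stopping condition is the key point of the quote: training doesn't run until some notion of convergence, it runs until the compute or token budget you decided on up front is exhausted.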
