
Stephen Wolfram Answers Live Questions About ChatGPT

The Stephen Wolfram Podcast

CHAPTER

ChatGPT Training

ChatGPT uses an underlying language model that has about 175 billion parameters. The number of parameters in ChatGPT is not that different from the number in a brain, although it knows about a lot of stuff that is much more obscure than even people like me, who have decent human memories, know about. I think there's probably a certain amount of templating that's also been done for particular kinds of templated text. That's just engineering on top of what we call 'machine learning'.
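For a sense of where a figure like 175 billion comes from, here is a minimal back-of-envelope sketch. It assumes the publicly reported GPT-3 hyperparameters (96 layers, hidden width 12288, a roughly 50k-token vocabulary) and the generic decoder-only transformer parameter count; it illustrates the arithmetic, not ChatGPT's actual internals.

# Back-of-envelope parameter count for a GPT-3-scale transformer.
# Hyperparameters are the publicly reported GPT-3 175B settings;
# the formula is the standard decoder-only estimate, not a claim
# about how ChatGPT itself is built.

n_layers = 96    # transformer blocks
d_model = 12288  # hidden width
vocab = 50257    # BPE vocabulary size

# Per block: ~4*d_model^2 attention weights (Q, K, V, output projections)
# plus ~8*d_model^2 MLP weights (4x expansion, up- and down-projection),
# i.e. roughly 12*d_model^2 parameters per layer.
per_layer = 12 * d_model ** 2
embeddings = vocab * d_model  # token embedding matrix

total = n_layers * per_layer + embeddings
print(f"~{total / 1e9:.0f} billion parameters")  # prints "~175 billion parameters"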
