
Stephen Wolfram Answers Live Questions About ChatGPT

The Stephen Wolfram Podcast

ChatGPT Training

ChatGPT uses an underlying language model that has about 175 billion parameters. The number of parameters in ChatGPT is not that different from the number in a brain, although it knows about a lot of stuff that is much more obscure than even people like me, who have decent human memories, know about. I think there's probably a certain amount of templating that's also been done for particular kinds of templated text. That's just engineering on top of what we call 'machine learning'.

