
Ep. 2339 Stephen Wolfram on Our AI Present and Future

The Tom Woods Show


How a Neural Net Can Continue Language

ChatGPT is a neural net, built from collections of numbers and multiplications of numbers. The original version was trained roughly 200 billion times on the task of finishing a sentence, and every time it produces another word it is drawing on that training. So if you type "the cat sat on the" into ChatGPT, it will almost certainly say "mat" as the next word. What it has done is learn the statistics of the language that exists on the web. But it goes further than that, because there aren't enough sentences on the web for it to know the next word of every complicated thing you might say. We didn't know that the way our brains work was so fully captured by neural nets.
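The "statistics of the language" idea can be illustrated with a much cruder model than a neural net: count which word most often follows a given context in a corpus, then emit that word. This is a hypothetical sketch (the tiny corpus and function names are invented for illustration), not how ChatGPT actually works, but it shows why "the cat sat on the" is continued with "mat" when that continuation dominates the statistics.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "the language that exists on the web".
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat sat on the mat ."
).split()

# Count how often each word follows each two-word context (trigram statistics).
counts = defaultdict(Counter)
for a, b, c in zip(corpus, corpus[1:], corpus[2:]):
    counts[(a, b)][c] += 1

def next_word(a, b):
    """Return the statistically most likely continuation of context (a, b)."""
    followers = counts[(a, b)]
    return followers.most_common(1)[0][0] if followers else None

print(next_word("on", "the"))  # "mat" follows "on the" most often here
```

A neural net generalizes far beyond this: where the counting model returns nothing for a context it has never seen, the net interpolates from everything it has seen, which is the "goes further than that" in the passage above.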

