
History of Science and Technology Q&A for Kids and Others (March 10, 2021)
The Stephen Wolfram Podcast
Perceptrons Are One Layer Neural Networks
There were some sort of early, kind of very simple neural net kinds of things. Marvin Minsky, for example, built this thing called the SNARC. But they failed to really dig into understanding what that random noise was. They might have discovered some of the things I found out about, things like cellular automata, many years later if they'd kind of dug into that. At the time, they were really concentrating on: how can we make these kinds of artificial neural networks do something useful?