History of Science and Technology Q&A for Kids and Others (March 10, 2021)

The Stephen Wolfram Podcast

CHAPTER

Neural Nets and Back Propagation

The key notion here, which got developed later, was this idea of hidden units: that you would have the input layer and the output layer, with hidden units in between. And if it's something that kind of looks like a B, it's still in the basin of attraction for a B. Back propagation was a big thing early on, the sort of thing people talked about, but it didn't really catch on at first. Now, were people actually using neural nets for industrial purposes?
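A minimal sketch of the idea mentioned above: a network with an input layer, a layer of hidden units, and an output layer, trained by back propagation. The XOR task, the layer sizes, and the learning rate are illustrative assumptions added here, not details from the podcast.

```python
# Illustrative sketch (not from the podcast): one hidden layer trained
# with back propagation. XOR is a task a network with no hidden units
# cannot learn, which is why hidden units mattered.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)      # hidden-unit activations
    out = sigmoid(h @ W2 + b2)    # network output

    # backward pass: propagate the error back through the layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # approaches [[0], [1], [1], [0]]
```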

