John Hopfield is a professor at Princeton whose life’s work has woven beautifully through biology, chemistry, neuroscience, and physics. Most crucially, he saw the messy world of biology through the piercing eyes of a physicist. He is perhaps best known for his work on associative neural networks, now known as Hopfield networks, which were one of the early ideas that catalyzed the development of the modern field of deep learning.
EPISODE LINKS:
Now What? article: http://bit.ly/3843LeU
John’s Wikipedia page: https://en.wikipedia.org/wiki/John_Hopfield
Books mentioned:
– Einstein’s Dreams: https://amzn.to/2PBa96X
– The Mind is Flat: https://amzn.to/2I3YB84
This conversation is part of the Artificial Intelligence podcast. For more information about the podcast, go to https://lexfridman.com/ai or connect with @lexfridman on Twitter, LinkedIn, Facebook, Medium, or YouTube, where you can watch the video versions of these conversations. If you enjoy the podcast, please rate it 5 stars on Apple Podcasts, follow on Spotify, or support it on Patreon.
This episode is presented by Cash App. Download it from the App Store or Google Play and use code “LexPodcast”.
Here’s the outline of the episode. On some podcast players, you should be able to click a timestamp to jump to that point in the conversation.
OUTLINE:
00:00 – Introduction
02:35 – Difference between biological and artificial neural networks
08:49 – Adaptation
13:45 – Physics view of the mind
23:03 – Hopfield networks and associative memory
35:22 – Boltzmann machines
37:29 – Learning
39:53 – Consciousness
48:45 – Attractor networks and dynamical systems
53:14 – How do we build intelligent systems?
57:11 – Deep thinking as the way to arrive at breakthroughs
59:12 – Brain-computer interfaces
1:06:10 – Mortality
1:08:12 – Meaning of life