The Rhys Show

Neuroscientist Konrad Kording reveals shocking truth about machine learning and the brain

Apr 3, 2023
52:52

Podcast summary created with Snipd AI

Quick takeaways

  • Gradient descent simplified the process of building neural networks and implementing learning rules, accelerating progress in AI.
  • The rapid advancements in language models raise existential concerns about the compression of human thought patterns and the future impact on human intelligence.

Deep dives

The history of deep learning and the role of constraints

The episode traces the history of deep learning and the crucial role constraints have played in its development. Building neural networks and implementing learning rules was initially laborious; the introduction of gradient descent simplified the process. The conversation explores the evolution of AI through the concept of "deconstraints," drawing parallels between biological deconstraints and their counterparts in machine learning, and emphasizes how removing constraints has accelerated progress in the field.
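To make the gradient-descent point concrete, here is a minimal sketch (illustrative only, not from the episode): a single weight is fitted by repeatedly stepping against the analytic gradient of a mean-squared-error loss. The function name and data are hypothetical.

```python
# Minimal gradient-descent sketch (illustrative; not from the episode).
# Fits a single weight w so that w * x approximates y, by stepping
# against the analytic gradient of the mean-squared-error loss.

def gradient_descent(xs, ys, lr=0.01, steps=500):
    w = 0.0  # initial weight
    n = len(xs)
    for _ in range(steps):
        # gradient of (1/n) * sum((w*x - y)^2) with respect to w
        grad = (2.0 / n) * sum((w * x - y) * x for x, y in zip(xs, ys))
        w -= lr * grad  # step opposite the gradient
    return w

# Example: data generated by y = 3x; the learned weight converges toward 3.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]
w = gradient_descent(xs, ys)
```

The same update rule, applied per-parameter via automatic differentiation, is what lets modern frameworks train networks of any shape without hand-derived learning rules — the simplification the episode highlights.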
