
The Rhys Show

Neuroscientist Konrad Kording reveals shocking truth about machine learning and the brain

Apr 3, 2023
52:52

Episode guests

Konrad Kording

Podcast summary created with Snipd AI

Quick takeaways

  • Gradient descent simplified the process of building neural networks and implementing learning rules, accelerating progress in AI.
  • The rapid advances in language models raise existential concerns about how they compress human thought patterns and what that implies for the future of human intelligence.

Deep dives

The history of deep learning and the role of constraints

The episode traces the history of deep learning and the role constraints have played in its development. Early on, building neural networks and hand-crafting learning rules was difficult; the introduction of gradient descent simplified the process. The conversation then turns to the evolution of AI and the concept of deconstraints, drawing parallels between biological deconstraints and their counterparts in machine learning, and emphasizes how removing constraints has repeatedly accelerated progress in the field.
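To make the gradient descent point concrete, here is a minimal sketch (not from the episode, just an illustration of the idea) showing how gradient descent collapses "implementing a learning rule" into one generic update: compute the gradient of a loss and step against it. A one-parameter linear model, the data, and the learning rate are all assumptions chosen for brevity.

```python
# Minimal illustration: with gradient descent, the entire "learning rule"
# is w -= lr * dL/dw, regardless of the model. Here a tiny linear model
# recovers y = 2x + 1 from noisy samples.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(100, 1))
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal((100, 1))

w = np.zeros((1, 1))   # weight
b = np.zeros(1)        # bias
lr = 0.1               # learning rate

for step in range(500):
    y_hat = x @ w + b                   # forward pass
    err = y_hat - y
    loss = np.mean(err ** 2)            # mean squared error
    grad_w = 2 * x.T @ err / len(x)     # dL/dw
    grad_b = 2 * err.mean(axis=0)       # dL/db
    w -= lr * grad_w                    # the learning rule: follow the gradient
    b -= lr * grad_b

print(f"learned w={w.item():.2f}, b={b.item():.2f}, loss={loss:.4f}")
```

The same update applies unchanged to deep networks once backpropagation supplies the gradients, which is the sense in which gradient descent removed a constraint on how learning rules had to be designed.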

