Machine Learning Street Talk (MLST)

Jürgen Schmidhuber - Neural and Non-Neural AI, Reasoning, Transformers, and LSTMs

Aug 28, 2024
Jürgen Schmidhuber, the father of generative AI and a pioneer in deep learning, shares insights on neural and non-neural AI. He discusses the evolution of LSTMs and linear transformers, highlighting their importance in modern AI systems. Topics include the intricacies of reasoning in AI, the balance between memorization and generalization, and the potential breakthroughs in reducing computational needs. Schmidhuber also critiques public misconceptions about AI and explores advanced AI planning methods as pathways to achieve AGI.
01:39:39

Podcast summary created with Snipd AI

Quick takeaways

  • The podcast emphasizes that advancements in large language models do not equate to imminent AGI, as these systems lack true reasoning and creativity.
  • Integration of symbolic AI with neural networks can resolve computational challenges that purely neural approaches face, improving overall AI capabilities.

Deep dives

Understanding AGI and Large Language Models

The discussion centers on the misconception that advancements in large language models, such as ChatGPT, signal the imminent arrival of Artificial General Intelligence (AGI). Schmidhuber argues that many proponents of this view do not fully grasp the inherent limitations of these systems: while large language models excel at mimicking human-like conversation, they lack the true reasoning, creativity, and agency that are essential components of AGI. The belief that AGI is just around the corner may stem from underestimating the capabilities required for intelligence that extends beyond human-like chat.
