Francois Chollet, Mike Knoop - LLMs won’t lead to AGI - $1,000,000 Prize to find true solution

Dwarkesh Podcast

Limitations of Language Models and the Need for Compression

Language models have a fixed size and a fixed number of parameters, which forces them to compress their training data into reusable program fragments. This compression is what produces generalization: the model expresses new tasks by recombining fragments it already stores. The intrinsic limitation is that the model is a parametric curve fit by gradient descent, which restricts it to local generalization around its training distribution. Achieving broader generalization requires a different kind of model, such as discrete program search (program synthesis).
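To make the contrast concrete, here is a minimal sketch of discrete program search over a toy DSL. Everything in it is a hypothetical illustration, not from the episode: the primitives, the `search` helper, and the examples are invented to show the idea that search composes discrete operations until one fits the data exactly, rather than interpolating a parametric curve.

```python
# Minimal sketch of discrete program search (hypothetical toy DSL, not
# Chollet's actual system): enumerate compositions of primitive operations
# until one reproduces all of the input-output examples.
from itertools import product

# Tiny DSL of integer -> integer primitives (illustrative only).
PRIMITIVES = {
    "inc": lambda x: x + 1,
    "dec": lambda x: x - 1,
    "double": lambda x: x * 2,
    "negate": lambda x: -x,
}

def search(examples, max_depth=3):
    """Return the shortest primitive composition consistent with all examples."""
    for depth in range(1, max_depth + 1):
        for names in product(PRIMITIVES, repeat=depth):
            def program(x, names=names):
                for name in names:
                    x = PRIMITIVES[name](x)
                return x
            # A candidate counts only if it matches every example exactly.
            if all(program(i) == o for i, o in examples):
                return names
    return None

# Two demonstrations of f(x) = 2 * (x + 1). The search recovers the exact
# program, which then generalizes to any integer, not just nearby inputs.
print(search([(1, 4), (3, 8)]))  # -> ('inc', 'double')
```

The key property the sketch illustrates: the found program is a discrete, exact object that applies far outside the examples it was fit to, whereas a parametric curve fit to the same two points would only interpolate locally.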
