
Machine Learning Street Talk (MLST)

Jonas Hübotter (ETH) - Test Time Inference

Dec 1, 2024
Jonas Hübotter, a PhD student at ETH Zurich specializing in machine learning, discusses his research on test-time computation. He explains how smaller models can be up to 30x more efficient than larger ones by strategically allocating compute during inference. Drawing an analogy to Google Earth's dynamic resolution, he discusses blending inductive and transductive learning. Hübotter envisions future AI systems that adapt and learn continuously, advocating hybrid deployment strategies built around intelligent resource management.
01:45:56

Podcast summary created with Snipd AI

Quick takeaways

  • Jonas Hübotter's research reveals that smaller models can outperform larger ones by utilizing efficient test-time computation techniques.
  • The podcast emphasizes the potential of hybrid deployment strategies that blend local and cloud computation based on task complexity.

Deep dives

Data Conflicts in Context Learning

When information in the retrieved context conflicts with what a model learned during pre-training, predictions can become unpredictable. The discussion argues for intelligent systems that adapt to their environment at inference time while still meeting their core objectives. In particular, retrieval should account for the relationships between data points, rather than scoring each candidate in isolation, so that the selected context actually improves predictions. This highlights the need for a structured understanding of how interactions among data points shape what the model learns from its context.
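The episode summary does not spell out a concrete algorithm, but one standard way to make retrieval account for relationships among candidates is maximal marginal relevance (MMR): each pick trades off similarity to the query against redundancy with points already selected. The sketch below is illustrative only; `mmr_select`, the toy vectors, and the trade-off weight `lam` are assumptions for this example, not Hübotter's method.

```python
import math

def mmr_select(query, candidates, k=3, lam=0.7):
    """Greedy maximal-marginal-relevance selection.

    Each iteration picks the candidate maximizing
    lam * sim(query, c) - (1 - lam) * max_sim(c, already_selected),
    so selection considers relationships among data points, not just
    per-candidate relevance.
    """
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb + 1e-12)

    remaining = list(range(len(candidates)))
    selected = []
    while remaining and len(selected) < k:
        def score(i):
            relevance = cos(query, candidates[i])
            # Redundancy = similarity to the closest already-selected point.
            redundancy = max((cos(candidates[i], candidates[j]) for j in selected),
                             default=0.0)
            return lam * relevance - (1 - lam) * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

With a small `lam`, a near-duplicate of an already-selected point is skipped in favor of a less similar but more informative one; with `lam` near 1 the selection reduces to plain nearest-neighbor retrieval.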
