Machine Learning Street Talk (MLST)

Jonas Hübotter (ETH) - Test Time Inference

Dec 1, 2024
Jonas Hübotter, a PhD student at ETH Zurich specializing in machine learning, discusses his research on test-time computation. He explains how smaller models can be up to 30x more efficient than larger ones by strategically allocating compute during inference. Drawing a parallel to Google Earth's dynamic resolution, he examines the interplay of inductive and transductive learning. Hübotter envisions future AI systems that adapt and learn continuously, and advocates hybrid deployment strategies built on intelligent resource management.
ANECDOTE

Google Earth Analogy

  • Tim Scarfe and Jonas Hübotter use Google Earth's variable resolution as an analogy for efficient computation.
  • Just as Google Earth renders detail only where the user zooms in, AI systems can allocate more compute to the hardest tasks.
INSIGHT

Nearest Neighbor Limitations

  • Nearest neighbor search is insufficient for local learning because it prioritizes similarity over informativeness.
  • It tends to retrieve redundant information, failing to gather the diverse knowledge that complex tasks require.
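A toy sketch of this failure mode (hypothetical 2-D embeddings, not from the episode): when a corpus contains near-duplicates, similarity-only retrieval fills every slot with copies of the same information and never reaches a complementary point the query also touches.

```python
import numpy as np

# Hypothetical embeddings: three near-duplicates on one topic plus one
# complementary point on another; the query touches both topics.
corpus = np.array([
    [1.00, 0.00],  # topic A, duplicate 1
    [0.99, 0.01],  # topic A, duplicate 2
    [0.98, 0.02],  # topic A, duplicate 3
    [0.00, 1.00],  # topic B, complementary
])
query = np.array([0.6, 0.4])

def top_k_nearest(query, corpus, k):
    # Rank purely by cosine similarity to the query.
    sims = corpus @ query / (np.linalg.norm(corpus, axis=1) * np.linalg.norm(query))
    return sorted(np.argsort(-sims)[:k].tolist())

print(top_k_nearest(query, corpus, k=3))  # -> [0, 1, 2]: only the topic-A
# duplicates; the complementary topic-B point (index 3) is never retrieved.
```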
ADVICE

Improving Information Retrieval

  • Use a tractable surrogate model to estimate uncertainty and guide information retrieval.
  • Minimize uncertainty by selecting data points that are both relevant and non-redundant.
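One way to sketch this advice (an assumed formulation, not code from the episode) is a Gaussian-process surrogate: greedily pick the point that most reduces the predictive variance at the query. Because a second near-duplicate barely shrinks the query's uncertainty, the selection naturally trades redundancy for complementary information.

```python
import numpy as np

# Hypothetical embeddings: three near-duplicates on one topic plus one
# complementary point; the query needs information from both topics.
corpus = np.array([
    [1.00, 0.00],  # topic A, duplicate 1
    [0.99, 0.01],  # topic A, duplicate 2
    [0.98, 0.02],  # topic A, duplicate 3
    [0.00, 1.00],  # topic B, complementary
])
query = np.array([[0.6, 0.4]])

def rbf(A, B, ls=0.5):
    # Squared-exponential kernel for the GP surrogate.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ls ** 2))

def greedy_select(query, corpus, k, noise=0.1):
    """Greedily add the point that most reduces GP posterior variance
    at the query, so relevant AND non-redundant points win."""
    selected = []
    for _ in range(k):
        best_i, best_var = None, np.inf
        for i in range(len(corpus)):
            if i in selected:
                continue
            X = corpus[selected + [i]]
            K = rbf(X, X) + noise * np.eye(len(X))
            kq = rbf(X, query)
            # Posterior variance at the query after conditioning on X.
            var = rbf(query, query) - kq.T @ np.linalg.solve(K, kq)
            if var[0, 0] < best_var:
                best_i, best_var = i, var[0, 0]
        selected.append(best_i)
    return selected

print(greedy_select(query, corpus, k=2))  # -> [2, 3]: one topic-A point,
# then the complementary topic-B point instead of another duplicate.
```

The kernel length scale and noise level here are illustrative; the point is only that uncertainty minimization, unlike pure similarity ranking, stops rewarding duplicates once one representative is selected.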