
The Fallacy of "Ground Truth" with Shayan Mohanty - #576

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

NOTE

Weak Supervision vs. Active Learning

The idea is that as long as you've sampled the data in a way where it's representative of the larger whole, then the functions you've created are still representative. And then it doesn't really matter how much data you're passing into the system. You can imagine that weak supervision, as a side effect, is very, very good at labelling the head of a distribution, where there might be a lot of commonality. But as you get further and further into the long tail, it becomes harder and harder.
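
To make the head-vs-tail point concrete, here is a minimal, hypothetical sketch of weak supervision with labeling functions (not code from the episode or any specific library). Each function encodes a cheap heuristic that votes on an example or abstains, and the votes are combined by simple majority. Head examples tend to be covered by several heuristics; long-tail examples often draw no votes at all, which is exactly where coverage breaks down.

```python
# Hypothetical weak-supervision sketch: heuristic labeling functions plus
# majority-vote aggregation. All function names and data are illustrative.
from collections import Counter

ABSTAIN, NEGATIVE, POSITIVE = -1, 0, 1

# Cheap heuristics written over the raw text (assumed example task:
# flagging customer messages that need a refund follow-up).
def lf_contains_refund(text):
    return POSITIVE if "refund" in text.lower() else ABSTAIN

def lf_contains_thanks(text):
    return NEGATIVE if "thanks" in text.lower() else ABSTAIN

def lf_short_message(text):
    return NEGATIVE if len(text.split()) < 4 else ABSTAIN

LABELING_FUNCTIONS = [lf_contains_refund, lf_contains_thanks, lf_short_message]

def weak_label(text):
    """Aggregate labeling-function votes by majority, ignoring abstentions."""
    votes = [lf(text) for lf in LABELING_FUNCTIONS if lf(text) != ABSTAIN]
    if not votes:
        return ABSTAIN  # long-tail example: no heuristic fires
    return Counter(votes).most_common(1)[0][0]

examples = [
    "I want a refund for this order",   # head: refund heuristic fires
    "thanks, all good",                 # head: two heuristics agree
    "the device hums when it rains",    # tail: nothing fires -> ABSTAIN
]

for text in examples:
    print(f"{weak_label(text):>2}  {text}")
```

As the quote suggests, the quality of the result depends less on how many examples you push through this pipeline and more on whether the sample (and therefore the heuristics written against it) is representative; the abstentions on tail examples are where active learning or manual labeling would still have to step in.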

