
The Fallacy of "Ground Truth" with Shayan Mohanty - #576

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)


Weak Supervision vs. Active Learning

The idea is that as long as you've sampled the data in a way where it's representative of the larger whole, then the functions you've created are still representative. And then it doesn't really matter how much data you're passing into the system. You can imagine that weak supervision, as a side effect, is very, very good at labeling the head of a distribution, where there might be a lot of commonality. But as you get further and further into the long tail, it becomes harder and harder.
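For context only, not something shown in the episode: weak supervision in this sense usually means writing labeling functions, small heuristics that each vote on a label or abstain, and combining their votes over a data sample. The minimal Python sketch below is hypothetical (all function names, labels, and example texts are invented for illustration); it shows how such heuristics tend to cover the common "head" of a distribution while long-tail examples simply fall through as abstentions.

# Illustrative sketch (assumed, not from the episode): weak supervision via
# simple labeling functions combined by majority vote. All names are hypothetical.
from collections import Counter

ABSTAIN, SPAM, NOT_SPAM = -1, 1, 0

# Each labeling function encodes one heuristic; it votes on a label or abstains.
def lf_contains_offer(text):
    return SPAM if "free offer" in text.lower() else ABSTAIN

def lf_contains_unsubscribe(text):
    return SPAM if "unsubscribe" in text.lower() else ABSTAIN

def lf_short_greeting(text):
    return NOT_SPAM if text.lower().startswith(("hi", "hello")) else ABSTAIN

LABELING_FUNCTIONS = [lf_contains_offer, lf_contains_unsubscribe, lf_short_greeting]

def weak_label(text):
    """Combine labeling-function votes by majority, ignoring abstentions."""
    votes = [lf(text) for lf in LABELING_FUNCTIONS if lf(text) != ABSTAIN]
    if not votes:
        return ABSTAIN  # a long-tail example that no heuristic covers
    return Counter(votes).most_common(1)[0][0]

# Common "head" examples get labeled; the last text matches no heuristic
# (the long tail) and comes back as ABSTAIN, so it still needs human attention.
for text in [
    "Hello, are we still on for lunch?",
    "FREE OFFER!!! Click now, unsubscribe later",
    "Quarterly depreciation schedule attached",
]:
    print(weak_label(text), text)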
