Weak Supervision vs. Active Learning
The idea is that as long as you've sampled the data in a way that's representative of the larger whole, then the functions you've created are still representative, and it doesn't really matter how much data you're passing into the system. You can imagine that weak supervision, almost as a side effect, is very good at labelling the head of a distribution, where there's a lot of commonality, but as you get further and further into the long tail, it becomes harder and harder.
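A minimal sketch of the point being made, not code from the episode: weak supervision typically works by writing cheap heuristic labelling functions, which fire often on common ("head") examples and rarely on long-tail ones. All names, data, and rules below are illustrative assumptions.

```python
# Toy illustration of labelling-function coverage on head vs. tail examples.
POSITIVE, NEGATIVE, ABSTAIN = 1, 0, -1

# Hypothetical labelling functions: cheap heuristics that vote or abstain.
def lf_contains_great(text):
    return POSITIVE if "great" in text.lower() else ABSTAIN

def lf_contains_terrible(text):
    return NEGATIVE if "terrible" in text.lower() else ABSTAIN

LABELING_FUNCTIONS = [lf_contains_great, lf_contains_terrible]

# "Head" examples use the common phrasing the heuristics were written for;
# "tail" examples express the same sentiment in rarer ways the rules miss.
head_examples = ["This product is great", "Terrible experience, avoid"]
tail_examples = ["Exceeded my wildest hopes", "I would sooner eat glass than reorder"]

def coverage(examples):
    """Fraction of examples where at least one labelling function votes."""
    labeled = sum(
        any(lf(x) != ABSTAIN for lf in LABELING_FUNCTIONS) for x in examples
    )
    return labeled / len(examples)

print(f"head coverage: {coverage(head_examples):.0%}")  # heuristics fire often
print(f"tail coverage: {coverage(tail_examples):.0%}")  # heuristics rarely fire
```

Running this prints 100% coverage on the head examples and 0% on the tail ones, which is the gap the speaker is describing: the further you move into the long tail, the less your hand-written functions cover.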