The Future of Language Models
The trend towards "bigger is better" may still pay off. It's hard for me to believe that data won't be some sort of moat. With the rise of foundation models, though, it's less of a moat, because everyone has access to Common Crawl, and the papers themselves don't actually use that much data, so I think anyone could just go collect it.