Language Modeling - A Massive Multitask Problem
Where I would like to see the AI community get to is to start sharing a substrate, a neural substrate, some kind of very large general function approximator, likely a transformer right now, and then actually continue training that one model to get better and better over time at more and more different tasks. Then we stop restarting all our projects one at a time, and instead become agglomerative and cumulative, in a similar way to how we already are with intellectual ideas and papers.
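As a rough illustration of what "continuing to train one shared model" could look like in practice, here is a minimal sketch of loading a shared checkpoint and fine-tuning it on the next task rather than starting from scratch. The checkpoint name, task data, and output path are placeholders, not anything from the episode.

```python
# Minimal sketch of the "shared substrate" idea: load a community checkpoint
# and keep training it on the next task instead of restarting from scratch.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

substrate = "gpt2"  # stand-in for whatever shared checkpoint the community converges on
tokenizer = AutoTokenizer.from_pretrained(substrate)
model = AutoModelForCausalLM.from_pretrained(substrate)

# Hypothetical examples for the next task to be folded into the shared model
new_task_texts = ["example input and target text for the new task"]
tokenizer.pad_token = tokenizer.eos_token
batch = tokenizer(new_task_texts, return_tensors="pt", padding=True, truncation=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
outputs = model(**batch, labels=batch["input_ids"])  # causal LM loss on the new task
outputs.loss.backward()
optimizer.step()

# The updated weights, not just the paper, become the shared artifact
model.save_pretrained("substrate-after-new-task")
tokenizer.save_pretrained("substrate-after-new-task")
```

In this sketch the saved checkpoint is what gets passed along to the next project, which is the cumulative workflow the speaker is describing.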