The Problem With Transformers
Each token then activates one neuron, right?

You're asking a question I'm not sure I know the answer to, because transformers are a bit trickier. A token either being there or not there will either activate or not activate a neuron, so it's more complicated with transformers. And that can't be all of it, because then if you put the exact same prompt into different instances, you'd get the exact same output, but you don't.
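The nondeterminism the speakers are gesturing at usually comes from the sampling step rather than the network itself: the forward pass maps a prompt to the same next-token distribution every run, and randomness enters when a token is drawn from that distribution. Below is a minimal Python sketch of temperature sampling; the logit values and the function name are hypothetical, chosen only to illustrate how identical prompts can still yield different outputs.

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, rng=None):
    """Sample one token id from a logit vector.

    The forward pass that produced `logits` is deterministic;
    the randomness enters here, at the sampling step.
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()                      # subtract max for numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return rng.choice(len(probs), p=probs)

# Identical "prompt" (hence identical logits), ten independent draws:
logits = [2.0, 1.5, 0.2, 0.1]                  # hypothetical next-token scores
runs = [sample_next_token(logits) for _ in range(10)]
print(runs)  # e.g. [0, 1, 0, 0, 1, 0, 2, 0, 1, 0] -- not identical across draws
```

With temperature near zero (greedy decoding), the highest-scoring token is picked every time, and the same prompt does reproduce the same output; in practice, systems that serve different instances with nonzero temperature will diverge, which matches the observation in the excerpt.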