Do We End Up Persistently Building Better Transformers?
I think more people working on something equals more creativity, potentially. We still have specialized parts of our body that do certain functions, not just the brain. There are a lot of limits that really mean that, you know, for many, many tasks... like, you can do autocomplete for code, but you can only do it for one function. You can't write a whole file based on some spec, because that's bigger than the output. And the context window is limited, so you can't put everything, all the input, into the brain of a transformer. So I think we'll see new things and extensions of, er, probably combinations of existing ideas.
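
As a rough illustration of the context-window limit the speaker describes, here is a minimal sketch in Python. The fit_to_context function, the whitespace "tokenizer", and the 4096/512 token budgets are assumptions for illustration only; real systems use the model's own subword tokenizer and model-specific limits.

```python
# Minimal sketch of the context-window constraint mentioned above.
# Assumptions: a whitespace split stands in for a real subword tokenizer,
# and the budgets below are arbitrary example values.

CONTEXT_BUDGET = 4096   # total tokens the model can attend to (example value)
OUTPUT_BUDGET = 512     # tokens reserved for the model's reply (example value)

def count_tokens(text: str) -> int:
    """Crude token count; a real system would use the model's tokenizer."""
    return len(text.split())

def fit_to_context(chunks: list[str]) -> list[str]:
    """Keep only as many input chunks as fit under the remaining budget,
    illustrating why 'everything' can't flow into the transformer at once."""
    budget = CONTEXT_BUDGET - OUTPUT_BUDGET
    kept, used = [], 0
    for chunk in chunks:
        n = count_tokens(chunk)
        if used + n > budget:
            break  # the rest of the input simply doesn't fit
        kept.append(chunk)
        used += n
    return kept

if __name__ == "__main__":
    files = ["def f(): ... " * 50, "def g(): ... " * 200, "def h(): ... " * 500]
    print(f"{len(fit_to_context(files))} of {len(files)} chunks fit in the window")
```

The point of the sketch is simply that input beyond the budget has to be dropped or summarized, which is the limitation behind "you can't write a whole file based on some spec".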