I think both questions should be answered when we have this large compute and these large models as experiments, because there are a lot of surprises in these systems. In some sense it's all a giant surprise that transformers have scaled as well as they have. I will say, if it's a surprise, at least it's emergent behavior; most of the time we just let ourselves get surprised. But one thing we hope the model can get better at, it's less talked about, but we're hopeful to see more of it, is what the model can learn on the other side of in-context learning. You shouldn't read it as something to be worried about, because it's all out in public, he says.
