
"Two-year update on my personal AI timelines" by Ajeya Cotra
LessWrong (Curated & Popular)
Is There Evidence for Efficient Meta-Learning?
I haven't yet seen evidence that language models can be taught new skills they definitely didn't know already over the course of many rounds of back-and-forth. I've also seen EfficientZero cited as evidence that SGD itself can reach human-level sample efficiency without the need for explicit meta-learning, but this doesn't seem right to me. EfficientZero reaches human-level performance on Atari games after roughly two hours of gameplay experience, but the relevant comparison is how much time it would take a human to "get" what's going on and what they're supposed to aim for. And that is not something it would take a human two hours of watching to figure out; probably more like 15 to 60 seconds.
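As a back-of-the-envelope sketch of that gap (assuming EfficientZero's commonly cited budget of about two hours of real-time gameplay on the Atari 100k benchmark, and the 15-to-60-second human estimate from the passage above):

# Illustrative arithmetic only: sample-efficiency gap between
# EfficientZero's gameplay budget and a human's time-to-"get"-the-game.
efficientzero_seconds = 2 * 60 * 60    # ~2 hours of gameplay = 7,200 s
human_low_s, human_high_s = 15, 60     # author's estimate for a human

print(f"gap: ~{efficientzero_seconds // human_high_s}x "
      f"to ~{efficientzero_seconds // human_low_s}x more experience")
# -> gap: ~120x to ~480x more experience

So even taking the two-hour figure at face value, EfficientZero needs on the order of a hundred times more experience than the human comparison point the author has in mind.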