
"Why I think strong general AI is coming soon" by Porby
LessWrong (Curated & Popular)
The Future of Machine Learning
An attention-free RNN can apparently match transformers at similar scales. Transformers appear to have taken off not because they are uniquely capable, but because they came relatively early and were easy to train in a parallelizable way. If it turns out that there are many paths to current levels of capability or beyond, as currently looks likely, it's much harder for machine learning progress to stall soon enough to matter. One research path might die, but another five take its place.
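The point about parallelizable training can be made concrete with a toy sketch (my illustration, not from the episode): a linear recurrence, the core of some attention-free token mixers, can be unrolled into a weighted sum over the prefix, so every timestep's output can be computed independently rather than step by step. The function names and decay constant below are hypothetical.

```python
import math

def recurrent_sequential(xs, a):
    """Plain RNN-style loop h_t = a * h_{t-1} + x_t: one step at a time."""
    h = 0.0
    out = []
    for x in xs:
        h = a * h + x
        out.append(h)
    return out

def recurrent_parallel(xs, a):
    """Same recurrence unrolled: h_t = sum_{s<=t} a^(t-s) * x_s.
    Each output depends only on the input prefix, so all timesteps
    can be computed independently (i.e. in parallel on real hardware)."""
    return [sum(a ** (t - s) * xs[s] for s in range(t + 1))
            for t in range(len(xs))]

xs = [1.0, -2.0, 0.5, 3.0]
seq = recurrent_sequential(xs, 0.9)
par = recurrent_parallel(xs, 0.9)
assert all(math.isclose(u, v) for u, v in zip(seq, par))
```

The two functions compute identical outputs; the second form is why recurrences of this shape need not sacrifice the training parallelism that made transformers easy to scale.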