
Anonymous advice: If you want to reduce AI risk, should you take roles that advance AI capabilities? (Article)
80k After Hours
The Net Effect of Capabilities Work on AGI
The simple take, that capabilities work brings AGI closer and that this is bad because of existential risk, is probably directionally correct on average. Capabilities work can also have an indirect effect on AGI timelines by encouraging others to either (a) invest more in capabilities work on AGI bottlenecks, or (b) spend more on training large models, accelerating the spending trajectory that eventually results in AGI. Make sure your work isn't contributing to the destruction of humanity, or don't do the work. Try not to fall for paper-thin excuses about far-flung dreams of alignment relevance. Either way, you're going to be working with people who are trying to make a living out of AI.