"An artificially structured argument for expecting AGI ruin" by Rob Bensinger

The Evolution of Goal-Oriented Systems

Minds that try to steer toward specific world-states are a relatively simple and obvious way to do sufficiently hard things. This is also empirically what happened when evolution built scientific reasoners. Insofar as we can think of evolution as an optimization process, it was optimizing neither for "build goal-oriented systems" nor for "build STEM workers". Our cognitive generality and goal-oriented optimization then resulted in us becoming good at STEM further down the road, with no additional evolutionary optimization of our brains for STEM.
