'Simulators' by Janus

LessWrong (Curated & Popular)
GPT

GPT-3 doesn't consistently try to say true or correct things. It may give answers if the prompt asks questions, but it may also simply elaborate on the prompt without answering any question. Spouting falsehoods in some circumstances is incentivised by GPT's outer objective: being realistic means predicting humans faithfully even when they are likely to be wrong. That said, GPT does store a vast amount of knowledge, and its corrigibility allows it to be cajoled into acting as an oracle, just as it can be cajoled into acting like an agent.