
“Insofar As I Think LLMs ‘Don’t Really Understand Things’, What Do I Mean By That?” by johnswentworth


Scaling, Noticing, and Fixing Inconsistencies

John considers whether larger LLMs can assemble bigger internally consistent chunks of understanding, and whether they notice and correct their own inconsistencies the way humans do.
