
David Chalmers: Are Large Language Models Conscious?

Theories of Everything with Curt Jaimungal

Exploring Consciousness and Language Models

This chapter examines how large language models function without traditional memory structures and how they relate to theories of consciousness. It discusses model architectures, including feed-forward and recurrent networks, and highlights ongoing research into whether these systems contain world models.

