Connor: I think the term "large language models" is kind of a misnomer, or just not a good term. These are general systems that can take in input from various modalities, encoded into some kind of semantic space, and do cognitive operations on it. This is the same way the human brain encodes stimuli into a common representation of neural spike trains. And similarly, what we're seeing with these GPT plugins and whatever is that we're hooking up muscles to the neural spike trains of these language models. We are giving them actuators, virtual actuators upon reality. So this is interesting both for the way in which they can interact with the environment, but also how
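As an illustrative sketch only (not from the episode): the "actuators" idea Connor describes roughly maps onto a tool-calling loop, where the model emits a structured action and a plugin layer executes it against the outside world. The tool names, the fake_model stub, and the JSON action format below are assumptions made for illustration, not any particular plugin API.

import json

# Hypothetical "actuators": plain functions the model is allowed to invoke.
TOOLS = {
    "search_web": lambda query: f"(stub) top results for {query!r}",
    "send_email": lambda to, body: f"(stub) email sent to {to}",
}

def fake_model(prompt: str) -> str:
    # Stand-in for a real LLM call; returns a JSON "action" the way a
    # plugin-enabled model might.
    return json.dumps({"tool": "search_web", "args": {"query": prompt}})

def run_with_actuators(prompt: str) -> str:
    # One step of the loop: model output -> parsed action -> tool call.
    action = json.loads(fake_model(prompt))
    tool = TOOLS[action["tool"]]
    return tool(**action["args"])

if __name__ == "__main__":
    print(run_with_actuators("papers on multimodal semantic spaces"))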
