147. Large language models (LLMs) - interview with Dr Guy Emerson

Education Bookcast

How to Overanthropomorphize a Language Model

Language models don't typically produce a reference; there's no fixed sense of what's true, and this is why people use the word "hallucination". You can ask one to produce lines from books: you can ask, for example, "What's the first line of George Orwell's 1984?" I think it's very tempting but very dangerous to anthropomorphize these systems. If we say "ask the language model", it makes it sound like you're talking to a person, and I think that gives a very misleading intuition about what it's doing.
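
The point about generation versus retrieval is easy to demonstrate. Below is a minimal sketch using the Hugging Face transformers library and the small gpt2 model; the choice of library, model, and prompt are illustrative assumptions, not anything used in the episode. Sampling the same prompt several times yields different continuations, because the model draws tokens from a probability distribution rather than looking the line up in the book.

```python
# Minimal sketch (assumption: Hugging Face transformers + gpt2,
# not a method discussed in the episode).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "The first line of George Orwell's 1984 is:"

# Sample three continuations. Each is generated token by token from
# the model's distribution, not retrieved from the novel, so the
# outputs vary and may confidently misquote the book.
for out in generator(prompt, max_new_tokens=20, do_sample=True,
                     num_return_sequences=3):
    print(out["generated_text"])
```

Run repeatedly, the outputs disagree with each other and usually with the actual opening line, which is the behaviour the word "hallucination" is gesturing at.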
