
147. Large language models (LLMs) - interview with Dr Guy Emerson

Education Bookcast


How to Overanthropomorphize a Language Model

The language models don't typically produce a reference; there's no set sense of where the text comes from, and this is why people use the word "hallucination". You can ask a model to produce lines from books, for example: what's the first line of George Orwell's 1984? I think it's actually very tempting, but very dangerous, to anthropomorphize these systems. If we say "ask the language model", it makes it sound like you're talking to a person, and I think that gives a very misleading intuition about what it's doing.
