
AI Chatbots

Overthink


Are 'Hallucinations' the Right Term for LLM Errors?

The hosts argue that 'hallucination' is a misleading term: LLMs predict text without perceiving anything, so invented citations are intrinsic to their design rather than occasional glitches.
