Artificial Intelligence & Large Language Models: Oxford Lecture — #35

Manifold

CHAPTER

The Hallucination of the Models

This is an example of it generating plausible text that is not actually factually accurate. That phenomenon now has a name among people in this field: it's called hallucination of the models. The earlier models are not as good, and even GPT-4, which is the latest and greatest, still has a hallucination rate on the order of 20 to 30%. So if you ask it a detailed technical question, say, tell me the five greatest papers that Freeman Dyson wrote, it could easily make up three out of the five papers. But they will look very plausible, maybe to a non-expert.
