
The future of computational linguistics
The Future of Hallucination
We can't go into the details of how these models are built, but they have seen incredible volumes of human-generated text. They seem to have learned a lot more than we would have expected, even about human relationships and how they work. So what is it not doing well that might not be in all the advertising material? It's commonly called hallucination. These models will just make stuff up. Now, you know, there are some humans that just make stuff up. We've probably all seen a few of them.