
The Great A.I. Hallucination
The Politics of Everything
The Importance of Compression Artifacts in Search Engines
The hallucinations that these models are prone to are a kind of compression artifact: the program is giving its best guess, sort of extrapolating, because it doesn't have the actual answer. A search engine will return something like "no results found," but ChatGPT is never at a loss for an answer. It has to transform the material in some way and generate new text to try to give you an answer, and that new text may have errors in it. And somehow the wrongness is essential to this form of, at least, fake originality.