The Importance of Compression Artifacts in Search Engines
The hallucinations that these models are prone to are a kind of compression artifact: the program is giving its best guess, sort of extrapolating, because it doesn't have the actual answer. A search engine will return things like "no results found," but ChatGPT is never at a loss for an answer. It has to transform its source material in some way and generate new text to try to give you an answer, and that new text may have errors in it. And somehow the wrongness is essential to this form of originality, even if it's a fake one.
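To make the analogy concrete, here is a minimal Python sketch (not from the episode): a signal is "compressed" by throwing away most of its samples, and the "decompressor" answers queries by interpolating over what remains. Like the model described above, it is never at a loss for an answer, even about points it never stored; its answer is a plausible guess with a small error, the analogue of a hallucination. The signal and query values here are illustrative assumptions.

```python
import numpy as np

# Ground truth: exact samples of some underlying signal.
x = np.linspace(0, 10, 101)
truth = np.sin(x)

# Lossy "compression": keep only every 10th sample; the rest is discarded.
kept_x = x[::10]
kept_y = truth[::10]

# "Decompression": answer a query about a point that was never stored
# by interpolating between the samples we kept.
query = 3.7
guess = np.interp(query, kept_x, kept_y)   # always returns *something*
actual = np.sin(query)

print(f"interpolated guess: {guess:.4f}")
print(f"actual value:       {actual:.4f}")
print(f"artifact (error):   {abs(guess - actual):.4f}")
```

The key property is that the reconstruction never says "no results found": it always produces an answer, and the gap between that answer and the truth is exactly the compression artifact the speaker is describing.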