Compression of a Large Volume of Text
The question is whether being forced to compress a very large amount of text forces you to construct things that are very much like concepts and meaning, because ideas, concepts, and meaning are a spectrum. In order to achieve that kind of compression, maybe the model will be forced to figure out abstractions, which look an awful lot like common sense or world models. Is that possible? No, I don't think it is possible, because the information is not there. The information is behind the text - unless somebody has written down all the details about how everything works in the world. That's an argument that text is much lower fidelity than the experience of our physical world.