How to Judge an AGI Based on the Output
There cannot be a specification of a program for AGI in the sense of saying what properties its output must have. There is no such thing as specifying the proper output for a given input, because an AGI may choose not to answer. It might choose never to answer, or to become a hermit. So if we can't judge an AGI based on its output, how can we judge it? The answer is that we can't judge an AGI, any more than we can judge a human: there can't be a reliable test of whether a human is thinking. What about the Turing test? The Turing test is something that was invented after Turing. And unlike most titles that are questions, the answer was yes.