The Problem With Emergent Deception in AGI Communications
Communication between agents, both organic and computational, is key to alignment. Conflicts between agents are inevitable, and one way for an agent or a group of agents to keep those conflicts limited is to deceive the other agents so that the conflicts never get the chance to escalate further. This emergent deception is one issue that I believe needs to be resolved before we can fully rely on communication between agents as a solution approach to the alignment problem.