The Importance of Understanding the Alignment Problem
Chris Olah's team has been working on mechanistic interpretability: understanding what is going on inside the giant, inscrutable matrices of floating-point numbers. Have they made enough progress? Well, you can try to quantify it by putting up a prediction market on whether, by 2026, we will have understood anything that goes on inside a giant transformer net that was not known to us in 2006.