AXRP - the AI X-risk Research Podcast

41 - Lee Sharkey on Attribution-based Parameter Decomposition

00:00

Exploring Compressed Computation in Neural Networks

This chapter explores compressed computation in neural networks, emphasizing how a network can compute more functions than it has neurons. The discussion covers the mechanics of ReLU activations, parameter decomposition, and how limiting the number of simultaneously active input features makes this compression possible. The speakers also reflect on the distinction between compressed representation and compressed computation as underlying mechanisms, noting that further investigation and model refinement are needed.
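To make the "more functions than neurons" setup concrete, here is a minimal sketch of a toy residual-MLP of the kind discussed in this line of work: the network must approximate a separate target function (here, hypothetically, y_i = x_i + ReLU(x_i)) for each of 100 input features using only 50 hidden neurons. The exact architecture and target functions in the episode may differ; the weights below are random stand-ins, so only the shapes and the counting argument are illustrated, not a trained solution.

```python
import numpy as np

rng = np.random.default_rng(0)

n_features = 100   # number of input features / target functions to compute
d_mlp = 50         # number of hidden neurons -- fewer than n_features

# Random weights stand in for trained ones; only the shapes matter here.
W_in = rng.normal(size=(d_mlp, n_features)) / np.sqrt(n_features)
W_out = rng.normal(size=(n_features, d_mlp)) / np.sqrt(d_mlp)

def forward(x):
    """One residual step: the input plus a single ReLU MLP's output."""
    return x + W_out @ np.maximum(0.0, W_in @ x)

x = rng.normal(size=n_features)
y = forward(x)
assert y.shape == (n_features,)
assert d_mlp < n_features  # more target functions than hidden neurons
```

The key point the chapter makes is visible in the shapes alone: with sparse inputs (few features active at once), a trained network of this form can approximate all 100 target functions despite having only 50 neurons, which is what makes its parameters a natural target for decomposition methods.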
