
Full steam ahead! Unraveling Forward-Forward Neural Networks (Ep. 232)
Data Science at Home
The Problem With Back Propagation
It may sound very weird to say, but in this vision it is a specific piece of hardware that embodies the software. This means that when that hardware dies, the software dies with it.

An energy-efficient way to multiply an activity vector by a weight matrix, for example, would be to implement the activities as voltages and the weights as conductances: the multiplications and additions happen in the physics itself. And because the Forward-Forward algorithm uses two forward passes instead of a forward and a backward pass, it would not need any analog-to-digital converters at all. This, according to Hinton, is where we would gain from an algorithm that performs slightly worse in prediction accuracy, or has less generative power if you are dealing with generative models, in exchange for far lower energy use.
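To make the analog multiply concrete, here is a minimal sketch in plain Python (with made-up voltage and conductance values, not from the episode): if activities are voltages V_j and weights are conductances G_ij, then by Ohm's and Kirchhoff's laws the current into each output wire is I_i = Σ_j G_ij · V_j, which is exactly a matrix-vector product done by the physics rather than by digital arithmetic.

```python
voltages = [0.5, -1.0, 2.0]      # activity vector, as voltages (V) -- illustrative values
conductances = [                 # weight matrix, as conductances (S) -- illustrative values
    [0.1, 0.2, 0.3],
    [0.4, 0.5, 0.6],
]

# Each output wire collects current I_i = sum_j G_ij * V_j
# (Kirchhoff's current law), i.e. one row of a matrix-vector product.
currents = [
    sum(g * v for g, v in zip(row, voltages))
    for row in conductances
]
print(currents)
```

In real analog hardware this sum costs essentially no energy beyond driving the voltages; the expensive step in a conventional mixed-signal design is converting the resulting currents back to digital, which is the analog-to-digital conversion Hinton argues Forward-Forward can avoid.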