
Full steam ahead! Unraveling Forward-Forward Neural Networks (Ep. 232)

Data Science at Home


The Relevance of Forward-Forward Algorithms for Learning

Backpropagation makes the learning process much, much faster than the forward-forward algorithm. Forward-forward networks are slower than backpropagation networks, and with the same computational effort they learn less. The idea proposed by Geoffrey Hinton is that this would no longer be the case if we wanted an energy-efficient way to multiply activations by weight matrices: the only possible way would be to use some form of what he calls mortal computation.
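The forward-forward algorithm mentioned here trains each layer locally, without backpropagating errors through the network: a layer's "goodness" (sum of squared activations) is pushed up for positive data and down for negative data. Below is a minimal single-layer NumPy sketch of that local update on toy synthetic data; the data, threshold, and learning rate are illustrative assumptions, not Hinton's full recipe.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, W):
    # One ReLU layer; in forward-forward it is trained in isolation.
    return np.maximum(x @ W, 0.0)

def goodness(h):
    # Goodness of a layer: sum of squared activations per sample.
    return (h ** 2).sum(axis=1)

W = rng.normal(0.0, 0.1, size=(10, 4))
theta = 2.0   # goodness threshold (assumed hyperparameter)
lr = 0.03     # learning rate (assumed hyperparameter)

# Toy stand-ins for "positive" and "negative" data.
x_pos = rng.normal(+1.0, 1.0, size=(32, 10))
x_neg = rng.normal(-1.0, 1.0, size=(32, 10))

for step in range(200):
    for x, is_pos in ((x_pos, True), (x_neg, False)):
        h = layer(x, W)
        g = goodness(h)
        # Logistic probability that the layer calls this sample "positive".
        p = 1.0 / (1.0 + np.exp(-(g - theta)))
        # Gradient of the local logistic loss w.r.t. goodness:
        # push g up for positive data (p - 1 < 0), down for negative (p > 0).
        dL_dg = (p - 1.0) if is_pos else p
        # Local gradient only: dg/dh = 2h (zero where ReLU is inactive).
        dW = x.T @ (dL_dg[:, None] * 2.0 * h) / len(x)
        W -= lr * dW
```

After training, the layer assigns higher goodness to positive samples than to negative ones, which is all a forward-forward layer needs to do; no gradients ever cross layer boundaries.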

