

Geoff Hinton on revolutionizing artificial intelligence... again
Jun 1, 2022
Chapters
Introduction
00:00 • 3min
What Are Neural Nets? And Why Should We Care?
03:09 • 2min
Is Backpropagation Better Than What the Brain Is Doing?
05:37 • 2min
Is the Brain Using Local Objective Functions?
07:19 • 2min
The SimCLR Paper and Contrastive Learning
09:36 • 2min
Is Backpropagation a Good Mechanism for Perceptual Learning?
11:20 • 3min
Is There a Way to Distill Knowledge From One Location to Another?
14:14 • 3min
Is It Just an Engineering Difference?
17:26 • 6min
Is the Retina a Spiking Neuron?
23:24 • 4min
The Importance of Spiking Neural Nets
27:29 • 6min
Applying Convolutional Nets to a Big Data Set
32:59 • 2min
What Triggered It?
34:34 • 4min
The End of My Principles
38:23 • 3min
How Do You Go From a Computer Science Degree to Carpentry?
40:58 • 3min
How to Do Recursion With Neural Networks
44:27 • 4min
How Did Turing and von Neumann Die Early?
48:10 • 2min
Deep Learning, Is It All We Need?
50:20 • 5min
What Do You Think About Large Language Models?
55:45 • 4min
Are You Getting It Right?
01:00:13 • 2min
Sleep Serves a Computational Function
01:02:11 • 2min
Using Contrastive Learning, You Can Separate the Positive and Negative Phases
01:04:39 • 5min
Can You Learn From That?
01:09:09 • 4min
How to Get the Right Labels in the Classroom
01:13:16 • 2min
Neural Net Learning
01:14:55 • 5min
Using t-SNE to Find the Distances of Galaxies
01:19:39 • 3min