Robert Wright's Nonzero

How Does AI Work? (Robert Wright & Timothy Nguyen)

Jul 4, 2023
Chapters
1. Introduction (00:00 • 2min)
2. The Importance of Deep Learning in Machine Learning (01:36 • 3min)
3. The Reverse Engineering of Language Models (04:35 • 4min)
4. The Importance of Understanding in Language Models (08:22 • 3min)
5. The Godfather of AI (11:30 • 4min)
6. The Importance of Vectors in Word2Vec (15:05 • 2min)
7. The Effect of Vectors on Semantics (16:47 • 2min)
8. Vector Analysis for Word Associations (19:11 • 2min)
9. The Transformer Architecture for Large Language Model Success (21:06 • 2min)
10. The Complexity of Large Language Models (23:35 • 2min)
11. The Evolution of Neural Networks (25:56 • 5min)
12. The History of Image Recognition (30:55 • 3min)
13. How to Train Machine Learning Models (34:07 • 3min)
14. How ChatGPT and Bard Were Trained (36:56 • 2min)
15. The Mystery of Deep Learning (38:33 • 2min)
16. The Failure Modes of Machine Learning (40:41 • 2min)
17. The Importance of Context in Machine Learning (42:35 • 2min)
18. How to Calculate the Value of a Home in Terms of Various Properties (45:00 • 3min)
19. How to Model Real-World Phenomena (47:40 • 2min)
20. The Importance of Vectors in Words (49:34 • 4min)
21. The Importance of Byte Pair Encoding in Word Recognition (53:26 • 3min)
22. The Importance of Attention in Language Models (56:30 • 3min)
23. The Underlying Mechanism of Attention (59:16 • 3min)
24. The Hierarchical Understanding of Words (01:02:24 • 2min)
25. Transformer's Attention Mechanism (01:04:00 • 2min)
26. The Importance of Context-Dependent Word Embedding (01:05:45 • 3min)
27. Attention Is All You Need (01:08:44 • 2min)
28. How to Be a Successful Podcaster (01:10:52 • 2min)