Chapters
Introduction
00:00 • 4min
The Functional Unit of the Brain Is Not the Neuron
04:18 • 2min
The Laplace Transform and Its Inverse: What Is It?
06:25 • 2min
A New Thing, You Know?
08:35 • 2min
What Is a Laplace Transform?
10:58 • 2min
How Do Time Cells Work?
12:33 • 4min
A Laplace Transform as a Function of Time
16:45 • 2min
A Laplace Transform in the Entorhinal Cortex
18:54 • 3min
The Advantage of Having a Log Scale
21:49 • 3min
Is There a Memory Across the Brain?
25:15 • 4min
How Does This Relate to Episodic Memory?
29:25 • 3min
How to Make Episodic Memory
32:06 • 3min
Is It Possible to Store a Memory in the Cell?
35:12 • 2min
What's a Number?
37:11 • 3min
The Laplace Domain in a Recurrent Neural Network
40:31 • 2min
Is There a Log Distribution of Time Constants?
42:55 • 2min
Is It Possible to Train Convolutional Networks on a Problem?
44:59 • 3min
The Theory of Reinforcement Learning
48:05 • 3min
I Think the Transformers Are Like an RNN?
51:00 • 3min
The Scaling Factors
54:02 • 2min
Is There a Solution Given by the Equations?
56:14 • 2min
How Long Does a Sequence Last?
58:15 • 2min
A Question From Brad Wyble
01:00:38 • 2min
A Fantastic Question From Brad Wyble
01:02:24 • 2min
Is There Replay in the Hippocampus?
01:04:17 • 4min
Complementary Learning Systems Theory and a Laplace Mechanism
01:07:51 • 3min
The Laplace Model of Memory
01:10:57 • 4min
Do You Need to Be Able to Change More Quickly?
01:15:11 • 2min
Is the Deep Network a Theory of the Brain?
01:17:05 • 3min