Brain Inspired

BI 139 Marc Howard: Compressed Time and Memory

Jun 20, 2022
Chapters
1
Introduction
00:00 • 4min
2
The Functional Unit of the Brain Is Not the Neuron
04:18 • 2min
3
The Laplace Transform and Its Inverse: What Is It?
06:25 • 2min
4
A New Thing, You Know?
08:35 • 2min
5
What Is a Laplace Transform?
10:58 • 2min
6
How Do Time Cells Work?
12:33 • 4min
7
A Laplace Transform as a Function of Time
16:45 • 2min
8
A Laplace Transform in the Entorhinal Cortex
18:54 • 3min
9
The Advantage of Having a Log Scale
21:49 • 3min
10
Is There a Memory Across the Brain?
25:15 • 4min
11
How Does This Relate to Episodic Memory?
29:25 • 3min
12
How to Make Episodic Memory
32:06 • 3min
13
Is It Possible to Store a Memory in the Cell?
35:12 • 2min
14
What's a Number?
37:11 • 3min
15
The Laplace Domain in a Recurrent Neural Network
40:31 • 2min
16
Is There a Log Distribution of Time Constants?
42:55 • 2min
17
Is It Possible to Train Convolutional Networks on This Problem?
44:59 • 3min
18
The Theory of Reinforcement Learning
48:05 • 3min
19
Are Transformers Like an RNN?
51:00 • 3min
20
The Scaling Factors
54:02 • 2min
21
Is There a Solution Given by the Equations?
56:14 • 2min
22
How Long Does a Sequence Last?
58:15 • 2min
23
A Question From Brad Wyble
01:00:38 • 2min
24
A Fantastic Question From Brad Wyble
01:02:24 • 2min
25
Is There Replay in the Hippocampus?
01:04:17 • 4min
26
Complementary Learning Systems Theory and a Laplace Mechanism
01:07:51 • 3min
27
The Laplace Model of Memory
01:10:57 • 4min
28
Do You Need to Be Able to Change More Quickly?
01:15:11 • 2min
29
The Deep Network Is a Theory of the Brain
01:17:05 • 3min