Brain Inspired

BI NMA 05: NLP and Generative Models Panel

Aug 13, 2021
Chapters
1. Introduction (00:00 • 6min)
2. Is There a Time Scale of Attention? (05:53 • 2min)
3. Are You Using Deep Learning Begrudgingly? (07:36 • 6min)
4. Conversational Agents - What Are They? (13:34 • 3min)
5. How Do You Evaluate Your Embedding Tables? (17:01 • 2min)
6. Deep Learning - Is That the Story? (19:29 • 4min)
7. Is There a Barrier to Deep Learning? (23:17 • 6min)
8. Then I Couldn't Get the Algorithm Correct (29:05 • 2min)
9. How Variational Autoencoders Can Build Good Latent Spaces (31:15 • 3min)
10. Is There a Way to Generate a Red Two? (34:39 • 3min)
11. Is the Model Automatically Learning a Data-Driven Architecture? (37:19 • 3min)
12. Is That Genetically Preordained? (40:00 • 3min)
13. Working With Language, You Know? (43:29 • 4min)
14. What's the Right Level of Abstraction? (47:02 • 4min)
15. Can You Really Compute the Crate in the Whiteford? (51:29 • 3min)
16. The Conversational Aspect of Language (54:19 • 4min)
17. Is Language the Key to Vision? (58:25 • 6min)
18. Is Attention a Controlled Process? (01:03:56 • 2min)
19. Is That a Leap Through? (01:05:49 • 2min)
20. Attention Is a Necessity in Computer Science (01:07:50 • 4min)
21. Is There a Controllable Attention Model? (01:12:03 • 5min)
22. I'm Struggling to Convince Students to Do Something Besides NLP (01:16:48 • 2min)
23. Are We at the Beginning of Natural Language Understanding? (01:18:53 • 3min)
24. The Music You Hear Is by The New Year (01:22:10 • 2min)