The Thesis Review

[27] Danqi Chen - Neural Reading Comprehension and Beyond

Jul 2, 2021
Chapters
1. Introduction (00:00 • 3min)
2. The Future of Machine Learning (03:24 • 2min)
3. The Benefits of Learning From Past Experiences (05:10 • 2min)
4. How I Learned to Code (06:42 • 2min)
5. How I Decided to Go to Graduate School (08:22 • 2min)
6. The Evolution of Parsing (09:53 • 2min)
7. The Importance of Dependency Parsing (12:02 • 2min)
8. The Importance of Reading Comprehension (13:43 • 2min)
9. Neural Reading Comprehension and Beyond (15:16 • 2min)
10. The Future of Machine Comprehension (17:13 • 1min)
11. MCTest: A Full Package (18:40 • 1min)
12. The Evolution of MCTest (20:09 • 2min)
13. The Challenge of Defining the Right Problem (22:13 • 2min)
14. DrQA: Building a System That Can Answer Open-Domain Questions (24:22 • 2min)
15. The Stanford Attentive Reader: A Novel Approach to Reading Comprehension (25:58 • 2min)
16. The Interaction Between Memory Networks and Neural Reading Comprehension Models (27:43 • 2min)
17. The Importance of Reading Comprehension Models (29:52 • 2min)
18. The Benefits of Pre-trained Language Models for Question Answering (32:05 • 2min)
19. The Benefits of Retrieval-Based Models for Question Answering (34:00 • 3min)
20. The Benefits of Generic Datasets for Reasoning (36:35 • 2min)
21. The Challenges of Generalization in Large-Scale Neural Sequence Models (38:20 • 2min)
22. Conversational Question Answering, or CoQA (39:52 • 2min)
23. The Importance of Question Answering in Dialogue Systems (41:34 • 2min)
24. The Importance of Efficiency in NLP Systems (43:21 • 3min)
25. How to Build Stronger Natural Language Understanding Systems (46:12 • 2min)
26. How to Pick New Problems for Research (47:56 • 2min)
27. How to Optimize for Career Aspects in Your PhD (50:08 • 2min)
28. How to Slow Down and Carve Out More of Your Own Path (52:03 • 4min)