Chapters
Introduction
00:00 • 2min
Eliezer Yudkowsky - I'm a Decision Theorist
01:56 • 2min
The Bull's Eye for Artificial Intelligence
03:43 • 4min
Is There a Difference Between Humans and Mice in General Optimization?
07:27 • 3min
The Frontiers of Strangeness in Artificial Intelligence
10:07 • 3min
AlphaZero's Go Engine Is Better Than Any Dedicated Chess Engine Ever
13:05 • 2min
Is There a Human-Level AI?
14:54 • 4min
Is There a Continuum of Intelligence?
18:37 • 3min
The Alignment Problem
22:02 • 2min
The Problem Isn't the Paper Clip Factory
23:33 • 3min
The Alignment Problem - You Can't Bring the Coffee if You're Dead
26:31 • 3min
The Paper Clip Maximizer
29:12 • 4min
The Orthogonality Thesis
33:27 • 4min
The Paper Clip Maximizer Is Not Orthogonal
37:10 • 2min
Is There a Relationship Between Intelligence and Values?
38:59 • 3min
Is the Paper Clip Maximizer Too Special?
42:23 • 3min
I've Been Describing It as a Navigation Problem
44:54 • 2min
The Alignment Problem
46:24 • 4min
The AI in a Box Thought Experiment
50:23 • 5min