

EP 192 David Krakauer on Science, Complexity and AI
01:35:32
Prediction Vs Understanding Bifurcation
- David Krakauer says science is bifurcating into prediction (large models) and understanding (coarse-grained theory).
- This split may produce two parallel sciences for the same topics.
AlphaFold's Practical Breakthrough
- Jim Rutt recounts AlphaFold's huge practical success solving protein folding.
- He notes it offered little theoretical insight initially.
Neural Nets Originated As Deductive Models
- David Krakauer explains early neural nets were born from deductive logic, not induction.
- McCulloch and Pitts framed them as propositional, epistemological systems.
Introduction
00:00 • 2min
The Bifurcation of Theory and Data
02:22 • 4min
The Origins of Neural Networks
06:21 • 3min
The History of Neural Nets
09:31 • 2min
The Importance of Regularities in Complex Domains
11:53 • 3min
The Uncanny Valley of Complexity
15:12 • 2min
The Pros and Cons of Adaptive Computation
17:18 • 2min
The Consilience of Traditional Linguistics and Language Understanding
19:45 • 2min
The Poverty of the Stimulus
21:33 • 4min
How to Use Deep Neural Networks for Parsimonious Science
25:38 • 3min
The Importance of Symbolic Regression in Astronomy
28:26 • 2min
The New Way of Doing Science
30:31 • 2min
The Importance of Knowledge in Model Building
32:12 • 2min
The Cognitive Synergy of Deep Learning
34:23 • 2min
The Evolution of Human Intelligence
36:37 • 3min
The Importance of Complexity in Games
39:18 • 2min
The Power of Neural Nets in Complexity Science
41:35 • 2min
The History of Science and Technology
43:47 • 3min
The Importance of Constructs in Economics
46:45 • 2min
The Complexity of Adaptive History
48:36 • 2min
The Importance of Structure in Neural Nets
51:05 • 3min
The Importance of Creativity in the Movie Writing Process
54:26 • 5min
The Importance of Bandwidth Limitations in Scientific Revolutions
59:23 • 2min
The Pollyannas of Machine Learning
01:00:56 • 2min
The Importance of Intelligence
01:02:59 • 4min
Darwin's Dangerous Idea
01:07:15 • 2min
The Paradox of Meta-Occam
01:09:16 • 3min
The Paradox of Evolutionary Time
01:12:40 • 2min
Complexity Science and Physical Theory Substitute Parsimony
01:14:35 • 2min
The Existential Risk of Models
01:16:08 • 4min
The Risks of Narrow AI
01:19:47 • 2min
The History of Genetic Engineering
01:22:16 • 2min
The Future of Technology
01:24:01 • 6min
The Future of Social Media
01:29:40 • 2min
The Problem With Outsourcing Human Judgment
01:32:04 • 3min
Jim has a wide-ranging talk with David Krakauer about the ideas in his forthcoming paper "The Structure of Complexity in Machine Learning Science" and how AI may alter the course of science. They discuss data-driven science vs theory-driven science, a bifurcation in science, the protein folding problem, brute force methods, the origin of induction in David Hume, the origin of neural networks in deductive thinking of the '40s, super-Humean models, crossing the statistical uncanny valley, ultra-high-dimensionality, adaptive computation, why genetic algorithms might come back, Chomsky's poverty of the stimulus, the lottery ticket hypothesis, neural nets as pre-processors for parsimonious science, how human expertise constrains model-building, GPT-4's arithmetic problem, cognitive synergy, why LLMs aren't AGIs, incompressible representations, gravitational lensing, the new sciences LLMs will lead to, encoding adaptive history, Jim's ScriptWriter software, discovery engines vs libraries vs synthesizers, the history of science as a history of constraint, Occam's razor & meta-Occam, assembly theory, whether existential risk is a marketing ploy, the Idiocracy risk, using empirical precedent in tech regulation, networks of info agents, the outsourcing of human judgment, and much more.
Episode Transcript
JRS EP10 - David Krakauer: Complexity Science
Darwin's Dangerous Idea: Evolution and the Meanings of Life, by Daniel Dennett
JRS Currents 100: Sara Walker and Lee Cronin on Time as an Object
David Krakauer's research explores the evolution of intelligence and stupidity on Earth. This includes studying the evolution of genetic, neural, linguistic, social, and cultural mechanisms supporting memory and information processing, and exploring their shared properties. President of the Santa Fe Institute since 2015, he served previously as the founding director of the Wisconsin Institutes for Discovery, the co-director of the Center for Complexity and Collective Computation, and professor of mathematical genetics, all at the University of Wisconsin, Madison.