Complexity scientist David Wolpert joins me to consider the idea of meaning at its most fundamental level. Historically, information theory has let us quantify information (in bits, for example) but says nothing about whether that information is useful, significant, relevant, or meaningful. Recently, however, Wolpert and colleagues have filled in what's missing from that account, offering a theory of "semantic" or "meaningful" information: they show how some information has genuine causal power to influence the well-being and viability of systems in their contexts. Here we explore this idea and a number of its implications for what counts as "meaningful" across the complexity stack, from a whirlpool to a bacterium all the way up to us.
0:00 Introduction
0:46 Meaning and Semantic Information
2:17 Background Context: Information Theory, Utility Functions, and Statistical Thermodynamics
14:03 Meaning FOR a System: What Information Helps One Stay Far from Equilibrium
21:54 Meaning: Mutual Information with Causal Power for Viability
27:57 Meaning and Measurement up the Complexity Stack
33:42 Indirect Meaning, Chains of Significance, and Intelligence
37:20 A Semantic Information Theory of Individuality?
42:03 Relative vs. Absolute Semantic Information Metrics
49:52 The Complexification of Meaningful Information through Evolutionary Transitions
52:30 Layered Meaning through Evolution