
COMPLEXITY

Cris Moore on Algorithmic Justice & The Physics of Inference

Jan 15, 2021
01:11:40

It’s tempting to believe that people can outsource decisions to machines — that algorithms are objective, and it’s easier and fairer to dump the burden on them. But convenience conceals the complicated truth: when lives are made or broken by AI, we need transparency about the way we ask computers questions, and we need to understand what kinds of problems they’re not suited for. Sometimes we may be using the wrong models, and sometimes even great models fail when fed sparse or noisy data. Applying physics insights to the practical concerns of what an algorithm can and cannot do, scientists find points at which questions suddenly become unanswerable. Even with access to great data, not everything’s an optimization problem: there may be more than one right answer. Ultimately, it is crucial that we understand the limits of the technology we leverage to help us navigate our complex world — and the values that (often invisibly) determine how we use it.
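One concrete case where a question "suddenly becomes unanswerable" is community detection in sparse networks, the subject of Moore's review linked in the related reading below: beneath a critical signal-to-noise level, planted groups exist in the model that generated the data, yet no algorithm can recover them better than chance. The toy sketch below is not from the episode; the network sizes, edge probabilities, and the naive spectral method are all illustrative assumptions. It simulates a two-group stochastic block model and shows recovery accuracy falling toward coin-flipping (overlap 0.5) as between-group edges become as likely as within-group edges.

```python
# Minimal sketch of the detectability transition in community detection.
# All parameters are illustrative assumptions, not values from the episode.
import numpy as np

rng = np.random.default_rng(0)

def planted_partition(n, p_in, p_out):
    """Two-group stochastic block model: return (adjacency matrix, true labels)."""
    labels = np.repeat([0, 1], n // 2)
    prob = np.where(labels[:, None] == labels[None, :], p_in, p_out)
    upper = np.triu(rng.random((n, n)) < prob, k=1)   # sample each edge once
    adj = (upper | upper.T).astype(float)
    return adj, labels

def spectral_guess(adj):
    """Naive spectral clustering: split on the sign of the second-largest eigenvector."""
    _, vecs = np.linalg.eigh(adj)          # eigenvalues returned in ascending order
    return (vecs[:, -2] > 0).astype(int)

def overlap(guess, truth):
    """Agreement with the planted labels up to relabeling; 0.5 means pure chance."""
    acc = np.mean(guess == truth)
    return max(acc, 1 - acc)

n, avg_deg = 2000, 8                        # illustrative sizes
for ratio in (0.1, 0.3, 0.5, 0.7, 0.9):     # p_out / p_in: small = strong community structure
    p_in = 2 * avg_deg / (n * (1 + ratio))  # keep the average degree fixed
    p_out = ratio * p_in
    adj, truth = planted_partition(n, p_in, p_out)
    print(f"p_out/p_in = {ratio:.1f}   overlap = {overlap(spectral_guess(adj), truth):.2f}")
```

For two equal-sized groups with average degree c, the transition sits at c_in - c_out = 2*sqrt(c), a threshold identified by Decelle, Krzakala, Moore, and Zdeborová. More refined methods than this sketch (belief propagation, non-backtracking spectra) get closer to that threshold in sparse graphs, but none can cross it: past that point the "right answer" is simply not in the data.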

Welcome to COMPLEXITY, the official podcast of the Santa Fe Institute. I’m your host, Michael Garfield, and every other week we’ll bring you with us for far-ranging conversations with our worldwide network of rigorous researchers developing new frameworks to explain the deepest mysteries of the universe.

We kick off 2021 with SFI Resident Professor Cristopher Moore, who has written over 150 papers at the boundary between physics and computer science, to talk about his work in the physics of inference and with The Algorithmic Justice Project.

If you value our research and communication efforts, please consider making a donation at santafe.edu/give — and/or rating and reviewing us at Apple Podcasts. You can find numerous other ways to engage with us at santafe.edu/engage. Thank you for listening!

Join our Facebook discussion group to meet like minds and talk about each episode.

Podcast theme music by Mitch Mignano.

Follow us on social media:
Twitter • YouTube • Facebook • Instagram • LinkedIn

Related Reading:

Cris Moore’s Google Scholar Page

The Algorithmic Justice Project

“The Computer Science and Physics of Community Detection: Landscapes, Phase Transitions, and Hardness”

The Ethical Algorithm by SFI External Professor Michael Kearns

“Prevalence-induced concept change in human judgment” co-authored by SFI External Professor Thalia Wheatley

“The Uncertainty Principle” with SFI Miller Scholar John Kaag

SFI External Professor Andreas Wagner on play as a form of noise generation that can knock an inference algorithm off false endpoints/local optima

Related Videos:

Cris Moore’s ICTS Turing Talks on “Complexities, phase transitions, and inference”

Fairness, Accountability, and Transparency: Lessons from predictive models in criminal justice

Reckoning and Judgment: The Promise of AI

Easy, Hard, and Impossible Problems: The Limits of Computation. Ulam Memorial Lecture #1.

Data, Algorithms, Justice, and Fairness. Ulam Memorial Lecture #2.

Related Podcasts:

Fighting Hate Speech with AI & Social Science (with Joshua Garland, Mirta Galesic, and Keyan Ghazi-Zahedi)

Better Scientific Modeling for Ecological & Social Justice with David Krakauer (Transmission Series Ep. 7)

Embracing Complexity for Systemic Interventions with David Krakauer (Transmission Series Ep. 5)

Rajiv Sethi on Stereotypes, Crime, and The Pursuit of Justice
