Tradeoffs

Lots of Hospitals Are Using AI. Few Are Testing For Bias

Feb 27, 2025
In this discussion, Paige Nong, an Assistant Professor at the University of Minnesota who studies AI's influence on healthcare, surveys the current landscape of AI use in hospitals. She highlights the concerning scarcity of bias testing for predictive algorithms, particularly those affecting marginalized patients. The conversation emphasizes the urgent need for consistent governance to ensure equitable treatment. Nong also addresses the challenges safety-net hospitals face and calls for robust evaluations of AI tools to improve patient experiences and support health equity.
INSIGHT

AI Governance Research

  • Paige Nong, inspired by a 2019 paper, studies AI governance in healthcare.
  • She aims to understand how health systems prevent the use of biased AI.
ANECDOTE

Biased Algorithm Example

  • A 2019 study revealed an algorithm that used healthcare costs, rather than diagnoses, as a proxy for patient needs.
  • This disadvantaged Black patients, who often incur lower healthcare costs because of disparities in access to care.
INSIGHT

Inconsistent AI Governance

  • Interviews with 13 academic medical centers revealed inconsistent AI governance.
  • Only four centers considered equity in their AI governance processes, raising bias concerns.