The podcast explores the use of AI tools in medical rationing and how they obscure moral and ideological actions. It discusses the relationship between AI, healthcare, and settler colonialism, focusing on the genocide in Palestine. The speakers analyze UnitedHealth's algorithmic denial of care and the limitations of algorithms in predicting recovery time. They also discuss the subjectivity of reality, the rise of value-based care, and blame avoidance as a mechanism in the political economy. The episode ends with a discussion of ways to support the show and the hosts' books.
Automated AI systems like Gospel, used by the Israeli army, reveal the danger of delegating moral responsibility to machines and the resulting displacement of accountability.
UnitedHealth's nH Predict algorithm in Medicare Advantage plans prioritizes financial goals over quality care, raising concerns about profit-driven decision making in the healthcare industry.
Algorithmic decision-making processes in healthcare reflect existing biases and inequalities, highlighting the need to address the underlying social and economic factors that shape these systems.
Deep dives
The Role of AI in Gaza Bombings
The Israeli army developed an AI system called Gospel to accelerate the creation of targets for strikes in Gaza. The system analyzes drone footage, intercepted communications, and surveillance data to identify targets, generating around 100 targets per day during the latest escalation of the conflict, compared to roughly 50 targets per year previously. The army claims the system prioritizes attacks on infrastructure associated with Hamas in order to minimize harm to non-combatants. But this framing reveals the danger of delegating moral responsibility to automated systems and the potential for displacing accountability onto them.
Algorithmic Decision Making in Medicare Advantage
UnitedHealth, a private insurer, employed algorithms to manage Medicare Advantage plans and reduce nursing home care costs. One such algorithm, nH Predict, was used to keep patients' rehab stays within 1% of the algorithm's projected length of stay. Employees were penalized for failing to meet the algorithm's target, regardless of whether additional days were justified under Medicare coverage rules. This algorithmic approach raised concerns about the quality of patient care, and the case highlights a larger pattern of profit-driven decision making in the healthcare industry, where optimizing financial goals takes precedence over ensuring quality care.
Technical Fixes Do Not Address Core Issues
Debates around algorithmic systems often focus on technical fixes, such as auditing algorithms or requiring human oversight. But the core problem lies in the social and economic forces that shape how these systems are designed and used. Algorithms like naviHealth's nH Predict are rooted in the profit-driven privatization of Medicare: the goals of private insurance companies revolve around managing risk and preventing "waste" rather than providing comprehensive healthcare. Attempts to adjust algorithms or regulations without challenging the underlying profit-driven healthcare system offer only surface-level solutions.
The Fallacy of Algorithmic Neutrality
Algorithmic decision-making processes are often presented as neutral tools that optimize outcomes. But algorithms are not inherently unbiased or autonomous: they are products of human design and reflect the values, biases, and contextual factors embedded in their creation. Algorithms used in healthcare and other sectors reproduce inequalities and social conditions because they learn from data that reflects existing biases and inequalities. It is crucial to recognize that algorithms have limitations and that responsibility should not be displaced onto them. The focus should instead be on addressing the underlying social and economic factors that shape algorithmic systems.
The Illusion of Choice and the Myth of Rationing in Algorithmic Care Denials
The podcast explores how algorithms in healthcare perpetuate the illusion of choice and the myth of rationing in care denials. The discussion highlights the use of algorithms in determining access to healthcare and the potential biases and limitations inherent in these algorithms. It also delves into the historical context of rationing, drawing parallels between past practices and contemporary algorithmic decision-making. The podcast raises concerns about the offloading of moral responsibility onto algorithms and the implications for healthcare inequality.
The Intersection of Healthcare and Statistics
The podcast examines the connection between healthcare decision-making and statistical tools. It explores how statistics, stemming from Malthusian and eugenicist ideologies, influence the design and implementation of algorithms in healthcare. The discussion emphasizes that these statistical tools form the internal architecture of algorithms, despite the perception that algorithms are cutting-edge technologies. The podcast also advocates for transparency in algorithmic healthcare systems, highlighting the need to understand the inputs, training, and internal structures of these algorithms.
Beatrice, Abby, and Phil discuss how A.I. tools are used to lend a veneer of objectivity to medical rationing, and how tools like “Gospel” A.I. and UnitedHealth’s “nH Predict” are used to obscure inherently moral, political, and ideological actions.
Transcript forthcoming.
Find our book Health Communism here: www.versobooks.com/books/4081-health-communism
Pre-order Jules' new book here:
https://www.penguinrandomhouse.com/books/733966/a-short-history-of-trans-misogyny-by-jules-gill-peterson/
Death Panel merch here (patrons get a discount code): www.deathpanel.net/merch
As always, support Death Panel at www.patreon.com/deathpanelpod
Referenced in this episode:
Osama Tanous, "You, as of Now, Are Someone Else!" — https://www.palestine-studies.org/en/node/1653909
"Who Shall Live?" (1965) — https://youtu.be/FMay5zw1loA?si=U49FoeGxe_CmIbrp
LIFE Magazine article — https://bit.ly/48dDGte