
NEJM AI Grand Rounds From Hindsight Bias to Machine Bias: Dr. Laura Zwaan on Learning from Mistakes
Nov 19, 2025
Dr. Laura Zwaan, an associate professor at Erasmus University, specializes in cognitive psychology and diagnostic error research. She examines how both humans and machines inherit biases, stressing the need for transparency in AI. Zwaan explains how hindsight bias complicates the very definition of a diagnostic error and argues that improvement efforts should target systematic errors rather than individual blame. She warns that AI can replicate and amplify human biases, advocating for a deeper understanding of machine psychology to ensure safer clinical practice.
AI Snips
Errors Depend On Time And Hindsight
- Diagnostic error labels change with time and hindsight, so defining an error is rarely a simple right-or-wrong judgment.
- Hindsight bias inflates perceived errors when reviewers already know the outcome, skewing both measurement and interpretation.
Triangulate To Judge Diagnostic Quality
- Use multiple methods and triangulation to judge diagnostic errors rather than relying on single retrospective reviews.
- Present case descriptions without the ending and ask experts whether they would have managed the care differently, to reduce outcome bias.
Errors Are Systemic, Not Individual
- Medical errors usually emerge from teams and systems rather than from a single clinician's act.
- Focusing on blame misdirects improvement efforts; continuity and transfers of care deserve more attention.

