This episode examines Charles B. Perrow’s argument that catastrophic accidents are inevitable in complex systems. Key points include how failures interact in unpredictable ways, Perrow’s bias against nuclear power, and the importance of operator response in disasters. Perrow’s theory predicts that multiple failures will combine into a ‘perfect storm’ that is hard to prevent, highlighting the limits of better technology in averting major accidents.
Catastrophic accidents are inevitable in complex systems due to unforeseen failures and interactions.
Minor failures in complex systems can cascade and lead to major accidents through unforeseen interactions.
Organizations in high-risk industries face challenges in balancing central control and operator independence.
Preventing accidents in complex systems requires addressing systemic interactions beyond relying solely on technological advancements.
Deep dives
Operator Error and Unforeseen Situations
In complex systems, accidents often stem from unexpected situations in which operators take actions that are reasonable given the information available to them, yet still contribute to the accident. Perrow criticizes attributing accidents solely to operator error and highlights the importance of understanding how individual failures interact in such systems.
Cascade Effect in Complex Systems
Perrow emphasizes how minor failures in complex systems can combine in unforeseen ways, leading to accidents. He introduces the concept of a cascade effect, where seemingly trivial mishaps or failures interact and escalate, resulting in major incidents.
Organizational Control and Deception
Perrow discusses the challenges of organizational control in high-risk industries, emphasizing the dual dilemma organizations face in trying to centralize control while requiring independent and creative actions from operators. He critiques deceptive practices and negligence within organizations and highlights the risks posed by centralizing decision-making in tightly coupled systems.
Limitation of Better Technology as a Solution
Perrow argues that better technology alone cannot prevent accidents in complex systems, debunking the belief that technological advancements can mitigate all risks. He emphasizes the need to address systemic interactions rather than rely solely on technological enhancements.
Focus on Power Dynamics Over Risk Assessment
Perrow shifts the focus from risk assessment to power dynamics in managing high-risk technologies, cautioning against allowing risk professionals to dictate safety strategies. He highlights the social and cultural influences on technology adoption and advocates for a balanced approach that integrates social values and cultural considerations, rather than solely relying on risk assessments for decision-making.
Charles Perrow's Analysis of High-Risk Systems
Charles Perrow discusses the risks associated with high-risk systems, highlighting the potential dangers of relying on risk assessors and the exclusion of social and cultural values in decision-making. He argues that the complexities of modern technologies, like nuclear power, pose significant risks despite varying opinions on the severity of potential accidents.
Practical Takeaways on Safety and Complexity
Perrow suggests that the trade-off between safety controls and system complexity is crucial in managing risks. He emphasizes the importance of considering simplicity in systems to reduce uncertainty and the impact of increasing complexity. Furthermore, he challenges the notion of removing operators from high-risk systems and advocates for understanding power dynamics and hierarchy in safety decisions.
The book explains Perrow’s theory that catastrophic accidents are inevitable in tightly coupled, complex systems. His theory holds that failures will combine in multiple, unforeseen ways that are virtually impossible to anticipate.
Charles B. Perrow (1925 – 2019) was an emeritus professor of sociology at Yale University and visiting professor at Stanford University. He authored several books and many articles on organizations and their impact on society. One of his most cited works is Complex Organizations: A Critical Essay, first published in 1972.
Discussion Points:
David and Drew reminisce about the podcast and achieving 100 episodes
Outsiders from sociology, management, and engineering entered the field in the 70s and 80s
Perrow was not a safety scientist, as he positioned himself against the academic establishment
Perrow’s strong bias against nuclear power weakens his writing
The 1979 near-disaster at Three Mile Island - Perrow was asked to write a report, which became the book “Normal Accidents…”
The main tenets of Perrow’s core arguments:
Start with a ‘complex high-risk technology’ - aircraft, nuclear, etc.
Two or more failures combine to start the accident
“Interactive Complexity”
Boeing 787 failures - a failed system plus an unexpected operator response leading to disaster
There will always be separate individual failures, but can we predict or prevent the ‘perfect storm’ of multiple failures at once?
Better technology is not the answer
Perrow predicted complex high-risk technology to be a major part of future accidents
Perrow believed nuclear power/nuclear weapons should be abandoned - risks outweigh benefits
Reasons people may see Perrow’s theories as wrong:
If you believe the risk assessments of nuclear power are correct, then his theories are wrong
If they are contrary to public opinion and values
If safety requires safer and more error-free organizations
If there is a safer way to run the systems outside all of the above
The modern takeaway is a tradeoff between adding more safety controls and increasing system complexity
The hierarchy of designers vs operators
We don’t think nearly enough about the role of power: who decides vs. who actually takes the risks?
There should be incentives to reduce the complexity of systems and the uncertainty it creates
To answer this show’s question - not entirely, and we are constantly asking why
Quotes:
“Perrow definitely wouldn’t consider himself a safety scientist, because he deliberately positioned himself against the academic establishment in safety.” - Drew
“For an author whom I agree with an awful lot about, I absolutely HATE the way all of his writing is colored by…a bias against nuclear power.” - Drew
“[Perrow] has got a real skepticism of technological power.” - Drew
"Small failures abound in big systems.” - David
“So technology is both potentially a risk control, and a hazard itself, in [Perrow’s] simple language.” - David