Dr. Steven Shorrock, a chartered psychologist and human factors specialist, discusses the intricate relationship between human behavior and safety in high-stakes environments. He emphasizes the shift from analyzing errors alone to adopting a balanced view of both successes and failures. The conversation dives into the concept of 'just culture' and the complexities of neurodiversity, advocating for an understanding of the strengths in conditions like ADHD. Shorrock also critiques conventional error analysis, proposing a new approach to understanding cognitive errors in critical decision-making contexts.
The language used in safety incident analyses profoundly affects organizational learning, and a focus on errors can obscure understanding and promote blame.
Dr. Steven Shorrock’s experiences as an autistic individual shape his insights on human behavior and give him a nuanced understanding of safety incidents.
Shifting from a taxonomy-based to a topology-based analysis encourages safety professionals to appreciate complex interactions in human systems and enhance risk mitigation strategies.
Deep dives
The Importance of Language in Safety Incident Analysis
The use of language significantly shapes how organizations analyze safety incidents and accidents. A common problem is deficit-based language that focuses predominantly on errors, which can obscure a comprehensive understanding of what happened. This fixation on what went wrong often leads to judgment and blame, detracting from effective learning. Moving towards a more neutral taxonomy reduces the negative labeling of actions and allows for constructive conversations about decisions and behaviors in context.
Understanding Human Behavior Through Personal Experience
Dr. Steven Shorrock's personal background as an autistic individual has profoundly shaped his understanding of human behavior and systems. His early life experiences fostered an intense interest in why people act as they do, leading him to a career focused on human factors in aviation safety. This perspective enables him to analyze safety incidents not merely as mistakes but as outcomes of complex interactions within human systems. Such insights demonstrate the importance of recognizing how individual backgrounds and experiences shape expertise in safety-critical fields.
Shifting Focus from Taxonomy to Topology in Safety Analysis
The discussion underscores a shift in the analysis of safety incidents from a taxonomy-based approach to a topology-based perspective. Traditional taxonomies often emphasize mechanical and error-based analyses, leading to a narrow focus on blame rather than understanding the system-wide factors that contribute to errors. By considering dispositional risk factors and the broader context, safety professionals can develop more robust insights into how to mitigate risks effectively. This approach encourages a nuanced analysis that takes into account the varying degrees of human decision-making under pressure.
Just Culture and Shared Accountability
Adopting a just culture within an organization involves balancing individual accountability with shared responsibility for safety outcomes. This means recognizing the complex dynamics at play in safety-critical settings rather than defaulting to blame in the face of adverse events. By fostering a culture of openness and learning, organizations can better address the systemic issues that contribute to incidents. Compassionate dialogue centered on understanding decisions and circumstances can lead to more effective safety improvements.
The Future of Safety Analysis: The Role of Creativity
Future developments in safety analysis are expected to be driven by creativity and innovative approaches. Embracing creativity in the analysis of human factors and safety incidents opens up new ways of understanding complex systems. Dr. Shorrock's exploration of diverse mediums, from making magazines to using beer mats to depict proxies for human work, exemplifies this shift toward engaging and inventive strategies. Creativity, in other words, is not only essential in artistic fields; it also plays a crucial role in enhancing safety practices and understanding human behavior in challenging environments.
On this episode of the Salience Podcast, we turn our attention to the relationship between human factors, systems, and the analysis of safety incidents and accidents. As you might imagine, a company with a name like Frontline Mind is intimately involved with frontline action. The agencies and people we specialize in working with operate in fast-paced, complex, and at times high-risk environments. Inevitably, there are near misses, incidents and accidents.
How we best learn from these is not straightforward. In fact, when we work with agencies for the first time, we find that most after-action reviews, operational debriefs and cold debriefs have made matters worse. Partly this is because of an unrealistic focus on events that emerge, and are only visible, in hindsight; partly it is because of a strong focus on errors and what went wrong. We can see and hear this bias in a deficit-based language. Now, I'm not a fan of an exclusive focus on what went well either. The danger of overdone positivity, of staying in happy, clappy land, is just as much of a concern as an obsession with what went wrong. So it's this use of language, and the way we can direct attention, that we are going to focus on today. And to help us unpack how we can learn from near misses, incidents and accidents without falling into a judgmental binary of good and bad,
we are joined by Dr. Steven Shorrock from Eurocontrol, where he works to support aviation throughout Europe with human factors, applied psychology, and systems thinking and practice. Steven is a chartered psychologist and human factors specialist. He is editor-in-chief of HindSight magazine and adjunct associate professor at the University of the Sunshine Coast's Centre for Human Factors and Sociotechnical Systems.