Safety: Stressors, Resilience & Drift

After reading Sidney Dekker’s book The Field Guide to Understanding ‘Human Error’, here are some of my emerging reflections and connections. As my thinking is still forming, this post and its series will likely change over the next few weeks.

A consistent theme throughout the series is the concept of information flow and the safety culture within organisations. Information flow not only affects physical safety in many industries; it is also an indicator of organisational effectiveness, regardless of industry. As Ron Westrum states, “By examining the culture of information flow, we can get an idea of how well people in the organization are cooperating, and also, how effective their work is likely to be in providing a safe operation.”

Book summary

Based on my notes from the book, here is a summary (aided by Google Gemini).

The “Field Guide” distinguishes between the “Old View” and the “New View” of human error. In the Old View, human errors are the primary cause of problems, and the focus is on controlling people’s behavior through attitudes, posters, and sanctions.

In the New View, human errors are seen as a symptom or consequence of deeper organizational issues, and the focus is on understanding why people behaved in a certain way and improving the conditions in which they work.

The Field Guide provides examples of practices that indicate an organization adheres to the Old View, such as investigations and disciplinary actions being handled by the same department or group, and behavior modification programs being implemented to address human error. These practices tend to suppress the symptoms of human error rather than addressing the root causes.

Using stressors to gain safety resilience

Here are my considerations relating Nassim Taleb’s thinking on stressors to Dekker’s work.

In his book Antifragile, Taleb writes:

The crux of complex systems, those with interacting parts, is that they convey information to these component parts through stressors, or thanks to these stressors: [you] get information about the environment not through your logical apparatus, your intelligence and ability to reason, but through stress.

Nassim Taleb, Antifragile

I believe Taleb’s statement relates to what Dekker writes about Erik Hollnagel’s concept of safety resilience:

[Resilience is] the ability of a system to adjust its functioning before, during or after changes and disturbances, so that it can sustain required operations under both expected and unexpected conditions. This needs to translate into using the variability and diversity of people’s behavior to be able to respond to a range of circumstances, both known and new.

Sidney Dekker, The Field Guide to Understanding ‘Human Error’

To me, incidents such as mistakes and mishaps are stressors which individuals should feel comfortable to share. A condition for sharing is that individuals know they can do so without fear of retribution or intolerance.

This sharing, and the subsequent adjustments to the environment, is ideally done in near real-time and in a manner satisfactory to those closest to the problem, rather than through formal procedures.

This ability to safely share stressors for continuous and appropriate improvement creates resilience. It creates resilience against shock events, since it flushes out and addresses weaknesses that are typically hidden in plain sight. It allows people, processes and services to continuously adapt in a contextual manner.

On the other hand, formal procedures often result in delay and are further hampered by the use of erroneous targets (e.g. targeting zero incidents). Those targets may satisfy the needs of a somewhat separate function, but they create perverse incentives for those making decisions on the frontline.

A typical consequence is frontline colleagues striving for flexible practices despite the purported support of formal procedures. Often frontline colleagues establish workarounds and ‘inside knowledge’ which become normalised.


The mismatch between formal procedures and practices grows over time. It’s an almost universal phenomenon which Dekker describes as Drift. For many on the frontline, it’s a necessary departure from the original ideal, on which counter-productive targets, expectations and officialdom are set. The over-design of procedures takes agency away from the very people who typically know best, and instead favours individuals who are rule-followers.

Consider your average corporation, government department or large nonprofit, and you will sadly find this corrosive phenomenon to be endemic. Often entire departments and functions exist to create and prop up counter-productive, ill-fitting procedures.

Mistakes and Mishaps

Hoffman’s Pyramid of Advantage and Catastrophe
Illustrated in the book Unlearn, by Barry O’Reilly

Above I use the words ‘mistake’ and ‘mishap’ with specific meaning. Ed Hoffman states that a ‘mistake’ is when something doesn’t go to plan; a ‘mishap’ occurs when the initiative or operation doesn’t fail, but an element of it is wrong.

‘Mistakes’ and ‘mishaps’ are stressors that indicate something is not performing as it should.

It should be down to frontline individuals to recognise, share and collectively address such incidents. It should be down to their managers to ensure these individuals feel safe to share such incidents, and to ensure that the learning is accounted for, regardless of how deeply it impacts the organisation.

This sharing and learning can be described as information flow through an organisation (more on this later). A clear upside is the organisation’s ability to adjust itself in a timely and necessary manner.

Practitioner’s insights

The success of open, safe and frank enquiry that leads to lasting change will depend on the organisation’s culture. So, when considering the likelihood of an organisation addressing issues and their underlying causes, practitioners should consider the wider cultural environment.

This experience is backed up by the work of Ron Westrum, which I introduce in the next article. There I provide practitioners with a practical recommendation for understanding the cultural environment and how it relates to the concept of information flow.

Connections with other concepts

Over the next few weeks I’ll be writing this series of posts, making further connections between aspects of The Field Guide and the following concepts:

  • Anti-Fragility, Drift and Resilience
  • ‘Human Error’ and Ron Westrum’s Information Flow
  • Normalisation of Deviance and the idea of Gradual Acceptance
  • Could Wardley mapping be used to map the unnoticed emergence of drift?
  • Drift and Cynefin’s zone of complacency
  • Roger Martin’s Strategic Choice Cascade
  • Hofstede’s cultural dimensions theory
  • Mary Uhl-Bien’s Leading with Complexity


Many thanks to Anne Gambles, Zsolt Berend, Nawel Lengliz and Carlo Volpi for giving feedback on this article.