Safety and Information Flow

After reading Sidney Dekker’s book The Field Guide to Understanding ‘Human Error’, here are some of my emerging reflections and connections. As I am still forming my thinking, I’ll be developing this series over the next few weeks.

A consistent theme throughout the series is the concept of information flow and the safety culture within organisations. Information flow not only impacts physical safety in many industries; it is also an indicator of organisational effectiveness regardless of industry. As Ron Westrum states, “By examining the culture of information flow, we can get an idea of how well people in the organization are cooperating, and also, how effective their work is likely to be in providing a safe operation.”

Book summary

Based on my notes from the book, here is a summary (aided by Google Gemini).

The “Field Guide” distinguishes between the “Old View” and the “New View” of human error. In the Old View, human errors are the primary cause of problems, and the focus is on controlling people’s behavior through attitudes, posters, and sanctions.

In the New View, human errors are seen as a symptom or consequence of deeper organizational issues, and the focus is on understanding why people behaved in a certain way and improving the conditions in which they work.

The Field Guide provides examples of practices that indicate an organization adheres to the Old View, such as investigations and disciplinary actions being handled by the same department or group, and behavior modification programs being implemented to address human error. These practices tend to suppress the symptoms of human error rather than addressing the root causes.

Information Flow

In this article of the ‘human error’ series, I relate Ron Westrum’s concept of information flow to ‘human error’.

In Ron Westrum’s paper, “The study of information flow”, he writes:

The important features of good information flow are relevance, timeliness, and clarity. Generative environments are more likely to provide information with these characteristics, since they encourage a “level playing field” and respect for the needs of the information recipient. By contrast, pathological environments, caused by a leader’s desire to see him/herself succeed, often create a “political” environment for information that interferes with good flow.

Westrum, The study of information flow, Safety Science 67 (2014) 58–63

As Westrum describes in his paper, generative environments are those that focus on the mission, where everything is subordinated to performance, aided by a free flow of information. Pathological environments are characterised by fear and threat, where individuals hoard information for political reasons.

Between those environments, he identified bureaucratic organisations which tend to retain information within departments for protection.

These cultural styles determine how organisations are likely to respond to ‘human errors’, such as mishaps and mistakes, and to unwelcome surprises such as project-halting issues. In his paper, Westrum sets out ways in which organisations may respond to anomalous information:

  1. the organization might “shoot the messenger.”
  2. even if the “messenger” was not executed, his or her information might be isolated.
  3. even if the message got out, it could still be “put in context” through a “public relations” strategy.
  4. maybe more serious action could be forestalled if one only fixed the immediately presenting event.
  5. the organization might react through a “global fix” which would also look for other examples of the same thing.
  6. the organization might engage in profound inquiry, to fix not only the presenting event, but also its underlying causes.

High-performance organisations achieve success with global fixes and profound inquiry, where the ‘bearer of bad news’ is welcomed and supported by their leaders and peers.
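As a practitioner aid, Westrum’s six responses can be treated as an ordinal scale for tagging how an organisation responded to each incident review. The sketch below is a hypothetical illustration of this idea (the class, function and sample data are my own, not from Westrum’s paper):

```python
from collections import Counter
from enum import IntEnum

class AnomalyResponse(IntEnum):
    """Westrum's six responses to anomalous information,
    ordered from most pathological (1) to most generative (6)."""
    SHOOT_THE_MESSENGER = 1
    ISOLATE_THE_MESSAGE = 2
    PUBLIC_RELATIONS = 3
    LOCAL_FIX = 4    # fix only the immediately presenting event
    GLOBAL_FIX = 5   # also look for other examples of the same thing
    INQUIRY = 6      # profound inquiry into underlying causes

def generative_ratio(tagged_reviews):
    """Fraction of incident reviews that reached a global fix or deeper."""
    reaching = sum(1 for r in tagged_reviews if r >= AnomalyResponse.GLOBAL_FIX)
    return reaching / len(tagged_reviews)

# Hypothetical tags for a quarter's incident reviews.
reviews = [
    AnomalyResponse.LOCAL_FIX,
    AnomalyResponse.GLOBAL_FIX,
    AnomalyResponse.PUBLIC_RELATIONS,
    AnomalyResponse.INQUIRY,
]
print(Counter(r.name for r in reviews))
print(f"Reached global fix or inquiry: {generative_ratio(reviews):.0%}")
```

Tracked over time, a rising share of global fixes and inquiries would suggest movement towards the generative end of the continuum.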

Practitioner’s insights

There are ways to reasonably approximate where an organisation sits along Westrum’s continuum. Its position along the continuum will likely reveal why some intractable problems are nigh on impossible to address.

In my own work, I’ve helped a financial services organisation monitor its position by conducting regular surveys. For the department in focus, individuals at all levels were asked to anonymously rate each of the following statements:

  • Cross-functional collaboration is encouraged and rewarded
  • Failures are treated primarily as opportunities to improve the system
  • Information is actively sought
  • Messengers are not punished when they deliver news of failures or other bad news
  • New ideas are welcomed
  • Responsibilities are shared

This particular set of statements is taken from the DORA research.

To understand the state of information flow within an organisation, practitioners should consider employing the same approach.
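The survey aggregation can be sketched in a few lines. This is a minimal illustration, assuming a 1–7 Likert scale as used in the DORA research; the respondents and ratings below are hypothetical:

```python
from statistics import mean

# The six DORA-style statements rated anonymously by respondents.
STATEMENTS = [
    "Cross-functional collaboration is encouraged and rewarded",
    "Failures are treated primarily as opportunities to improve the system",
    "Information is actively sought",
    "Messengers are not punished when they deliver news of failures or other bad news",
    "New ideas are welcomed",
    "Responsibilities are shared",
]

# Each respondent rates every statement from 1 (strongly disagree)
# to 7 (strongly agree). Sample data is hypothetical.
responses = [
    [6, 5, 6, 7, 5, 6],  # respondent A
    [4, 3, 5, 4, 4, 3],  # respondent B
    [7, 6, 6, 6, 5, 5],  # respondent C
]

def westrum_scores(responses):
    """Mean score per statement, plus the overall mean across statements."""
    per_statement = [round(mean(col), 2) for col in zip(*responses)]
    overall = round(mean(per_statement), 2)
    return per_statement, overall

per_statement, overall = westrum_scores(responses)
for statement, score in zip(STATEMENTS, per_statement):
    print(f"{score:>5}  {statement}")
print(f"Overall: {overall}")
```

Comparing the per-statement means across survey rounds shows which aspects of information flow are improving and which are stagnating.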

Connections with other concepts

Over the next few weeks I’ll be adding to this series of posts, making further connections between aspects of The Field Guide and the following concepts:

Safety: Stressors, Resilience & Drift

Using stressors to gain safety resilience

Here are my considerations relating Nassim Taleb’s thinking on stressors to Dekker’s work.

In his book Antifragile, Taleb writes:

The crux of complex systems, those with interacting parts, is that they convey information to these component parts through stressors, or thanks to these stressors: [you] get information about the environment not through your logical apparatus, your intelligence and ability to reason, but through stress.

Nassim Taleb, Antifragile

I believe Taleb’s statement relates to what Dekker writes about Erik Hollnagel’s concept of safety resilience:

[Resilience is] the ability of a system to adjust its functioning before, during or after changes and disturbances, so that it can sustain required operations under both expected and unexpected conditions. This needs to translate into using the variability and diversity of people’s behavior to be able to respond to a range of circumstances, both known and new.

Sidney Dekker, The Field Guide to Understanding ‘Human Error’

To me, incidents such as mistakes and mishaps are stressors which individuals should feel comfortable sharing. A condition for sharing is that individuals know they can do so without fear of retribution or intolerance.

This sharing, and the subsequent adjustments to their environment, is ideally done in near real-time and in a manner satisfactory to those closest to the problem, rather than through formal procedures.

This ability to safely share stressors for continuous and appropriate improvement creates resilience. It creates resilience against shock events, since it flushes out and addresses weaknesses that are typically hidden in plain sight. It allows people, processes and services to continuously adapt in a contextual manner.

On the other hand, formal procedures often result in delay and are further hampered by the use of erroneous targets (e.g. targeting zero incidents). Those targets may satisfy the needs of a somewhat separate function, but they create perverse incentives for those making decisions on the frontline.

A typical consequence is frontline colleagues striving for flexible practices despite the purported support of formal procedures. Often frontline colleagues establish workarounds and ‘inside knowledge’ which become normalised. Sentiment often extends to cynicism about the internal ‘powers that be’ and a collective feeling of ‘us-vs-them’.

Drift

The mismatch between formal procedures and actual practice grows over time. It’s an almost universal phenomenon which Dekker describes as drift. For many on the frontline, it’s a necessary departure from the original ideal, on which counter-productive targets, expectations and officialdom are set. The over-design of procedures takes agency away from the very people who typically know best and instead favours individuals who are rule-followers.

Consider your average corporation, government department or large nonprofit, and you will sadly find this corrosive phenomenon to be endemic. Often entire departments and functions exist to create and prop up counter-productive, ill-fitting procedures.

Mistakes and Mishaps

Hoffman’s Pyramid of Advantage and Catastrophe, illustrated in the book Unlearn by Barry O’Reilly

Above I use the words ‘mistake’ and ‘mishap’ with specific meaning. Ed Hoffman states that a ‘mistake’ is when something doesn’t go to plan; a ‘mishap’ occurs when the initiative or operation doesn’t fail, but an element of it is wrong.

‘Mistakes’ and ‘mishaps’ are stressors that indicate something is not performing as it should.

It should be down to the frontline individuals to recognise, share and collectively address such incidents. It should be down to their managers to ensure these individuals feel safe to share such incidents, and ensure that the learning is accounted for, regardless of how deep it impacts the organisation.

This sharing and learning can be described as information flow through an organisation (more on this later in the series). A clear upside is the organisation’s ability to adjust itself in a timely and necessary manner.

Practitioner’s insights

The success of open, safe and frank enquiry which leads to lasting change will depend on the organisation’s culture. So, when considering the likelihood of an organisation addressing issues and their underlying causes, practitioners should consider the wider cultural environment.

This experience is backed up by the work of Ron Westrum, which I introduce in the next article. In that article I provide practitioners with a practical recommendation to understand the cultural environment and how it relates to the concept of information flow.

Connections with other concepts

Over the next few weeks I’ll be adding to this series of posts, making further connections between aspects of The Field Guide and the following concepts:

Thanks

Many thanks to Anne Gambles, Zsolt Berend, Nawel Lengliz and Carlo Volpi for giving feedback on this article.