  • Error isn’t a problem – the problem is the word ‘error’: A blog by Norman MacLeod



    Summary

    It has become fashionable to purge the term ‘error’ from the safety narrative. Instead, we would rather talk about the ‘stuff that goes right’. Unfortunately, this view overlooks the fact that we depend on errors to get things right in the first place. We need to distinguish between an error as an outcome and error as feedback, writes Norman MacLeod in this blog for the hub.

    Content

    In an increasingly litigious world, intolerant of failure, error has become inextricably linked with fault and blame. Here, error is considered in hindsight by agents in positions of power or with specific agendas. Something happened and someone must pay. Clichés such as ‘error is natural’ or ‘no one intends to make a mistake’ carry little weight. Unfortunately, this interpretation of ‘error’ feeds into debates in the safety domain, but simply rejecting the term misses the point.

    To understand the importance of error we need to reflect on the nature of the world. Imagine a small pile of sand on a table. As you add more grains of sand, the cone will build, maintaining its shape until, eventually, the next single grain will trigger a cascade. Sand will slip down the side of the cone until a new shape is stabilised. And so the process goes on. The cone is stable under most circumstances but just a single grain of sand can trigger a transition to a new stable state. The world, then, exists in a state of self-organised criticality. This is important. If the world were too stable, it would not be able to respond to change. Instability, then, is an adaptive property. It also means that work must contend with this inherent instability.
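    The sand-pile thought experiment has a well-known formal counterpart: the Bak–Tang–Wiesenfeld sandpile model. The sketch below is my own illustration (the grid size and threshold are arbitrary choices, not from the blog); it shows the key behaviour the text describes: most grains change nothing, while an occasional single grain triggers a large cascade.

```python
# A minimal sketch (my own illustration, not from the blog) of the
# sand-pile thought experiment, using the Bak-Tang-Wiesenfeld model
# of self-organised criticality. Grains are dropped one at a time;
# when a cell holds too many it "topples", shedding grains to its
# neighbours and sometimes triggering a cascade of further topples.

SIZE = 11          # side length of the table (grid) - arbitrary choice
THRESHOLD = 4      # grains a cell can hold before it topples

def drop_grain(grid):
    """Add one grain at the centre; return how many topples it caused."""
    grid[SIZE // 2][SIZE // 2] += 1
    topples = 0
    unstable = True
    while unstable:
        unstable = False
        for r in range(SIZE):
            for c in range(SIZE):
                if grid[r][c] >= THRESHOLD:
                    grid[r][c] -= THRESHOLD
                    topples += 1
                    unstable = True
                    for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                        if 0 <= nr < SIZE and 0 <= nc < SIZE:
                            grid[nr][nc] += 1   # grains off the edge fall away
    return topples

grid = [[0] * SIZE for _ in range(SIZE)]
cascades = [drop_grain(grid) for _ in range(2000)]
# Most single grains cause no slip at all; a few trigger large avalanches.
print(cascades[0], max(cascades))
```

    Run it and the pattern is exactly the one the paragraph describes: the first grains do nothing, yet eventually one more grain of the same size sets off an avalanche, after which the pile settles into a new stable shape.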

    We need to be constantly adapting to events as we encounter them, which might not be how we anticipated them at the outset. It is this mismatch between ‘expected’ and ‘actual’ that is one source of error.

    But there is a more fundamental process that gives rise to error. All action flows from decisions made by a brain encased in bone. It has no direct access to the outside world. The brain acts like a Bayesian probability engine. The brain creates a set of expectations about the nature of the world, and these are compared with sensory inputs. Any discrepancies – errors – are resolved until our perceived reality meets a threshold. Our investment in establishing ‘reality’ is just enough to support whatever action is needed to achieve our goals. This last statement presupposes that all action is goal directed. Error, in this context, is feedback from the world about the correlation between our actions and our progress towards our goal. In fact, error is information that reduces uncertainty. In this sense, error allows us to fine-tune our actions.
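    The idea of error as feedback can be sketched as a toy correction loop (my own illustration, with invented numbers and parameter names, not the author's model): a belief is repeatedly nudged by the prediction error until the mismatch falls below a threshold, at which point the representation is "good enough" to act on.

```python
# A toy sketch (my own illustration, not the author's model) of the
# brain-as-prediction-engine idea: a belief is repeatedly corrected
# by the prediction error until the mismatch falls below a threshold.
# The error itself is the feedback that does the work.

def settle(belief, sensed, gain=0.5, threshold=0.01):
    """Nudge `belief` toward `sensed`, using prediction error as feedback."""
    steps = 0
    error = sensed - belief          # mismatch between expected and actual
    while abs(error) >= threshold:
        belief += gain * error       # correct by a fraction of the error
        error = sensed - belief      # fresh feedback from the world
        steps += 1
    return belief, steps

belief, steps = settle(belief=0.0, sensed=10.0)
print(round(belief, 3), steps)   # belief converges close to 10 in a few steps
```

    Note that the loop stops as soon as the error is small enough to ignore, not when it reaches zero: the investment in establishing ‘reality’ is just enough to support action, as the paragraph above puts it.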

    Studies of airline pilot performance reveal that about a third of errors committed by crew go unnoticed. They are seen by the trained observer, but not by the perpetrators, and barely 1% of these errors have any sort of impact on the operation of the aircraft. This suggests two things: first, in aviation at least, the operation is resilient and can cope with error; second, the consequence of error does not seem to impinge upon the crew’s understanding of what is happening to the extent that they need to take any action. However, when an error does come to the attention of the crew, a response is needed. Again, studies show that a significant proportion of detected errors are simply ignored by crew. Fewer than half require a positive intervention. It is fashionable to talk about error ‘management’. In fact, crew do not ‘manage’ errors: instead, they respond to the new set of circumstances created by the error. 

    Error is the trace you leave behind, like the wake of a ship. You play what is in front of you and don’t look back.

    But what about the ‘things that go right’? Here is a game you can play. Imagine you are watching someone in the workplace. How do you know things are going right? Probably, it’s because you haven’t seen anything going wrong. We are designed to detect ‘wrong’ because that is what will save our lives. It’s an evolutionary thing. We are blind to ‘right’ because that is simply our expectations – the brain’s prediction – being met. That said, have you ever been impressed by something you have seen at work? Again, this is our prediction not being met, but in a surprising way rather than a negative way.

    Surprises, like failures, are learning opportunities. Both allow us to refine our internal representations of tasks, leading to better goal specification and richer action sequences directed at attaining that goal. 

    Error, then, is not only good but also essential. The original meaning of ‘error’ was to wander. It is not the wandering that really matters but the path people were trying to follow in the first place.

    Key takeaway points:

    1. After a process failure, the goal is to explain the gap between planned and actual. Culpability comes a distant second.

    2. Most responses to adverse events merely shift the point of failure.  The work will be no less variable and the role of error will not change.

    3. If you really need a 'Just Culture' policy, it suggests that the people with power do not understand error.

    About the Author

    Norman MacLeod served for 20 years in the RAF, involved in the design and delivery of training in a variety of situations. He stumbled across 'CRM' in 1988 while investigating leadership in military transport aircraft crews. From 1994, he worked around the world as a consultant in the field of CRM in commercial aviation, latterly employed as the Human Factors Manager for a blue-chip airline in Hong Kong. Now semi-retired, he is one of the Patient Safety Partners at James Cook Hospital in Middlesbrough.

    Comments

    Ann, I’m glad you found it interesting. I’ve drawn on thinking about learning (Ohlsson) and various others working in sense-making and neuroscience. My motivation is to do something about the lazy use of language in safety circles. Space is limited in these blog posts. The implications of my position may need another blog.


    Great article, really love the conception of error as 'wake'.

    However, I think some differences between the airline and hospital environments are worth bearing in mind when thinking about the importance of 'Just Culture'. Medical errors in hospitals have the potential to cause more severe consequences than those made by air crew, who these days really need to be trying quite hard to seriously injure or impair flying customers. 'Care for the caregiver' and 'Just Culture' are vital to ensure adverse healthcare events don't create 'second victims' or get covered up by embarrassed staff.

    Edited by Guy Butler

    The key difference between the airline and hospital environments is that airlines manage their processes. This is a prerequisite for error reduction in healthcare.

