Why investigate: Part 8 – Why an ‘It’s an error trap’ conclusion is an error trap


    MartinL

    Summary

    In Part 8 of the 'Why investigate' blog series, Martin Langham takes a look at the hub's error trap gallery and explains why, when we conclude 'it's an error trap', we are missing the bigger picture.

    Content

    Well, this sounds like I have moved from my normal citation of Greek philosophers and Classical Greek terms like ‘ergonomics’ straight through to the avant-garde poetry of the 1950s. An error trap is an error trap. That is either profound, or Martin has got into the evidence locker again and is smoking ‘Exhibit A’. The idea I am going to advance is that an ‘error trap’, as described on the hub pages, really is a simplistic trap – one that catches the untrained investigator. There is, after all, a regulator of all this forensic stuff, which might help here.

    In 2019 (when the world was simple), I said we would look at human reasoning and biases. The vox populi of you, dear readers, wants to understand ‘confirmation bias’. It is an example of an error trap in terms of poor thinking, and it is something the regulator has issued guidance upon. Indeed, there is a 90-page guide on confirmation bias in investigations when humans investigate other humans – super and proper Human Factors (HF) stuff it is.

    Now, as Corporal Jones (from a British TV sitcom, for our overseas visitors) would say, don’t panic! We shall consider this stuff and what the regulator says together. Hand in hand we can get to grips with how to investigate HF stuff in a way the regulator would love. Yes, there is a regulator, and no, it’s not a club or institute suggesting team talks or simple two-photo explanations of why people die. If you are keen to read on, I would avoid all the stuff on HF and statistical fallacy – well, all the statistics stuff really – we will do this together. I taught postgraduate stats for 15 years and never lost a student; well, they were found – eventually – in a statistically significant, respectable time, I would say!

    It is time to click on the image link below to see the images we are talking about and come back here.

    Error traps page

    Welcome back – impressive they look and clearly the cause of the incident. Meh, no.

    The concept of an error trap is like a mouse trap. A mouse trap catches more than just mice – the fingers of the unwary while setting the spring, and the toes and pyjama hems of the midnight snacker! These error traps catch more than you expect. The simple idea of “look, this looks like something else” is actually an indication of sloppy thinking. Not convinced? Read on.

    So why is an error trap a sign that it is the first time a person is doing forensics? Well, dear reader, you have correctly used the word ‘why’, and in blog 6 we discovered ‘why’ is a powerful word. We have decided the ‘how’ word is not so cool. The idea of an error trap is the ‘how’, and clever science types use the ‘why’ word. So, let’s use the word ‘why’ as we look at the images on other hub pages.

    Look at another one and ask why.

    Error traps page

    Welcome back.

    The first question is: why, if 30 million bottles look like this, have there only been two incidents? If it were a mouse trap, the shops would be getting a call asking for a refund – I’ve 30 million mice in the house and only two look a bit poorly. True, the cat is sitting there smugly, as any Luddite would when gazing upon new technology. Return to your favourite image and come back again quickly.

    Error traps page

    Welcome back.
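    To put that first ‘why’ in perspective, here is a quick back-of-envelope sketch of the arithmetic. The 30 million bottles and two incidents are the figures quoted above; everything else is illustration only.

```python
# Rough base-rate arithmetic for the 'why only two incidents?' question.
# The two figures below are the ones quoted in the blog; nothing else is implied.
bottles_in_circulation = 30_000_000
reported_incidents = 2

rate = reported_incidents / bottles_in_circulation
print(f"Incidents per bottle: {rate:.10f}")                # ~0.0000000667
print(f"Incidents per million bottles: {rate * 1e6:.2f}")  # ~0.07

# If the look-alike labels alone were 'the cause', we would expect incidents to
# track exposure far more closely; a rate this small suggests the labels are,
# at most, one factor among several.
```

    That vanishingly small rate is the point: if the lookalike design alone were the trap, it would be snapping far more often than twice.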

    The next question is ‘why did it occur in this environment, and to this person, undertaking this job?’ Ahh, you might be thinking, this is the basis of human factors: the cave dweller in an environment, doing a task with a tool or bit of equipment. The error is only an error in a given environment, with the cave dweller doing a particular task, at that moment in time. Yes, we are back to earlier blogs about what HF is, and what an accident is (see blog 5). Have another look at your second favourite image.

    Error traps page

    Welcome back.

    The question ‘why’ starts with understanding the environment the cave dweller was using the kit in. We often get asked to review a so-called ‘error trap diagnosis’ (done by other ‘professionals’ when a similar incident has occurred again) and find that there are many other causes. Let’s think about human vision and lighting. In one case, the medication labels on two different bottles looked very different under incandescent and fluorescent light, but under LED (especially the 6500K variety with its weird blue cast) they looked the same. Sometimes, therefore, the environment makes the two labels look the same, and sometimes it makes them look different. Colour, hue and saturation mean that, for example, under high-pressure sodium lights all pinks and reds look the same; under low-pressure sodium lights it is slightly different (there is a crude sketch of this effect after the next image). Now this should ring some bells from the last blog, where I said a person who does team talks without an MSc or PhD might not be the person to do your HF investigation. Have a look at another image.

    Error traps page

    Welcome back.
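    Here is that promised sketch – a minimal, purely illustrative bit of Python. The band names, light-source spectra and label reflectances are invented toy numbers, not measurements from any real case; the point is only that the light reaching your eye is roughly the lamp’s output multiplied by the label’s reflectance, band by band, so a narrow-band lamp can hide differences that a broadband light reveals.

```python
# Toy demonstration: two differently coloured labels can become
# indistinguishable under a narrow-band light source.
# All numbers are invented for illustration.

# Five crude wavelength bands, in order: blue, cyan, green, orange, red.
BROADBAND_LED = [1.0, 1.0, 1.0, 1.0, 1.0]   # power in every band
SODIUM_LAMP   = [0.0, 0.0, 0.0, 1.0, 0.0]   # crude stand-in for a low-pressure
                                            # sodium lamp: orange only

# Hypothetical reflectance (0..1) of two medication labels in each band.
LABEL_PINK = [0.2, 0.1, 0.1, 0.8, 0.9]      # reflects some blue plus orange/red
LABEL_RED  = [0.0, 0.0, 0.0, 0.8, 1.0]      # reflects orange/red only

def reflected_light(source, label):
    """Light reaching the eye per band: source power x label reflectance."""
    return [s * r for s, r in zip(source, label)]

for name, source in [("broadband LED", BROADBAND_LED), ("sodium lamp", SODIUM_LAMP)]:
    pink = reflected_light(source, LABEL_PINK)
    red = reflected_light(source, LABEL_RED)
    print(f"Under {name}: pink -> {pink}, red -> {red}, "
          f"indistinguishable: {pink == red}")
```

    Under the toy broadband source the two labels send back different mixes of light and look different; under the toy narrow-band source both send back only the same orange component and become indistinguishable. The ‘error trap’ appears or disappears with the environment.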

    Yes, you guessed it, the next why is: why did the incident occur doing that task? Well, we often find the task is poorly designed and the interaction with that bit of kit has been poorly thought through (more in later blogs about siloes). The next why is, of course, why that human – typically, fatigue, distraction and the like are the guilty parties. This should all start ringing alarm bells (oh, we will cover the psychology of alarms soon): in our first time together we decided that an accident does not have a single cause (see blog 1).

    No need to go away again, but a so-called ‘error trap analysis’ may only be focussing on one factor – that is, the poor design of the bottles – whilst ignoring all the other factors that contributed to the accident, namely the environment, the person and the task. Look around you and see how many controls, buttons, etc, all look the same. Look at the cooker in your kitchen (mind the mouse trap); look at all those buttons that look the same. On the flight deck, don’t they all look the same? Are they error traps? Exactly – an error trap analysis is there for those who have not done the forensic training.

    The final why to ask is: why did the system or equipment designer make that mistake? Well, as earlier blogs have hinted, medicine is unique, and we HF types have spent ages looking at oil, gas, railways, etc, but not at medicine – hence I’ve said any HF person needs to spend a lot of time understanding what each unique trust, ward, theatre and shift change is doing.

    What other error traps have we found not really to be error traps, you may ask? Well, an airside operation where the levers on the power unit looked the same – interesting indeed, but the operator was drunk. In another case the in-cab system icons did all look the same – but at the level of medication the driver was on, it’s not surprising what happened. My favourite was the error trap where the operator made the mistake not because of the kit design (it was really bad), but because the rostered worker had asked their brother to do the shift – a brother with no training, licence or ID, wearing prescription glasses that were not theirs (so at security they looked like the real person)… Always use the cognitive interview, and use some learning from the behavioural sciences about memory. Facts, remember, are our friends (see blog 5).

    There is a nice website called ‘Bad human factors designs’. One of my students at the Royal College of Art found it many years ago and it is as good today as it has ever been.

    In our courses – well, I’m retired now, so that should be past tense – we introduced the idea of the error trap along with the forensic photography section. By the end of the day the idea was that the students would settle down to dinner chatting about why an error trap analysis/conclusion had originally looked like a good idea, now that they knew its limitations.

    Summary

    So, today we have used the earlier blogs to understand that there is a science regulator, and that seeing only an error trap analysis/conclusion is a quick way to spot a lesser-quality HF investigation. In essence, looking only for the ‘error trap’ is like a film director keeping the camera on a tight focus: you need to widen the angle to see what else may be in shot.

    The only bit we have not revisited is classical philosophy, and as I have done so three times this week, let’s return to that. Voltaire – or, more precisely, François-Marie Arouet (21 November 1694 – 30 May 1778) – wrote:

                  “To the living we owe respect, but to the dead we owe only the truth”.

    Which reminds us why we investigate.

    You may also note some clever HF maths on this page – well-hidden, but we shall return.

    Finally, one of my editors points out that, to the naïve observer, those error traps are so damn obvious you never need an HF person to tell you what went wrong. In their view, it scares people away. Now, hopefully, you see why I’ve explained carefully that you do need an HF forensic type, as the real reason or proximate cause is hidden.

    This is important, as healthcare is now starting to move away from HF: “We have tried HF and it’s rubbish with no measurable outcome” and “we know how pilots are supposed to talk to each other, but it’s not like that in the emergency department – there are more than two of us”. The point is that healthcare has not really tried much real HF.

    Next time

    Very finally, 2021 sees me even more retired, so future blogs on investigations will see guest writers: keen young MSc/PhD-wielding types (Bobbie on fatigue, Lara on ethics, Afiah on philosophy) and slightly less young professorial types with dusty PhD theses, talking about situational awareness (Professor Edgar) and decision making (Professor Stedmon, to kick us off), all from diverse domains, including healthcare, emergency response, transport, security, defence, etc.

    Read the other blogs in this series

     

    About the Author

    Martin is a topic leader for the hub.

    He founded the Human Factors group at the University of Sussex (1999), which became User Perspective Ltd in 2003. Martin, User Perspective MD and Chief Scientist, aided by his team, has undertaken almost 600 research and forensic investigation projects. He is interested in human error and human factors.

    Martin is a research auditor for the UK government, EU academic networks and many governments worldwide. Within healthcare he has investigated matters as diverse as neonatal safety in transport, unexplained injuries in the hospital mortuary, sepsis diagnosis and retained instruments. Martin co-authored the very first Healthcare Safety Investigation Branch (HSIB) report, which investigated orthopaedic surgery in the UK and Europe. His interest in law and justice extends to his voluntary role as a justice of the peace (JP) in the Magistrates’ and Crown courts.
