In the previous blog in the 'Why investigate' series, we heard from Professor Martin Langham about the error trap being an error trap in itself, and about changing our focus in investigations to look wider than simplistic ideas and models of causation. In this blog, Professor Alex Stedmon considers how we might make the wrong decision when we think it’s the right decision.
Martin has now passed the blog baton on to me. There will also be others contributing in due course to continue the thought-provoking and stimulating dialogue. Martin likes to quote Greek philosophy, but I rather like these words of Oscar Wilde:
“Religions die when they are proved to be true. Science is the record of dead religions”
As an agnostic Human Factors person, I’m not here to preach to the converted and I’m not evangelical about the ‘religion’ of Human Factors… it’s an important part of many things but the key word here is ‘part’ – it is vital that we understand things as being more than the sum of their parts and that we consider things from a number of perspectives.
So, here’s the question… how can we take in many different perspectives when we’re fundamentally not wired that way?
To illustrate this, let me tell you about my bad back. I was working in the garden before Christmas and managed to ‘twang’ it during some heavy lifting. Yes, there’s a sense of irony there with me being a Human Factors/ergonomics specialist, understanding how to lift things correctly and having some knowledge of biomechanics – it just goes to show it can happen to anyone! At the time it didn’t seem too bad. It has happened before and usually after a few days it begins to recover. However, that was back in December and I still have the pain in my back.
Since then I have been going over what I did, where the pain is at the moment, and why I am not recovering as quickly as before. I’ve been doing this each time my back sends a shot of pain to my brain. I was moving some logs, I twisted round, I felt the pain in my back. That is about the extent of my processing because I keep going round in circles – never quite getting to a solution.
We’re now into February and it is clear that this time my back injury does not fit with my idea of how it mends itself and so I have had to start looking for other explanations.
A couple of years ago my wife also had a bad back, but hers was much worse than mine. She had a bulging disc in her lower back causing extreme pain and sciatic issues. Could I have done something like that? Could I have a minor disc issue? Does it feel more like a nerve or muscular problem?
I’ve also continued to be very active outside, I’ve cleared a lot of ground, pruned trees, shifted railway sleepers around the garden, sawn the sleepers to make raised beds and filled them with over 3 tonnes of soil. I’ve conveniently forgotten that this activity might have hampered my recovery, or perhaps has caused new or continued strain on my back.
Put simply, my initial assumption about my back has slowly proven to be incorrect and I have had to look for other explanations for why it still aches.
Another example comes from a common case I investigate for the court. Imagine you’re driving on a motorway. There is a car ahead of you in the outside lane, some way off in the distance. It looks to be behaving normally, it is pointing in the right direction and so you might reasonably assume it is moving.
What happens if you then realise it has stopped and you only have moments to react? You could be caught unawares by this sudden realisation that the vehicle you thought was moving is actually a static object that you might collide with.
What has happened here is that you made an assumption based on an expectation that the car would be moving (how many parked cars do we expect to see in the outside lane of a motorway?). Everything else you then processed sought to confirm this was the case (pointing in the right direction, tick, no brake lights, tick, no hazard lights, tick, no people standing by the vehicle, tick, no other vehicles stopped or signs of a collision, tick). None of those things that you’ve just mentally ticked off would give you any reason to think that the car is stationary. From a Human Factors perspective, the key point here is that you would not have started out actively looking for evidence that the car was stationary.
We call this aspect of human decision-making ‘confirmation bias’. Confirmation bias refers to a well-documented fallibility of decision-making where we make an initial judgement about something and then seek to confirm that judgement is correct before we go on to look for other explanations. But it is more serious than that. We “tend to seek (and therefore find) information that confirms the chosen hypothesis and to avoid information or tests whose outcome could disconfirm it” (Wickens, 1984). This means that we might ignore conflicting information in preference to that which fits with our idea of what is happening.
When we are faced with an uncertain situation, we will try to make sense of it as best we can. We may not have a clear idea of what is actually happening and, if our initial assumption is incorrect (i.e. the evidence eventually illustrates that our interpretation was wrong), we then have to find another explanation for the situation and begin collecting new evidence to support that.
So, why can’t we just try to process everything and understand things more clearly? Well, in very broad terms, those studying human performance have concluded that we poor humans have limited mental resources and are not able to do everything all at once. We would quickly become overwhelmed by the sheer amount of data we need to process, so we have to find quick and easy ways to make sense of things. Through experience we develop normally useful shortcuts (we call them heuristics) in our reasoning and decision-making processes.
Typically, we will process information based on our experience of similar situations that we use as a template (or ‘mental model’) for how things are likely to be. This forms the basis of much of what we then ‘expect to happen’ in the world around us.
Psychologists are still arguing about whether we build up our understanding of the world from a ‘top-down’ or ‘bottom-up’ perspective. In broad terms, this means do we have a general idea of things to begin with, that we then backfill with information, or do we make sense of things by building up the blocks of our understanding? Perhaps the answer is both – in some situations we might jump to an overall idea about something (i.e. it fits the pattern of our previous experience) and in others we may need to piece together things in new ways (i.e. we may not have a coherent model stored in our brains from previous experience).
A key factor in this kind of decision making is our reliance on prior expectations. Drivers do not expect vehicles to be stationary in the outside lane of a motorway and therefore may not process the information about a vehicle they see ahead of them as that kind of potential hazard.
In his younger years, Martin conducted a very interesting piece of research to investigate why drivers might collide with highly visible police vehicles. The findings highlighted two things – some careful drivers failed to notice the police vehicle completely and, when the police vehicle was parked facing in the same direction as the traffic, drivers assumed it to be moving. From these results a recommendation was made that emergency vehicles should park at an offset angle to the flow of the traffic, so that it is more obvious that they are not in a ‘normal’ orientation to the traffic.
So how does this translate to conducting investigations? At a simple level, we should always be cautious of our initial ideas about why something happened. And if anyone else voices a solid understanding right at the start, maybe we should take a bit of time to step back and try to survey the landscape from the highest vantage point we can find.
Incidents and issues we investigate are rarely caused by one thing. As already stated, it is vital that we understand things as being more than the sum of their parts and that we consider things from a number of perspectives. However, investigations often take reductionist approaches to simplify things. By seeking out discrete factors composed of only one or two parts (at most), it is easy to start with human error as the root cause.
Any investigation is a journey into uncharted territory, so don’t be afraid to take time to get your bearings; look around you – what do you see? Look at the map (or perhaps any checklists or procedures) and don’t be scared of asking for directions from the locals (i.e. people close to the incident who may have valuable and unique perspectives on the issues).
All too often investigations start by making assumptions ‘to get things moving’ because we have limited resources or ‘to show progress is being made’.
If that is the starting point – be careful. While there can be organisational pressures to ‘get started’ and to ‘find out’ what happened, it can be dangerous if we run off into the woods without leaving a trail of pebbles to get home again (don’t use breadcrumbs, the birds tend to eat them!).
It is possible that initial ideas can trap us in cycles of confirmation bias, searching out the evidence that fits with each interpretation of events. At best, we may realise it is more complicated or our initial idea is incorrect and then require more time and resources to search out other explanations. At worst, we may not realise we’ve missed something important and could even set forth recommendations which do not really address the issue or stop it from happening again.
To counter this, always try to involve multidisciplinary experts in investigations. They will be people who have domain knowledge (not just qualifications on paper). But, perhaps more importantly, always keep an open but inquiring and critical mind. Never assume anything until you have discounted everything else.
Regardless of what may have happened, I always start an investigation with the words of Sherlock Holmes in my head… “when you have eliminated the impossible, whatever remains, however improbable, must be the truth”. If anything, we should seek out the least obvious explanations rather than the most obvious ones – we are less likely to return to the least obvious ones later on, while the obvious will always remain in plain sight.
Before I sign off, we’ve covered how we might misinterpret information we process, how people can fail to see even highly conspicuous police vehicles and how we are programmed to find easy and convenient ways to make sense of things. Hopefully understanding our decision making in complex investigations can help us be more critical of our thinking and more aware that sometimes we might make the wrong decision when we think it’s the right decision.
Read the other blogs in this series
- Why investigate? Part 1
- Why investigate? Part 2: Where do facts come from (mummy)?
- Who should investigate? Part 3
- Human factors – the scientific study of man in her built environment. Part 4
- When to investigate? Part 5.
- How or Why. Part 6
- Why investigate? Part 7 – The questions and answers
- Why investigate? Part 8 – Why an ‘It’s an error trap conclusion’ is an error trap
- Why investigate? Part 10: Fatigue – Enter the Sandman
- Why investigate? Part 11: We have a situation
- Why investigate? Part 12: Ethics in research
About the Author
Alex is a Human Factors contributor for the hub. He has over 25 years of research, teaching and consulting experience applying Human Factors expertise in various settings (e.g. virtual reality applications, simulation, transport, security, defence). With a background in Applied Psychology, Alex is interested in aspects of human information processing, systems design, and user requirements elicitation and evaluation.
Alex was employed at the Defence Evaluation and Research Agency (now QinetiQ) before moving into academia and working at Loughborough University, The University of Nottingham, and Coventry University, where he led Human Factors MSc courses and international research programmes, working closely with Government agencies and stakeholders.
Since 2011, Alex has run a successful business (Open Road Simulation Ltd) providing transport simulation and consultancy services. In 2019 Alex retired from academia and now works as an independent consultant helping international clients procure specialist road safety solutions. He also founded a second company (Science Witness Ltd) that provides expert witness services to the court specifically in legal aspects of driver and rider behaviour.