Investigations & Behaviours

This short summary is about understanding behaviours and motivations within a safety investigation framework.

Daniel Kahneman pioneered the science of behavioural economics and received the Nobel Prize for his work. He developed the concept of “Dual Process Theory”, describing how humans think fast and slow and how human thinking is affected by heuristics and cognition. It is about how humans think and make judgements. He identified a wide range of cognitive (thinking) biases; cognitive error refers to errors at any level (skill, rule or knowledge based) in the hierarchy of thinking processes.

Kahneman and Tversky identified that these biases are particularly common in medical incidents, with clinical error at the centre of those incidents.

When humans think about a problem and a solution, we use heuristic logic, a shortcut system. Our brains have evolved to make rapid decisions: a best guess, often without considering all the facts before us. This is vital for making quick and sometimes lifesaving decisions. The problem is that clinical decisions often need a more considered approach; over time, shortcut thinking can become habitual and lead to errors.

I don’t want to get too heavy on the theories here; there is plenty of information available on the net. Also of interest is Dr Pat Croskerry of Dalhousie University, who has identified 50 cognitive biases in healthcare.

Humans are part of the sociotechnical system: humans, machines and the organisation operating together. In healthcare it includes people, interactions and relationships as parts of a larger whole, operating collectively towards a common purpose.

Safety can be viewed as a control problem, and safety is managed by a control structure. Investigators should therefore be able to identify why the existing control structure failed, or which parts of it failed. Preventing future error requires designing a control structure that enforces the necessary constraints on the system to ensure safe operation, and that continues to do so as changes occur.

What, then, of human error within the sociotechnical system; how do we understand it? Often it is the discoverable evidence of the adverse event that leads to a finding of, for example, human error. What we do not seem to analyse so well are people’s intentions and behaviours, which do not come from factual evidence per se, e.g. what motivated a decision or behaviour.

How do we understand this in the context of the investigation? Identifying what motivated how we perceived a situation can be extremely challenging, but it can lead to a richer understanding of the influences on human behaviour.

I have been looking at models that can be inserted into a core investigative doctrine for safety. To date I have settled on the one posed by Dr Russell Kelsey MB.BS. MRCGP, a subject matter expert in serious clinical investigation. His model places clinical error at the centre of three influences: attitude, attention and cognition, together with the effects of various biases within those three influences, situational awareness and high-pressure environments.

I am not connected to Russell and do not want to be seen as breaching any copyright, but more information can be found in his book at https://www.amazon.co.uk/Patient-Safety-Investigating-Reporting-Incidents/dp/1498781160, which I have found very informative.

Has anyone developed a method or model that looks deeper at the context behind human error? I would be interested in the approach. The new PSIRF (Patient Safety Incident Response Framework) will transform our approach, refocusing on systems, processes and behaviours. Whilst early adopters are trialling the framework, that does not stifle discussion and consideration of improvements, and I would be interested to hear of any developments already within investigation management.

Many thanks


Investigations & Learning Specialist- RWHT

Hi Keith – all good stuff and all classic Cognitive Psychology (CP) and Human Factors (HF). Nice to read a post not about team work, non-technical skills or crew resource management nonsense.

Might be worth a chat at some point?

Some thoughts...

A lot of the thinking and deciding experiments were done in a lab, and generalising them to a specific incident is difficult. Medicine is at the very beginning of its CP/HF journey. Looking at 50 different biases in environments that are poorly designed, with lots of bespoke untested equipment, may be good – but most likely for our grandchildren. Simple questions first: Is the equipment usable? Does the system of working prevent error? Is the human working within the limits of evolution?

Excellent point on what you call habitual decisions – or automaticity, as we call it. Think about driving a car: all (mainly) automatic decisions, as we have not evolved to deal with that amount of information in such a dynamic environment.

I'm planning a blog on situational and spatial awareness, but those in the military who have been on my course comment: “You science types can't even agree how to measure it.” There are differences between team and individual SA worthy of note. I think in medicine the question to start with is “Who is in my team?”

There are lots of models and methods of investigation. I’m trained in some of them, but whether they can be generalised to medicine – well, answers below.
