This short summary is about understanding behaviours and motivations within a safety investigation framework.
Daniel Kahneman pioneered the science of behavioural economics and received the Nobel Memorial Prize in Economic Sciences for his work. He developed the concept of "dual process theory", describing how humans think fast and slow and how human thinking is shaped by heuristics and cognition; in short, how humans think and make judgements. He identified a wide range of cognitive (thinking) biases. Cognitive error refers to errors at any level of the skill–rule–knowledge (SRK) hierarchy of thinking processes.
Kahneman and Tversky's work suggests that these biases are particularly common in medical incidents, with clinical error at the centre of those incidents.
When humans think about a problem and its solution, we use heuristic logic, a shortcut system. Our brains have evolved to make rapid decisions: a best guess, often made without considering all the facts before us. This is vital for making quick and sometimes lifesaving decisions. The problem is that clinical decisions often need a more considered approach; the shortcut can over time become habitual and lead to errors.
I don't want to get too heavy on the theories here, as there is plenty of information available online. Also of interest is Dr Pat Croskerry of Dalhousie University, who has identified some 50 cognitive biases in healthcare.
Humans are part of the sociotechnical system: humans, machines and the organisation operating together. In healthcare this includes people, interactions and relationships as part of a larger whole, operating collectively towards a common purpose.
Safety can be viewed as a control problem, and as such it is managed by a control structure. Investigators should therefore be able to identify why the existing control structure failed, or which parts of it failed. Preventing future error requires designing a control structure that enforces the necessary constraints on the system to ensure safe operation, and that continues to do so as changes occur.
What, then, of human error within the sociotechnical system, and how do we understand it? Often the discoverable evidence of an adverse event leads to a finding of, for example, human error. What we don't seem to analyse so well are people's intentions and behaviours, which do not come from factual evidence per se, e.g. what motivated a decision or behaviour.
How do we understand this in the context of an investigation? Identifying what motivated how we perceived a situation can be extremely challenging, yet doing so can lead to a richer understanding of the influences on human behaviour.
I have been looking at models that can be inserted into a core investigative doctrine for safety. To date I have settled on the one posed by Dr Russell Kelsey MB BS MRCGP, a subject matter expert in serious clinical investigation, which places clinical error at the centre of three influences: attitude, attention and cognition. The model also considers the effects of various biases within those three influences, alongside situational awareness and high-pressure environments.
I am not connected to Russell and do not want to be seen as breaching any copyright here, but more information can be found in his book at https://www.amazon.co.uk/Patient-Safety-Investigating-Reporting-Incidents/dp/1498781160, which I have found very informative.
Has anyone developed a method or model that looks deeper at the context behind human error? I would be interested in the approach. The new PSIRF will transform our approach, refocusing on systems, processes and behaviours. While early adopters are trialling the framework, that does not stifle discussion and consideration of improvements, and I would be interested to hear of any developments already within investigation management.
Investigations & Learning Specialist- RWHT