The aim of this study, published in the journal Human Factors, was to examine the effects of interruptions and retention interval on prospective memory for deferred tasks in simulated air traffic control. Its findings can be translated to a healthcare environment.
Of the 5,365 operations, 188 adverse events were recorded. Of these, 106 adverse events (56.4%) were due to human error, and cognitive error accounted for 99 of 192 human performance deficiencies (51.6%). These data provide a framework and impetus for new quality improvement initiatives incorporating cognitive training to mitigate human error in surgery.
In this article, Dan looks back at the Donabedian Model, a framework for measuring healthcare quality, and suggests why this might be an oversimplification and why we must also look at human factors when we think about patient safety. We are humans and we can, do and will make mistakes, so we have a personal responsibility to acknowledge and address this as a contributing factor in patient safety incidents and harm.
How do we begin to address our individual responsibilities? How can each of us reduce the personal risks we pose for our patients? How do we begin to address the moral imperative?
What can I learn?
Managing human failures
Fatigue and shiftwork
Safety critical communications
Human factors in design
Maintenance, inspection and testing
The book blends literature on the nature of practice with diverse and eclectic reflections from experience in a range of contexts, from healthcare to agriculture. It explores what helps and what hinders the achievement of the core goals of human factors and ergonomics (HF/E): improved system performance and human wellbeing. The book should be of interest to current HF/E practitioners, future HF/E practitioners, allied practitioners, HF/E advocates and ambassadors, researchers, policy makers and regulators, and clients of HF/E services and products.
In this blog, Steven questions:
Are we reducing the human to ‘human error’?
Are we reducing the human to a faulty information processing machine?
Are we reducing the human to emotional aberrations?
Are we reducing human involvement in socio-technical systems?
The widespread implementation of CPOE throughout the US has benefited clinicians and patients, but it also vividly illustrates the risks and unintended consequences of digitising a fundamental healthcare process. This paper, published in PSNet, explains how and why.
Key highlights from the paper
Accountability relationships, as both retrospective and prospective, support just culture.
Lines are fluid in accountability relationships, forcing operators to adapt to changing goals.
Viewing accountability lines as rigid increases risk and creates double-binds for operators.
Clinging to retrospective accountability reinforces blaming/shaming operators for errors.
I started my career in a care of the elderly ward (geriatrics), which was exciting as my first job, and I felt my time management needed work before I moved into what I knew even then would be my chosen field: emergency nursing. I stayed in this area for a year, taking charge of the shift and also managing a bay of eight patients, which was the norm (or so I thought). After about a year, wanting to keep learning, I moved on and started working in an intensive care unit (ICU).
During my time in ICU, I made a drug error involving a controlled drug. Without going in
Key learning points
Two approaches to the problem of human fallibility exist: the person and the system approaches.
The person approach focuses on the errors of individuals, blaming them for forgetfulness, inattention, or moral weakness.
The system approach concentrates on the conditions under which individuals work and tries to build defences to avert errors or mitigate their effects.
High reliability organisations—which have less than their fair share of accidents—recognise that human variability is a force to harness in averting errors, but they work hard to focus that variability.