Search the hub
Showing results for tags 'Human error'.
Content Article
The authors conclude that whilst healthcare has much to learn from aviation in certain key domains, the transfer of lessons from aviation to healthcare needs to be nuanced, with the specific characteristics and needs of healthcare borne in mind. On the basis of this review, they recommend that healthcare emulate aviation in its resourcing of staff who specialise in human factors and related psychological aspects of patient safety and staff well-being. Professional and post-qualification staff training could specifically include Cognitive Bias Avoidance Training, as cognitive bias appears to play a key part in many errors relating to patient safety and staff well-being.
- Link analysis
- Assessment
- (and 7 more)
Content Article
The use of artificial intelligence (AI) in patient care can offer significant benefits. However, there is a lack of independent evaluation of AI in use. This paper from Sujan et al., published in BMJ Health & Care Informatics, argues that consideration should be given to how AI will be incorporated into clinical processes and services. Human factors challenges likely to arise at this level include cognitive aspects (automation bias and human performance), handover and communication between clinicians and AI systems, situation awareness, and the impact on interaction with patients. Human factors research should accompany the development of AI from the outset.
- AI
- Human error
- (and 3 more)
Content Article
Reflecting on the Bawa-Garba case
PatientSafetyLearning Team posted an article in Legal matters
- Human error
- Doctor
- (and 5 more)
Content Article
Patient Safety: 20 Years After “To Err is Human” (2019)
PatientSafetyLearning Team posted an article in Culture
- Human error
- Safety culture
- (and 4 more)
Content Article
Presented by Sidney Dekker, Safety Differently: The Movie tells the stories of three organisations that had the courage to devolve, de-clutter, and decentralise their safety bureaucracy. It is a story of hope; of rediscovering ways to trust and empower people and of reinvigorating the humanity and dignity of actual work.
- System safety
- Work / environment factors
- (and 5 more)
Content Article
In this blog, Steven questions: Are we reducing the human to ‘human error’? Are we reducing the human to a faulty information processing machine? Are we reducing the human to emotional aberrations? Are we reducing human involvement in socio-technical systems?
- Human error
- Heuristics
- (and 5 more)
Content Article
National data from SHOT (Serious Hazards of Transfusion) indicates there were 792 ‘wrong blood in tube’ near misses (where the error was spotted in time and no patient suffered harm) relating to blood transfusion samples in 2018 across England. This doesn’t account for blood samples taken for any other purpose. The HSIB report showed why these incidents happen and, most importantly, what can be done to reduce the risk of them happening again. The investigation looked at all the factors involved and found evidence that electronic systems could help staff in busy environments by making processes easier and more efficient to manage, reducing the risk to patients.
- Near miss
- Blood / blood products
- (and 4 more)
Content Article
Human error: models and management
Claire Cox posted an article in Improving patient safety
Key learning points:
- Two approaches to the problem of human fallibility exist: the person and the system approaches.
- The person approach focuses on the errors of individuals, blaming them for forgetfulness, inattention, or moral weakness.
- The system approach concentrates on the conditions under which individuals work and tries to build defences to avert errors or mitigate their effects.
- High reliability organisations, which have less than their fair share of accidents, recognise that human variability is a force to harness in averting errors, but they work hard to focus that variability and are constantly preoccupied with the possibility of failure.
- Cognitive tasks
- Distractions/ interruptions
- (and 7 more)
Content Article
The aim of this study, published in Human Factors journal, was to examine the effects of interruptions and retention interval on prospective memory for deferred tasks in simulated air traffic control. The findings can be translated to a healthcare environment.
- Human error
- Memory
- (and 5 more)
Content Article
Patient Stories: Paul's Story (10 March 2013)
Claire Cox posted an article in Patient stories
Content Article
Letter from America: Lift off!
lzipperer posted an article in Letter from America
“One small step for man...” 50 years on – we all recognise this phrase that accompanied one of the most famous descents in history: Neil Armstrong’s emergence from the lunar module toward his first step on the moon. The Apollo 11 moon landing represents an unparalleled accomplishment. Its characteristics resonate with patient safety professionals who look to space for inspiration. The Apollo programme experienced both triumphant achievement and catastrophic failure. The effort learned from mistakes, embraced teamwork and considered human factors as part of its domain. Its workforce remained focused on a single goal. The effort embodied commitment, complicatedness and complexity. The 50th anniversary of these victories provides compelling parallels for error reduction efforts active today in healthcare in the US.

Organisational learning systems
NASA (National Aeronautics and Space Administration) is a learning system. Learning systems are developed and nurtured through common goals, leadership commitment and resource sustainability. They thrive through action generated by the application of data, evidence and knowledge. Likewise, the US Agency for Healthcare Research and Quality (AHRQ) has partnered with the US-based hospital and healthcare accreditation organisation, The Joint Commission, to disseminate analysed evidence compiled by the Evidence-based Practice Center (EPC) programme. These organisations are working together to transfer what is known into an actionable form through a series of articles to enhance the use of better practice and learning on the frontline. This programme and the article series are introduced in a recent commentary on the project.

Coordinated action
The Keystone Center represents the culmination of the work of patient safety’s own Neil Armstrong – Dr Peter Pronovost, known for his otherworldly (at the time) commitment to the checklist intervention. The Keystone Center initially coordinated and collected data to guide the implementation of the checklist concept in 70 intensive care units across the state of Michigan. Now the Center serves as the state’s mission control for hospital patient safety and quality. Leaders there raise awareness of success through the Speak-Up! award programme, which acknowledges frontline healthcare staff for voicing their concerns and making care safer. The Center enables sharing of concerns that results in cost savings through harm avoidance.

A push in the right direction
The Apollo programme applied technical sophistication, engineering and know-how to land a man on the moon and return him safely to Earth within a decade. No small feat! Despite that imperative, both the module and the space programme needed a little boost now and again to get out of Earth’s orbit and complete their momentous undertaking. Patient safety has a similar call motivating its work – zero preventable harm. Some aim for ‘zero harm’, but is this achievable? Healthcare is very complex, with multiple machine/human/machine interfaces. Clinicians, leadership and organisations still need a boost to design and use technology and data that support the workforce to improve care at the bedside. The mission-driven, Boston-based Betsy Lehman Center builds on a strong desire to prevent failures similar to those that took the life of its namesake – Betsy Lehman, the Boston Globe reporter who died in 1994 due to medication errors. The Center is a state agency that serves as mission control for its constituents. To help healthcare in Massachusetts move its safety work beyond the comfort of the status quo, it has recently convened a consortium to propel existing programmes towards new and aspirational achievement.

On the dark side of the moon
Of course, the Apollo programme suffered setback and tragedy. While I want to highlight successes in my Letter from America, I will also share stories of struggle to foster learning from what doesn’t work. News and narrative will often remind us why continued work on safety improvement is fundamental.

Diagnostic error is prevalent. A recent analysis of closed US medical malpractice claims found that delayed or missed diagnoses in three primary clinical areas – vascular events (such as strokes), infections (like sepsis) and cancer – frequently resulted in disability or death. You can take that to your mission control to motivate data collection, teamwork and effort to focus on diagnostic improvement in practice.

Transparency is messy. The revelation that Neil Armstrong’s death in 2012 was reportedly due to substandard medical care is sad for all kinds of reasons. It underscores persistent cultural influences that reduce the sharing of information related to poor care. This minimises our opportunity to learn from failure and support patients, families and clinicians involved in error. Organisational resistance to transparency about mistakes and the messiness of openness are challenges... even when the incident involves a patient with less name recognition.

The Apollo programme and the 1969 lunar landing remain inspirational to this day. It behooves all of us who dream of contributing to something we once felt was impossible to engender the right spirit, resources and commitment to help get it done. The learning required for such accomplishment takes time, a culture that supports discussion, and recognition of success. If we embrace contribution, collaboration and community, our small steps have the potential to contribute to the “giant leap” forward – to help us take off, realise achievement and return our patients safely home.
- Diagnosis
- Checklists
- (and 4 more)
Content Article
PSNet: Systems Approach
Claire Cox posted an article in In health care
In this article, the authors use a case study to highlight the importance of analysing errors using a systems approach. James Reason’s ‘Swiss cheese model’ of medical errors is explained and put into context.
- Surgery - General
- Patient harmed
- (and 6 more)