Summary

Dan Cohen is an international consultant in patient safety and clinical risk management, and a Trustee for Patient Safety Learning. In this blog, Dan looks at the challenges around diagnostic error and delay, compounded by human factors, cognitive bias and the Covid-19 pandemic. Ending with a case study, he illustrates how high-quality investigations that delve deeply into human factors and focus less on blame are key to reducing harm.

This blog has been published as part of a series for World Patient Safety Day 2024 (#WPSD24) and its theme of 'Improving diagnosis for patient safety'.


    Errors in diagnosis are all too common, and the consequences can be daunting and dangerous. In fact, most patients will experience at least one diagnostic error during their lifetime.[1] Even if an error does not result in immediate harm to a patient, it will of course cause the correct diagnosis to be delayed. Appropriate clinical interventions and lifestyle interventions may also then be delayed, potentially resulting in short and longer-term harm.

    Diagnostic errors, delays and patient harm are therefore all intrinsically linked.

    Human nature and cognitive bias

    The reasons for diagnostic errors are complex and include numerous inefficiencies in communication and collaboration among healthcare professionals, and between healthcare professionals and patients. Opportunity for error becomes even more apparent when you consider the various stages involved early on:

    • ordering investigations
    • performing investigations
    • receiving and interpreting the results of investigations
    • numerous human factors affecting professional performance.

Perhaps most importantly, inefficiencies in cognitive reasoning often go unrecognised and are confounded by an array of human factors that may impair performance, reasoning and judgment, all of which have the potential to lead to harm.

Heuristic thinking is when a clinician takes cognitive shortcuts that mostly lead to accurate decisions. Over-reliance on heuristic thinking and other cognitive biases is part of human nature, but it can confuse the processes of reasoning, especially in stressed individuals,[2] and lead to error.

    The impact of the Covid pandemic

    As most are aware, during the Covid pandemic, the safety of healthcare deteriorated substantially. Much of this was due to impairments in human performance related to fatigue, workload volume and task saturation. This stress can negatively impact a person’s cognitive reasoning, which is so necessary for making correct and timely diagnoses. Processes designed to reflect human factors failed during the pandemic, as healthcare systems were overwhelmed and the clinical workforce was pushed to the limits of effectiveness. Errors in healthcare increased. 

    Although the pandemic is now behind us, the UK’s National Health Service remains under considerable stress. Staffing, resourcing and funding are all under pressure. Waiting times for primary and secondary care, specialty consultations, imaging and investigations remain worryingly long in many areas. This is resulting in delays to diagnoses and provision of appropriate clinical interventions. 

    Investigations: an opportunity to learn not blame

    Investigations of harmful incidents related to diagnostic errors and delays all too often end with the responsibility and accountability being placed on the shoulders of one or more individuals. When this happens, investigations have failed to delve deeply enough. They have failed to understand the reasons for human errors. Perhaps most importantly, they have failed to identify opportunities for learning.  

Lurking beneath the surface of human performance, but substantially and at times dramatically affecting human behaviour, is an array of 'hidden' human factors often not apparent without the deep dive so necessary for effective investigations.

    Case study

I would like to share with you a case study about a patient who died following a missed diagnosis. I hope it helps bring meaning to some of the key issues I’ve raised relating to diagnostic error, patient harm and the need for high-quality investigations.

    A missed diagnosis of abdominal sepsis

    A morbidly obese Jamaican patient was admitted for a cholecystectomy. Surgery was complicated by access problems related to obesity, and at one point a tiny bowel puncture was suspected but not proven. Post-operatively, the patient complained of mild abdominal pain, nausea and anorexia. After three days, she was discharged. Her exam at discharge revealed mild abdominal tenderness, attributed to incisional pain. Thirty-six hours after discharge the patient became suddenly unresponsive, having developed a distended abdomen over several hours. She was readmitted with a diagnosis of bowel perforation and abdominal sepsis. Despite operative intervention, she died 48 hours after readmission.

    The hospital performed a root cause analysis that concluded that this 'unfortunate incident' was primarily related to a bowel perforation, a known complication of surgery. The surgeon had acted in good faith.

The patient’s family filed a malpractice claim and a secondary investigation was performed as part of the hospital’s risk management process. This investigation noted that the patient’s vital signs had revealed a steadily increasing temperature and pulse and slowly decreasing blood pressure. On the morning of discharge, the patient had complained of anorexia to the nursing staff, had refused breakfast and her temperature was 38°C. The patient’s surgeon was apparently unaware of these symptoms and findings, having not discussed his patient with the nursing staff, and he had not referred to the nursing notes in the patient’s medical record.

    The combination of symptoms and findings should have suggested to the surgeon that the patient likely had abdominal sepsis. The patient’s septic death was possibly preventable. The patient died because the surgeon failed to pay attention to objective signs of evolving infection. The surgeon was responsible and accountable for what had happened. The hospital and surgeon were found to be culpable for malpractice, and a substantial claim was settled. 

    If this were the end of the story, and the 'root cause' was the surgeon’s inattention to details, then few opportunities for learning would have been identified, but it was not the end of the story.

    When the surgeon was interviewed as part of a professional competency investigation, the following factors were identified that may have contributed to his inattention to details:

    • The surgeon had been double-booked for his duties the day before the patient’s discharge, as his partner had become ill.
    • The surgeon also had taken an additional consecutive night on call the evening before the patient’s discharge and had been up until 4 am.
• He had had two hours’ sleep prior to making ward rounds on the day of discharge and was due to travel that day to visit his father, who had been recently diagnosed with a serious illness, but the surgeon had not yet had a chance to pack for his early morning flight.

    Thus, he was exhausted, task saturated, emotionally distressed, and rushed.  Bias toward morbidly obese patients or patients of colour may have been factors, as well, but were not addressed.

    Yes, the surgeon was responsible for the care of this patient. Yes, he was accountable for what happened. Yes, he made some mistakes, but human beings make mistakes. Given the compelling circumstances confounding his performance that day, the cumulative effect of multiple unknown or “hidden” human factors that degraded his performance, his mistakes, though deeply regrettable, were understandable.   

    Furthermore, the professional competency evaluation revealed a culture problem within the institution. Admitting that one was overwhelmed and possibly impaired due to workload, social and physical factors was frowned upon by colleagues and discouraged. When one was on call, one was on call, and asking for help from professional colleagues was simply not done. In a broader sense, failings in the institutional and professional culture contributed to this terrible outcome.

    The surgeon was not subjected to professional sanctions, but he became terribly depressed as a result of this event and was referred for professional counselling as a 'second victim'.[3]

    Final thoughts

    Diagnostic errors and delays are the 'elephants in the room' and, unless we improve our accuracy and timeliness, substantial patient safety improvements will continue to elude us, even in the post-Covid era.

Investigations focused on diagnostic failures, and delays in diagnosis and appropriate care, provide opportunities for real learning and improvement. Shoddy investigations that assign blame and do not delve deeply enough to understand human behaviour reflect poor understanding. They are best dismissed as ineffectual relics of the past, never to be revived.

    References

1. Balogh EP, Miller BT, Ball JR; Committee on Diagnostic Error in Health Care; Institute of Medicine; The National Academies of Sciences, Engineering, and Medicine. Improving Diagnosis in Health Care. National Academies Press (US); 2015.
    2. Groopman JE. How Doctors Think. Houghton Mifflin; 2007.
    3. Vanhaecht K, et al. An Evidence and Consensus-Based Definition of Second Victim: A Strategic Topic in Healthcare Quality, Patient Safety, Person-Centeredness and Human Resource Management. Int J Environ Res Public Health 2022;19:16869.

    Share your experience

Have you been affected by a late diagnosis? Or perhaps you have insights to share on diagnostic safety through the work that you do. If you would like to write a blog or share your thoughts, experiences or resources through the hub, please get in touch with our team at [email protected] or add your comments to our community forum page.

     


    About the Author

Dan Cohen is an international consultant in patient safety and clinical risk management and a senior healthcare executive with extensive leadership experience. A former US Department of Defense (DoD) physician executive, his career culminated as Chief Medical Officer/Executive Medical Director for the DoD TRICARE health plan, which currently provides health care to over 9 million beneficiaries worldwide. Most recently, he served as Chief Medical Officer for Datix, where he championed the company’s patient safety thought leadership internationally through numerous conference presentations, publications and commentaries.
