Patient Safety Spotlight interview with Clive Flashman, Chief Digital Officer at Patient Safety Learning



    Summary

    This is part of our series of Patient Safety Spotlight interviews, where we talk to people working for patient safety about their role and what motivates them. Clive talks to us about the important role of digital technologies in tackling the big issues healthcare faces, the need for digital tools and records to be joined-up and interoperable, and how his experiences as a carer have shaped how he sees patient safety.

    About the Author

Clive runs his own digital strategy consultancy, Flashfuture Consulting. He is Director of Strategy for ORCHA, an organisation that reviews health and care apps, Director of Healthcare Strategy for Alphalake AI, a company that develops healthcare AI solutions, and Partnerships and Alliances Manager for Syndi Health, a self-referral platform for mental health.

Clive also works as a Digital Innovation Advisor for NHS South, Central and West and as a mentor for London South Bank University, the University of Oxford and Sheffield Hallam University. He also lectures in digital health at the University of Cumbria and is a non-executive director for two health tech organisations. Clive is a past President of the Royal Society of Medicine’s Digital Health Council.

    Questions & Answers

    Hello Clive. Please can you tell us who you are and what you do?

    I’m Clive Flashman and I’m the Chief Digital Officer for Patient Safety Learning, among many other roles! I have a portfolio career with a wide range of clients, and supporting healthcare organisations with their digital strategy is the common thread that runs through all my work.

    How did you first become interested in patient safety?

    In 2002, I started working as Head of Knowledge and Information Management at the National Patient Safety Agency (NPSA). My job was to build a team of experts to create a new national system for reporting and learning from patient safety incidents. To design an effective system, we had to really understand what was going on with these incidents, and then create a unified taxonomy that all of healthcare could use. It was a challenging project, but from start to finish, we designed and built the National Reporting and Learning System (NRLS) in just nine months! The NRLS has been in use for the last 20 years and during that time it has gathered up to two million reports a year. It’s currently being decommissioned and replaced by a new system, Learn from patient safety events (LFPSE).

Working on the NRLS to understand the kinds of incidents that happen in healthcare sparked a determination in me to contribute to patient safety. If other safety critical industries could break the back of these types of incidents, then surely healthcare could as well? Unfortunately, we’ve been trying to tackle avoidable harm for 20 to 30 years now, and the statistics in NHS England’s current National Patient Safety Strategy are not very different to the numbers we were talking about two decades ago when the NPSA was established. It’s very disappointing that we haven’t managed to significantly reduce the number of avoidable deaths, and I want to do anything I can to improve that situation.

    Before I entered the world of patient safety and digital health, I had worked in finance and in a professional services firm within the emerging domain of Knowledge Management. Then in the early 2000s, I had to take a large chunk of time off work as my wife had spinal surgery and was left with ongoing health issues. This gave me first-hand experience of the challenges patients face when trying to navigate the complexities of the health and care system. 

    Trying to access services designed to give financial support is incredibly challenging for people already facing long-term health issues. The Personal Independence Payment (PIP) process is frankly horrific; as a patient you try and think positively about what you can do in spite of your limitations, but PIP asks people to focus on what they can’t do. It’s really not good for your mental health. On top of that, if you have any form of post-viral fatigue or brain fog, working out the system and gathering evidence is really hard. It’s a problem that more and more people are facing as the reality of Long Covid hits us. There are over a million people with symptoms of Long Covid in this country, and it will be very difficult for some of them to navigate the PIP system with the disparate symptoms they have.

    Which part of your role do you find the most fulfilling?

    The thing that gives me most pleasure is knowing that I have made a tangible difference to patients who are facing health issues. At Patient Safety Learning, it’s about sharing knowledge that helps people, for example, to understand how to better manage their interactions within the health and care system. That’s a key part of our role as an organisation - helping patients see how policy and practice changes will affect their lives.

    With my work at Alphalake AI, it’s about automating processes so that patients have a much easier time when they are trying to interact with the health and care system. At the moment, patients face so many unnecessary hurdles, for example, in getting diagnostic test results from their GP. For so many people, the process involves spending hours on hold for an initial consultation, then more hours on hold to book a blood test, and then more waiting to get the results. Alphalake AI designs systems that can streamline processes like this, by allowing people to book their blood tests online, get their results sent to their phone and letting them know whether a follow-up appointment is needed. That’s much better for patients than waiting on the phone for hours and struggling to get through to their GP surgery. That’s the kind of thing I find rewarding—making a real difference to how people interact with the system.

    What patient safety challenges does the health system face at the moment?

The huge challenge facing the NHS—and healthcare globally—is workforce scarcity. In the UK, we just don’t have enough staff to deal with the number of people who need care. We had shortages before Covid-19 and then many clinical staff resigned during the pandemic. That means we now have 30,000 new vacancies on top of the 100,000 we had before the pandemic. This staff shortage puts patients at risk—for example, if you don’t have the right ratio of nurses to patients, the consequences for patient care could be very serious.

    The elective backlog is another major problem. At the moment, people are waiting months after referral, if not years, for treatment. It’s going to take at least five years to get through the backlog, and probably longer. We currently have 6.5 million patients in the backlog, but those are just the ones we know about. It’s likely that we’re facing an iceberg situation, where we are only seeing a small part of the problem, as many people didn’t go to the doctor during the pandemic. The backlog could in reality be 10 million or more, and it’s going to take a long time to get through those waiting lists.

    I was talking recently to a mental health practitioner who told me the average wait to be seen by mental health services is between six and 36 months, if you are not at urgent risk. This situation is inherently dangerous; if you don’t see someone for three years, your condition will change, you could deteriorate, and unless you went back to your GP, no one would know. No one is keeping track of patients in this ‘referral limbo’ to see how they’re doing. Some of the solutions I’m working on are around enabling clinicians to keep track of patients during this time. They might need to be moved up the list, and AI tools can reprioritise the elective backlog based on urgency of need rather than length of time in the queue. At the moment we aren’t monitoring that at all, and part of the problem is that we aren’t collecting the real time data that will allow AI tools to help us make those decisions.

    Other major challenges the NHS faces are fragmentation of information and lack of interoperability. As a patient, you might see a wide range of healthcare professionals in different settings over a period of time. You may also be using health apps and online services. Potentially, each of those care providers use different systems to store your data, meaning your health record is fragmented. That means the clinicians who interact with you don’t have a complete picture of you as a patient, and may therefore make inappropriate clinical decisions. Overcoming these issues would make a huge difference to patient safety in the UK.

    What do you think the next few years hold for patient safety?

    I think we’ll see an increasing use of digital technology in the health system. If we use digital tech in the right places, it will help us to partially mitigate issues such as lack of staff. But the NHS will need to take a multichannel approach, as some people will never be comfortable with using digital technology. We will need a hybrid approach that offers patients choice in how to interact with the healthcare system.

    In the next few years I think we will also see an increasing reliance on AI to augment what clinicians can do. For example, if AI can offer healthcare professionals better ways to predict outcomes, it will help them make better decisions. But there are ethical and practical issues that we need to overcome before we can use AI in this way, particularly around clinical and legal responsibility. 

    For example, if a health app uses AI to make recommendations to a patient, who is responsible if something goes wrong? Is it the responsibility of the NHS trust that commissioned the app, the app developer, the regulators, or the patient? I think some of the answers will end up coming from case law, but before that happens, we need to have serious discussions about how AI should be regulated. At the moment there isn’t enough clarity for AI innovators, who need to understand how to develop their product to be as safe as possible.

    If you could change one thing in the healthcare system right now to improve patient safety, what would it be?

I would specify that no application, software or system can be implemented unless it is fully interoperable. We must make sure these systems can be connected as easily as possible; as long as they are not, there’s a patient safety incident waiting to happen.

    Are there things that you do outside of your role which have made you think differently about patient safety?

    Being a parent has definitely made me see patient safety differently! When you are responsible for another human life, you start to think about how services can be created and delivered for people who may not understand them in the way that an adult would.

    Being a carer for someone with chronic health conditions has given me a very specific focus on the way healthcare is delivered in this country. I see the various ways in which central and local government deliver support, and where it fails to offer what people need. We experience the safety implications of policy decisions first hand. For example, when the government ‘opened up’ the country in summer 2020, shielding suddenly finished and people who were still more vulnerable to Covid-19 were just left to get on with it without support. Keeping themselves safe became much more difficult than it could have been if the government had continued to offer support.

    Long Covid is another major issue that the government has not taken seriously. The message for the past year has been, “The virus is more mild, it’s OK to go out and get it.” But we don’t know enough about the implications of these milder strains on Long Covid. I can’t dispute that there was a need to restart the economy, but the government did nothing to mitigate the potential of people developing Long Covid, and I think that was a mistake.

    Tell us one thing about yourself that might surprise us!

I’m a huge fan of archaeology and geology, and when I was younger I used to go fossil hunting. When I was about 10, I went on a dig in Kent—it had rained a lot before we arrived and we hadn’t realised that there was some really deep mud nearby. I stepped into the mud and sank up to my shoulders quite quickly. It was terrifying; I couldn’t even get my hand out! The people I was with had to very quickly get a Land Rover and carefully winch me out without strangling me. I certainly can’t watch movies that feature quicksand now!
