This is part of our series of Patient Safety Spotlight interviews, where we talk to people working for patient safety about their role and what motivates them. Mark talks to us about how he came to work in healthcare, the vital role of safety scientists and human factors specialists in improving patient safety, and the challenges involved in integrating new technologies into the health system.
About the Author
Mark Sujan is a Chartered Ergonomist and Managing Director of Human Factors Everywhere. Mark is a Trustee of the Chartered Institute of Ergonomics and Human Factors, and leads the Institute’s special interest group on Digital Health & Artificial Intelligence. Mark also works part-time in the NHS as senior educator at the Healthcare Safety Investigation Branch. He is a visiting academic at the University of Oxford and honorary Associate Professor at the University of Warwick.
Questions & Answers
Hi Mark. Please can you tell us who you are and what you do?
My name is Mark Sujan and I’m a Chartered Ergonomist. I currently run my own company called Human Factors Everywhere—we engage in applied research around human factors and safety across different industries including aviation, petrochemicals and healthcare. I also work part time for the Healthcare Safety Investigation Branch (HSIB) providing investigation science education to HSIB’s investigators and NHS staff with an interest in safety investigation. I am a trustee for the Chartered Institute of Ergonomics and Human Factors (CIEHF), the professional body for the discipline, and lead their Artificial Intelligence and Digital Healthcare Special Interest Group. I also have a number of roles in academic institutions that help me keep in touch with the human factors community and research environment.
What do the terms ‘human factors’ and ‘ergonomics’ mean?
The two terms ‘human factors’ and ‘ergonomics’ mean largely the same thing in the UK and are often used interchangeably. They both refer to a scientific discipline and the professional practice that then applies these scientific insights to achieve what is referred to as the ‘dual aims’: to optimise system performance and to improve human wellbeing.
As ergonomists, we try to apply a systems perspective to think about how people approach work and get work done. This means that rather than thinking about specific issues in isolation, we look at the wider picture and consider how people interact with tools, tasks, their organisational setting and their physical environment, to get work done.
How did you first become interested in patient safety?
I would like to give you an exciting story, but I became involved in patient safety through pure coincidence! I started out working in railway safety then moved to the aviation and petrochemical industries, where human factors and ergonomics and safety are recognised disciplines.
Then when the National Patient Safety Agency (NPSA) was established in 2002, they asked me to do a project for them on patient identification. After that work for the NPSA, I was offered more opportunities to work in healthcare as interest in patient safety began to grow and there were very few people working in the field who had a background in human factors.
There was an opening at Warwick Medical School for a lecturer in patient safety; the university was keen to have someone with outside expertise and was looking for human factors expertise rather than a clinician with an interest in safety. I got the job and spent about 15 years at Warwick, where I developed the Master's module on “A systems approach to patient safety”. It was only the second of its kind in the UK and it proved very engaging to teach—the material was novel, dynamic and had an immediate human interest.
However, there were definite challenges in working at a medical school for someone who isn’t a doctor! One of the biggest challenges was that human factors and ergonomics weren’t properly recognised by some, who thought that what I was doing didn’t count as science. It was a bit of a culture shock, but I did have many helpful colleagues as well. I also learned a lot and can see that the rigour of medical research is something we can learn from in the field of human factors. I often challenge safety scientists in other industries on whether they have solid evidence to back up their claims. Working in healthcare has also shown me that human factors approaches and tools need to be adapted to different settings, and a strong case has to be made as to why those tools will be beneficial for use in particular systems.
Healthcare is also an extremely challenging environment in which to conduct observations and research. When I was working in aviation, pilots and air traffic controllers could just take time out to talk to you because it was in their contract. There were dedicated labs, specialist staff and time set aside to talk about system safety. Healthcare is very different! When you walk into an emergency department, you need to grasp very quickly what’s going on. Pressurised doctors and nurses have very little time to explain to you what’s happening.
What part of your role do you find the most fulfilling?
When I was working in aviation, all my work was quite abstract and theoretical. It always felt very distant, whereas in healthcare and patient safety, my work is affecting people in a more direct way. Although I am not a clinician, I am in a position—one step removed—to make a difference. My work affects clinicians and the way they work, and therefore affects the way patients experience care and the outcomes they receive. That’s a very rewarding feeling.
Some feel that not much has changed in terms of how human factors and ergonomics are viewed in healthcare, because, for example, there are still only five human factors specialists employed by the NHS. But if I look back 20 years to when I started working with the NHS, I can see big improvements in general awareness, the number of people involved, the quality and depth of conversation you can have with people and the amount of research and literature around the topic. The healthcare sector is becoming more receptive to human factors and is finding its role in driving improvement. There are now many excellent publications in this field, which are also read by specialists from other industries.
What patient safety challenges do you see at the moment?
I have a particular interest in artificial intelligence (AI), which is a big area of development in healthcare. Making sure it is used safely and effectively will be a challenge, as the current landscape feels a bit like the wild west. There are AI technologies popping up everywhere and many of these are being developed by companies with little experience of healthcare or medical devices. Equally, healthcare as a sector has little experience of these new technologies, so the skill sets and frameworks for their use are still developing.
From a human factors and ergonomics perspective, I am disconcerted by a seemingly narrow focus when it comes to AI. Many of the conversations I come across seem to overlook the decades of knowledge that we have on the use of highly automated systems in other industries. I am hearing discussions about AI technologies where the substitution approach—or myth—is being used. People are asking, “Can the AI do what is currently being done by a clinician?” But to get the most from AI, we need to be asking questions from a systems perspective: “Will disruptive technologies (innovations that significantly alter the way a system operates) improve important system outcomes, as well as human wellbeing?” These discussions are not yet happening in the mainstream.
A further problem is that most of the evidence we have is retrospective. To test an AI technology, most researchers procure data and test retrospectively whether it performs as well as a human would have done. There are very few prospective studies, and those that we have are quite sobering. One example is the use of AI in ambulance call-handling. There is a concern that call handlers don't recognise cardiac arrest calls quickly enough, as every minute of delay in getting help greatly reduces a person’s chances of survival. So AI tools have been developed to support call handlers in recognising cardiac arrest. Retrospective evaluations of these tools have shown that they recognise cardiac arrest calls faster and more reliably than human call handlers. While that is good news, a recent prospective study in one call centre was unable to replicate these findings; the use of AI did not improve outcomes. This demonstrates a clear need to understand how to better design and validate AI tools, and how to integrate them into a clinical system so that they work for everyone. It’s not just about substituting people with AI.
Taking a systems approach, the question should maybe be, “Why do call handlers find it hard to recognise cardiac arrest calls?” I talked to call handlers and they identified a range of issues, such as the caller struggling to acknowledge the seriousness of the situation a loved one is facing, being unable to give precise answers, slurred speech and poor mobile phone reception. Once we have established the causes of the issue, the question then becomes, “Can we support call handlers to manage that better?” For example, there are AI technologies that can help clarify slurred speech and poor reception, so a more nuanced hybrid AI-human system might result in a greater level of improvement.
What do you think the next few years hold for patient safety?
There will be a lot of new technologies, particularly focused on freeing up healthcare workers so they don’t waste time having to work around poorly designed IT systems. There are a few initiatives underway to help companies design and adapt technologies: NICE has developed the Evidence Standards Framework (ESF), a risk-based approach that helps tech developers understand the evidence they need to produce. NHS Digital also has clinical safety standards, but these need to be more widely known within the health tech community.
There will also be new ways of delivering services—many will be moved into communities—and this presents both a challenge and an opportunity. Underpinning all of this is a rapidly changing socio-economic climate. We know there are pressures on finance, productivity and capacity. In the NHS we have problems with the diagnostic workforce, coupled with equipment shortages. The implications for the workforce are very serious—we will see burnout and adverse health effects in staff, which will in turn affect patient safety. These pressures need to be recognised and dealt with in a very dynamic and agile fashion. There’s an interesting safety model by Jens Rasmussen [1] that we could benefit from in the NHS.
If you could change one thing in the healthcare system, what would it be?
One area where I hope we will see some change is that human factors and ergonomics and safety science become even more accepted as professions and disciplines. The aviation and nuclear industries employ whole departments of human factors specialists. Embedding that level of expertise in the NHS would be a major step forward.
We are currently pursuing a model where we train up enthusiastic healthcare staff to become more knowledgeable in human factors and safety science. While this is good and leads to a more receptive environment, ultimately organisations need safety science specialists embedded as accepted members of the team. They can then be accessible to other members of staff and can drive forward a systems perspective.
There are already activities underway and stakeholders are having conversations about this issue. The CIEHF has developed a healthcare learning pathway and HSIB is running courses specifically about patient safety investigations based on a systems perspective. We are also trying to build capability within the NHS by encouraging people to become technical specialists. It’s a two-way process; if there are more people aware of human factors, we have more advocates to make the business case for recruiting human factors specialists.
You can read the CIEHF White Paper 'Human Factors and Ergonomics in Healthcare AI' to find out more about the role of human factors in health technologies.
1. Rasmussen J. Risk management in a dynamic society: a modelling problem. Safety Science. 1997;27:183–213.