Errors as clues in the search for safety measures: Measuring safety part 2


Norman MacLeod

    Summary

In a three-part series of blogs for the hub, Norman MacLeod explores how systems behave and how the actions of humans and organisations increase risk.

    In part 1 of this blog series, Norman suggested that measuring safety is problematic because the inherent variability in any system is largely invisible. Unfortunately, what we call safety is largely a function of the risks arising from that variability. In this blog, Norman explores how error might offer a pointer to where we might look. 

    Content

    Safety as risk propagation

It is common in safety management to talk in terms of hazards. We can identify three classes of hazard: substances or objects that could cause loss or harm; engineered situations where humans engage in activity involving known hazards but under controlled conditions; and acts by individuals that inadvertently expose the operation to a hazard (we might call these ‘errors’). Controls are put in place to contain hazards, but controls are designed by humans and are themselves fallible.

Healthcare is an example of the second class, an engineered hazardous situation: things are done to patients that would be illegal if inflicted upon a healthy person. Procedures act as controls in these situations, but there is always a tension between work-as-imagined (WAI) and work-as-done (WAD). WAI describes the least risky solution to a problem that will work in most circumstances (or, at least, those envisaged by the procedure designers), whereas WAD reflects the inherent flexibility needed in the real world.

A study of maritime accidents[1] found that collisions have occurred between ships that were actively trying to follow the ‘rules of the road’. Procedures contain affordance spaces, or lacunae, that must be filled by actors applying expertise. Procedures, or rules, form a hierarchy. At the top there are rules about goals: ‘first, do no harm’. Then there are IF-THEN rules that aid decision-making: IF <symptom> THEN <condition>. The lowest order of rules are task prescriptions: step 1, step 2, step n. As we ascend the hierarchy, actors need more extensive training to cope with the lacunae that invariably exist.
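To make the idea of lacunae concrete, here is a minimal sketch in Python. The rule table and the symptom/condition pairs are invented purely for illustration, not taken from any clinical guideline; the point is that any finite set of IF-THEN rules only covers the situations its designers anticipated, and everything outside that set must be filled by expert judgement.

```python
# Hypothetical illustration only: these symptom/condition pairs are invented
# and are not drawn from any real guideline or procedure.
DECISION_RULES = {
    "chest pain radiating to the left arm": "suspect cardiac event",
    "sudden one-sided weakness": "suspect stroke",
}

def apply_rules(symptom: str) -> str:
    """Return the condition an IF-THEN rule prescribes, or flag a lacuna."""
    if symptom in DECISION_RULES:
        return DECISION_RULES[symptom]
    # The rule set is silent here: this gap (lacuna) can only be closed by
    # the expertise and judgement of the person doing the work.
    return "no rule applies - expert judgement required"

print(apply_rules("sudden one-sided weakness"))    # covered by a rule
print(apply_rules("vague abdominal discomfort"))   # falls into a lacuna
```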

Many airlines use a process called the Line Operations Safety Audit (LOSA).[2] Trained observers monitor flight crew under normal flight conditions and log departures from procedures, crew responses and subsequent outcomes. Typically, 95% of errors are inconsequential: error is very much noise in the system. LOSA lets us see what happens when crew attempt to fill in the gaps in procedures. The observer can tag an error as ‘intentional’ (an INC) if certain criteria are met, and reported figures for INCs range from 8.8% to 26.4% of observed errors. However, ‘intentional’ errors are usually attempts to adapt to local circumstances or to solve problems. These departures from prescribed activity reflect system buffering.

In LOSA, the outcome of an error can be categorised as ‘inconsequential’, as triggering an additional error or, if the observer feels that safety has been jeopardised, as resulting in an ‘Undesired Aircraft State’ (UAS). In one study I looked at UASs arising from INCs versus non-intentional errors. INCs were twice as likely to result in a UAS. I then looked at who committed the error. For INCs, captains accounted for 91.66% of UASs, compared with 40.6% when the error was non-intentional. The data suggest that agents actively choose courses of action that contravene procedures to maintain the flow of work, but those decisions increase risk. Captains are over-represented in the data because they are the primary decision-makers in the team. Ironically, compliance with procedures is often the starting point for any safety investigation. However, rather than police ‘compliance’, organisations should probably find ways to capture variability and render it as knowledge.
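For readers who like to see the arithmetic, the sketch below shows the kind of tallying such an analysis involves. It is written in Python, and the observation records and resulting figures are entirely made up; it is not the LOSA dataset referred to above, just an illustration of how the two proportions could be computed from observation logs.

```python
# Illustrative sketch only: the records and figures below are invented and do
# not reproduce the LOSA data discussed in the text.
from dataclasses import dataclass

@dataclass
class ErrorEvent:
    intentional: bool   # tagged as intentional non-compliance (INC)?
    committed_by: str   # "captain" or "first_officer"
    outcome: str        # "inconsequential", "additional_error" or "UAS"

def uas_rate(events: list[ErrorEvent], intentional: bool) -> float:
    """Share of errors of the given type that ended in an Undesired Aircraft State."""
    subset = [e for e in events if e.intentional == intentional]
    return sum(e.outcome == "UAS" for e in subset) / len(subset) if subset else 0.0

def captain_share_of_uas(events: list[ErrorEvent], intentional: bool) -> float:
    """Of the UASs arising from this error type, the fraction committed by captains."""
    uas = [e for e in events if e.intentional == intentional and e.outcome == "UAS"]
    return sum(e.committed_by == "captain" for e in uas) / len(uas) if uas else 0.0

# A handful of made-up observations to show the calculation working.
observations = [
    ErrorEvent(True, "captain", "UAS"),
    ErrorEvent(True, "captain", "inconsequential"),
    ErrorEvent(False, "first_officer", "UAS"),
    ErrorEvent(False, "first_officer", "inconsequential"),
    ErrorEvent(False, "captain", "inconsequential"),
    ErrorEvent(False, "captain", "additional_error"),
]

print(f"UAS rate, intentional errors:     {uas_rate(observations, True):.0%}")
print(f"UAS rate, non-intentional errors: {uas_rate(observations, False):.0%}")
print(f"Captain share of INC-driven UASs: {captain_share_of_uas(observations, True):.0%}")
```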

    What error does 

To view error simply as failure, however, is to miss the fact that errors change the work process in ways that need to be addressed if safety is to be maintained. This can happen in one of three ways. First, errors reduce performance margins. Even slight departures from the optimum aircraft configuration mean that, should a subsequent event occur, the crew have less flexibility to respond. In the flight data shown in the previous blog, an aircraft operating in the outer bands of the distribution is migrating towards the margins of the safe space. Something as commonplace as a change in wind speed or direction could result in a critical outcome.

Second, error transfers risk when my action affects others. For example, passengers have been killed when aircraft have flown into turbulence. If a pilot delays or fails to turn on the seat belt sign in time, the cabin crew and passengers are exposed to risk because they will not have taken steps to protect themselves (such as sitting down or fastening seat belts).

    Sometimes, and in contravention of procedures, pilots start the ‘after landing’ checklist early to save time. This usually results in pausing the checklist while air traffic control issues directions to the terminal building. LOSA shows that crew then often forget to finish the checklist and aircraft park with the weather radar still turned on, exposing the ground handlers to a radiation hazard.

    Finally, separation reduction describes the condition where aircraft are placed in closer proximity to hazardous objects (other aircraft, the ground) than was intended. Again, should something happen, the crew will have less time to react. Error, then, can reveal how the risk profile is shaped by the deliberate actions of crew.

    What goes on here?

    This examination of normal work suggests two candidate domains for measures of safety. First, what is the organisation’s understanding of the utility of its control structures (policies and procedures, codes of conduct)? How well-written and comprehensive are the structures? Where are the contradictions and ambiguities that flow from multiple stakeholders in the process of oversight?

Second, what is the skills mix of those required to work within the system, recognising the need to cope with the variability inherent in the real world? Does the organisation have a competence model for the different functions in the system? What are the risks associated with substituting staff (bank staff, staff on loan)?

    Conclusion

In this post I have looked at how workplace variability shapes risk. I have suggested two key aspects of the structure of an organisation – control and competence – that could be candidates for measuring ‘safety’. In my final blog I want to explore how organisations actively design unsafety into their operations.

    References

1. Belcher P. A Sociological Interpretation of the COLREGS. Journal of Navigation 2002; 55(2): 213-224.
2. Klinect JR. Line Operations Safety Audit: A Cockpit Observation Methodology for Monitoring Commercial Airline Safety Performance. Unpublished PhD thesis, University of Texas, 2005.

    Read part one and part three of Norman's blog series.

    About the Author

Norman MacLeod served for 20 years in the RAF, involved in the design and delivery of training in a variety of situations. He stumbled across 'CRM' in 1988 while investigating leadership in military transport aircraft crews. From 1994, he worked around the world as a consultant in the field of CRM in commercial aviation, latterly employed as the Human Factors Manager for a blue chip airline in Hong Kong. Now semi-retired, he is one of the Patient Safety Partners at James Cook Hospital in Middlesbrough.
