Module 1 – Systems Thinking in Learning from Events
by Professor Paul Bowie
Being familiar with the basic principles of ‘systems thinking’ is key to learning from safety events, particularly if our understanding and learning around safety is to be meaningful and effective.
Aim
To raise awareness of basic systems thinking concepts and practices that you can apply to your everyday work, including understanding complex risk and safety issues and other clinical and organisational problems.
Introduction
It is worth noting at the outset that ‘systems thinking’ and the ‘systems approach’ to learning from safety events are the predominant approaches to understanding why things go wrong in all modern high-risk industries. However, understanding and application of this approach are limited in healthcare settings.
What is ‘systems thinking’?
In the context of safety in highly complex work systems like healthcare, systems thinking is a philosophy that can be summarised as follows:
1. Behaviour and safety are impacted by the decisions and actions of everyone in the system, not just those of frontline care workers.
In healthcare, this means that decisions and actions made by elected politicians, executive leaders, managers, educators and clinicians can play a role in patient safety incidents. Patient safety is the shared responsibility of everybody working within the care system.
2. Patient safety incidents are caused by multiple, interacting, contributing factors, not just a single bad decision or action by a clinician or someone else.
For example, a flawed decision made by a surgeon that led to an incident will likely have various upstream contributory factors, related to issues such as:
- the clinical condition of the patient
- training issues
- task complexity
- clinical guidance
- equipment
- time factors and resources
- policy targets.
In other words, there is no “root cause” of an incident in complex systems, and ‘human error’ should never be seen as the cause of an incident. Rather, we need to search for the reasons why that error occurred. Interactions and relationships between contributory factors are as important to take into account as the factors themselves.
3. If learning is to be meaningful and improvement effective, countermeasures need to focus on systemic change rather than on the individuals involved.
Improvements to reduce the risks of incidents recurring should generally focus on policies, guidance and infrastructure rather than on punishment, warnings, reminders or retraining. In some circumstances, refresher training may be necessary. However, we need to recognise that it is very difficult to change individual behaviour, especially if the design of the care system does not support the change. It is not enough simply to change work policies or rules and expect behaviour to change; we need to examine the factors that may impact on the execution of those policies and rules, such as staffing, management, resource or equipment availability, and local culture – “the way we do things around here”.
4. The systems approach is “up and out”, not “down and in”.
When learning from a safety incident, the unit of analysis is the care system as a whole, not solely the clinicians and others working in it. This involves going “up and out”, not “down and in”. The latter approach often leads to a focus on people’s behaviours, the hunt for a root cause, and a fixation on human error as the primary cause of incidents. As well as looking at the different system factors that may have contributed to an incident, we need to seek the views and experiences of the care team, along with relevant data from complaints or previous similar incidents.
This holistic view gives you a “window on the care system” and is more meaningful in terms of learning about the deficiencies in the system and how these can be redesigned.
5. Systems thinking does not seek to assign blame.
Systems thinking aims to identify how factors across the care system combine to contribute to a safety incident. It is impossible to blame and learn at the same time.
It is important to recognise that in healthcare the concepts and practice of systems thinking are not well established, despite being strongly promoted in the safety science world as both necessary and beneficial, given the complex nature of healthcare work. However, slow progress is being made.
Human factors and the systems approach
The discipline of human factors is a well-established science concerned with understanding and improving the ‘fit’ between people and their working environment to ensure a safer, more productive and efficient workplace. It is also known as ergonomics, with both terms being used interchangeably within the profession, as shown in the quote below from the International Ergonomics Association.
“Ergonomics (or human factors) is the scientific discipline concerned with understanding the interactions among humans and other elements of a system, and the profession that applies theory, principles, data, and methods to design in order to optimise human well-being and overall system performance”
A more accessible human factors definition for healthcare might be:
- “designing to fit people”
or
- “making it easy to do the right thing in a reliable and safe manner”
A systems approach is fundamental to the human factors discipline, a fact that is not well understood in healthcare. Adopting a systems approach to learning from events is therefore the same as taking a human factors approach.
Distinguishing features of the human factors approach
Below are three distinguishing features of human factors as a scientific discipline:
- The holistic systems approach to understanding and resolving problems.
- The goal of designing care systems, processes, equipment, job tasks and so on to take account of human characteristics, needs and capabilities.
- The focus on jointly improving both system performance (issues such as safety, effectiveness or productivity) and human wellbeing (issues such as health and safety, enjoyment and satisfaction).
Some research and development work has focused on the areas shown in Diagram 1, such as assessing safety culture, designing medical devices and minimising work-related musculoskeletal injuries for surgeons.
Diagram 1 – Focus is on improving all aspects of human work
Therefore, the potential benefits that the discipline can bring to a wide-range of problematic issues in healthcare, including learning from events, are apparent.
Why do we need systems thinking?
The patient safety agenda really took off in 1999 with the publication of a landmark report, “To Err Is Human”, by the US-based Institute of Medicine. The report was based on analysis of multiple studies by a variety of organisations and concluded that between 44,000 and 98,000 people die in US hospitals each year as a result of preventable medical errors.
Today, it is estimated that around 10% of patients admitted to hospitals will be unintentionally harmed during their interaction with care services with around half of these cases judged to be preventable. The figures will differ for different hospital specialties.
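To put these figures in concrete terms: in a hypothetical hospital with 50,000 admissions a year, they would imply roughly 5,000 patients unintentionally harmed, of whom around 2,500 cases might be judged preventable (the hospital size here is purely illustrative).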
However, in recent times it has become clear that progress in understanding and improving patient safety has been limited. A significant reason is that our approach to patient safety is seen by some as simplistic and out-of-date; in particular, the lack of understanding of systems thinking and systems approaches in healthcare, compared with other high-risk industries, is a major contributory factor.
The problem with “medical error”
Arguably, one of the reasons that our approach to learning about patient safety can be viewed as simplistic and out-of-date is our fixation with “medical error”.
For decades, the media have presented medical error, pilot error or human error as the main root cause of incidents. From a modern safety perspective, however, this is misguided and does not offer a full explanation, as it fails to recognise the other system-wide factors that are highly likely to have contributed. If you arrive at “error” as the cause of a safety incident, this is the beginning of the learning process, not the end point: it tells you that the design of the care system is sub-optimal, not that the individual concerned is.
Diagram 2 summarises arguments from eminent scholars who have pointed out the limitations of using ‘error’ as an explanation and a cause. The key point is that people make errors all the time, and most are inconsequential in their impact. Professor James Reason, a prominent expert in the area, explains in his 12 Principles of Error Management that error is actually necessary to human beings as one of our main methods of learning.
Diagram 2 – the problem with medical error
Therefore, perhaps the best strategy is to treat the safety event itself as the unit of analysis, rather than becoming fixated on the notion of error, which ultimately turns the spotlight on individuals and is unhelpful.
Systems thinking and root causes
We often read in the media and healthcare journals about the search for a ‘root cause’ or causes when something goes wrong. However, in complex care systems modern thinking strongly argues that there is no root cause of a patient safety incident or other organisational event. Things go wrong because of multiple, interacting, contributing factors from across the system, rather than as a result of the actions or behaviours of a single person.
There is a tendency in healthcare to fail to understand and treat our care systems as highly complex. Instead, we revert to simplistic thinking and methods that limit our ability to properly understand and improve our care systems.
Incident models – why things go wrong in complex care systems
As healthcare has evolved clinically and grown exponentially in complexity, the reasons why things go wrong have also altered. Human factors-based approaches have largely kept pace with this rising complexity but, arguably, have not been fully translated into routine “learning from events” policies or processes in healthcare.
Theories of accident or incident analysis have developed in three stages over the decades (a simple illustrative sketch contrasting the three models follows the diagrams below):
1. Sequential theory and methods: based on the simplistic assumption that incidents are caused by a combination of circumstances that interact in sequence, in a logical, linear manner. Incidents can therefore be prevented by removing one or more of the causes in the sequence. Heinrich’s Domino Model is an example. The model is over a century old and in keeping with its time of origin, yet its psychological appeal, owing to its simple logic, means it frequently influences thinking about safety events even today.
Diagram 3 – adapted from Heinrich’s domino model of accident causation
2. Epidemiological theory and methods: based on the assumption that incidents are caused in a linear fashion by the combination of unsafe acts by frontline workers (eg, violations, forgetfulness) and unseen latent hazards (eg, management decision-making, poorly designed technology or working environments) which reside within the system. Incidents can be prevented by strengthening system barriers. Reason’s “Swiss Cheese Model” is an example.
Diagram 4 – adapted from Reason’s Swiss cheese model of accident causation
3. Sociotechnical systemic theory and methods: this latest safety science thinking for complex systems assumes that incidents are the result of multiple, non-linear interactions. To adequately understand how and why incidents occur, we need to comprehend these complex interactions and interrelationships across the care system to inform preventative measures. An example is the “Systems Engineering Initiative for Patient Safety” (SEIPS) model, which we will discuss further in modules 2 and 3 of this series.
Diagram 5 – adapted from the Chartered Institute of Ergonomics and Human Factors, SEIPS model
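For readers who find a concrete illustration useful, the minimal sketch below (in Python) caricatures the logic of each of the three models in a few lines of code. All probabilities, factor names (eg, “understaffing”, “printer_down”) and the interaction amplifier are invented purely for illustration; real systemic analysis, for example with SEIPS, is qualitative and far richer than this toy comparison.

```python
# Illustrative sketch only: toy caricatures of the three incident models.
# All numbers and factor names are invented and not drawn from real data.
import random

random.seed(42)

# 1. Sequential (domino) view: an incident occurs only if every link in a
#    fixed causal chain occurs; removing any one "domino" breaks the chain.
def domino_incident(chain: list) -> bool:
    return all(chain)

# 2. Epidemiological (Swiss cheese) view: independent defensive layers each
#    have "holes" (a failure probability); an incident occurs only when the
#    holes in every layer line up at the same time.
def swiss_cheese_incident(hole_probabilities: list) -> bool:
    return all(random.random() < p for p in hole_probabilities)

# 3. Sociotechnical (systemic) view: contributory factors interact, so risk
#    is not a simple product of independent failures. Here, co-occurring
#    factors amplify each other -- a crude stand-in for non-linear
#    interaction across the work system.
def systemic_risk(active_factors: set, interactions: dict) -> float:
    risk = 0.01 * len(active_factors)          # small baseline per factor
    for pair, amplifier in interactions.items():
        if set(pair) <= active_factors:        # both factors present
            risk *= amplifier
    return min(risk, 1.0)

# Domino: removing one cause prevents the incident entirely.
print(domino_incident([True, True, False]))    # False

# Swiss cheese: four layers, each failing 10% of the time, rarely line up.
trials = 100_000
hits = sum(swiss_cheese_incident([0.1] * 4) for _ in range(trials))
print(f"~{hits / trials:.4f} (analytically 0.1 ** 4 = 0.0001)")

# Systemic: understaffing plus a broken label printer (hypothetical factors)
# raises risk far more than either factor alone would suggest.
interactions = {("understaffing", "printer_down"): 5.0}
print(systemic_risk({"understaffing"}, interactions))                  # 0.01
print(systemic_risk({"understaffing", "printer_down"}, interactions))  # 0.10
```

The point of the sketch is structural: in the first two models, risk can be removed by breaking a chain or strengthening independent barriers, whereas in the systemic view it is the interaction between factors, not any single factor, that drives the risk.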
Attributes of complex healthcare systems
Instead of focusing simplistically on error, linear thinking and root causes, we need to appreciate the complexity of our care systems by acknowledging their attributes and exploring the many different issues that routinely contribute to things going wrong, or that cause everyday hassles and irritations.
Unlike other high risk industries, the design of healthcare systems and processes has largely evolved and emerged over time, rather than being purposely designed. This impacts on system complexity as change can happen rapidly and sometimes in an unpredictable manner. For example:
- Decisions are often made with imprecise information.
- Actions vary dependent on conditions to ensure successful and safe outcomes.
- Systems conditions are dynamic and can change rapidly – they are not always predictable.
- Demand may increase, eg, due to holidays or influx of emergency cases.
- Capacity can change, eg, due to staff absence or resource cuts.
- The wishes and health of patients can change, influencing our understanding and our clinical management.
- There is limited purposeful design: work systems emerge and evolve over time.
In what ways is healthcare complex?
Healthcare is often complex and differs in this regard from many high-risk industries. However, it is not always treated as such, and simplistic assumptions may be made about how clinical work should be undertaken. A 2016 study of blood sampling incidents, described below, illustrates the failure to recognise the complexity of clinical care systems.
Blood sampling incidents
Over the years, quality improvement and clinical audit attempts have tended to use linear-type solutions that require nurses or doctors to remember to perform specific tasks in an orderly fashion to minimise the risks of safety incidents; compliance with this bundle of tasks can then be measured. This approach has failed to adequately understand or take account of the interacting and dynamic nature of the clinical work, and of the ongoing situations that impact on the task. As a result, blood sampling incidents still recur as a patient safety problem.
When we study the design of the care system, multiple unresolved issues impact on blood sampling performance and there is a complexity of interactions and tasks at play.
Examples include:
- Some clinicians do not perform blood sampling because they prefer not to do it or do not think they are skilled at it.
- The printer for patient labels is out of ink or labels, does not work properly, or is situated in a different ward.
- There may be dexterity issues around accurately writing on the sample tube, or some pens may not work well with the type of label used.
All of these issues are minor everyday work hassles and irritations, but they can and do affect the performance of clinicians undertaking this task, something that is not necessarily considered when undertaking clinical audit or quality improvement activity related to this issue.
Build a safety culture
Importantly, as part of the systems approach, the safety of healthcare requires that organisations build and maintain a safety culture. The idea of safety culture is important as it has been shown to be a key predictor of safety performance in high risk industries. It is the difference between a safe organisation and an accident waiting to happen. Thinking and talking about our safety culture is essential for us to understand what we do well, and where we need to improve.
Shorrock S and Bowie P introduce a practical and meaningful approach to understanding safety culture in “Human Factors in Paramedic Practice”, Chapter 11 “Safety Culture: Theory and Practice” (2020).
Safety culture can be described as:
- Values, or what is important.
- Behaviours, or the way we do things around here.
- Beliefs, or how we think things work.
Avoiding the blame game
When seeking to learn from incidents, it is futile to enter into the “blame game”. There is often a deep psychological need to assign personal blame, whether to ourselves or to others. However, in complex care systems this is misplaced: people are often set up to fail by sub-optimal system designs.
Openness and transparency in highlighting safety problems, committing to learn and avoiding the blame game are a vital part of systems thinking and analysis, and a key element in demonstrating professionalism and team working. We should strive to properly understand care systems and implement meaningful improvements that minimise healthcare risks and reduce the risk of incident recurrence to as low as reasonably practicable.
In summary
When considering safety issues in healthcare, the systems thinking perspective can be summarised as follows:
- Multiple, interacting factors from across the complex care system combine to contribute to why things go wrong.
- Safety is a shared responsibility of everyone because the web of contributory factors is created by the decisions and actions of people at various levels of the system, from politicians down to sharp-end care staff.
- We focus on the system as the unit of analysis when something goes wrong; we do not focus on individuals and seek to blame. We look “up and out” across the system, not “down and in”, as the latter shines the spotlight on people.
- When reflecting on safety incidents, we should focus on:
- the trade-offs that people had to make, for example between being ultra-safe and having to be efficient;
- the demands they faced and the resources they had available;
- the context and situation they faced;
- and why decisions and actions made sense to them at the time, otherwise they wouldn’t have made them.
PMP hopes that this first module in the Systems Thinking in Learning from Events webinar series has provided you with some high level but helpful insights and information.
This topic is explored further in the second module of the series, where we will discuss the ‘systems approach’ by learning more about the “Systems Engineering Initiative for Patient Safety” (SEIPS) model. In the third and final module of the series, we will discuss how systems thinking principles can be embedded in traditional “Mortality and Morbidity” review meetings.
Information correct at time of publication July 2023
This document does not constitute legal or medical advice and should not be construed as rules or establishing a standard of care. We recommend that you seek independent legal and/or professional advice in relation to your legal or medical obligations or rights. Premium Medical Protection Limited is the owner of this material and its contents are protected by copyright law © 2023. All such rights are reserved.