Safety does not reside in a person, device or department, but emerges from the interactions of components of a system.
Institute of Medicine, To Err Is Human: Building a Safer Health System (1999)
Modern systems approaches to reduce errors and improve efficiency have their roots in the manufacturing quality and process control principles developed by renowned statistician W. Edwards Deming, engineers Joseph M. Juran and Kaoru Ishikawa, and former Secretary of Commerce and quality management champion Malcolm Baldrige, among others.
When a manufacturing process is standardized, it often leads to greater efficiency and fewer mistakes. It’s no wonder that some of these same processes (systems) for finding, analyzing, and preventing manufacturing errors are being applied to healthcare.
An important contributor to medical errors is poor communication between co-workers, departments, and shifts, and even among different organizations and levels of care. Many doctors, nurses, and other healthcare professionals see a particular patient for different aspects of the patient’s care. This makes creating a culture of safety a huge organizational challenge, one that needs to be evaluated constantly and systematically. According to the IOM, most medical errors are the result of systems failures that require analysis on a systems level to understand their cause and to promote corrective action.
Indeed, Garrouste-Orgeas and colleagues (2012) wrote,
Preventive strategies are more likely to be effective if they rely on a system-based approach, in which organizational flaws are remedied, rather than a human-based approach of encouraging people not to make errors.
Root cause analysis (RCA) is a systems approach that asks three questions that provide the framework for information collection: What happened? Why did it happen? And what can be done to prevent it from happening again?
According to the book Internal Bleeding, “RCA attempts to write a second story about the actions that led to error—to look past the obvious... scapegoats and find the other culprits, however deeply they may be embedded in the system” (Wachter & Shojania, 2004).
In 1997 the Joint Commission (then called the Joint Commission on the Accreditation of Healthcare Organizations, or JCAHO) mandated the use of root cause analysis in the investigation of sentinel events or medical errors in accredited hospitals. There are two main categories of error: errors of execution, in which a correct plan is not carried out as intended, and errors of planning, in which the wrong plan is used to achieve an aim.
Root cause analysis is used to identify trends and assess risk when human error is suspected, with the understanding that systemic factors, rather than individual factors, are likely the root cause of the problem. The goal is to avoid a culture of blame. Systematic application of RCA can uncover root causes that link seemingly unrelated accidents, such as a cluster of serious adverse events occurring at shift change. Careful analysis may suggest system changes designed to prevent future incidents (Hughes & Blegen, 2008).
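The kind of trend-spotting described above can be sketched in a few lines of code. The incident times and shift-change hours below are invented for illustration; a real analysis would draw on an incident-reporting system.

```python
from collections import Counter

# Hypothetical adverse-event times (hour of day, 0-23), invented for
# illustration only — not data from any real facility.
incident_hours = [6, 7, 7, 7, 8, 11, 14, 15, 15, 15, 15, 16, 22, 23, 23, 23]

# Assumed shift-change hours at this hypothetical facility.
shift_changes = {7, 15, 23}

counts = Counter(incident_hours)
total = len(incident_hours)

# Count incidents that fall within one hour of a shift change.
near_change = sum(
    n for hour, n in counts.items()
    if any(abs(hour - sc) <= 1 for sc in shift_changes)
)

print(f"{near_change}/{total} incidents within 1 hour of a shift change")
```

A concentration like this would not identify a culprit; it would point the RCA team toward systemic questions about handoffs and communication at shift change.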
When a sentinel event has been identified for analysis, a multidisciplinary team is assembled to direct the investigation. The team members must be trained in the techniques of RCA because the tendency to revert to personal bias is strong. Multiple investigators allow for comparison and corroboration of major findings and increase the validity of final results (Hughes & Blegen, 2008).
Accident analysis is generally broken down into the following steps: data collection, causal factor charting, root cause identification, and recommendation generation and implementation.
At the conclusion of the RCA, the team summarizes the underlying causes and their relative contributions, and begins to identify administrative and systems problems that might be candidates for redesign (Hughes & Blegen, 2008).
Another systems approach to eliminating medical errors is the Plan-Do-Study-Act (PDSA) cycle, rooted in Deming’s work and promoted in healthcare by the Institute for Healthcare Improvement. This strategy has been widely used by the Institute and many other healthcare organizations. One of its unique features is the acknowledgment that change is cyclical in nature and benefits from small, frequent PDSA cycles rather than large, slow ones before changes are made system-wide (IHI, 2011).
The PDSA cycle tests a change by “developing a plan to test the change (Plan), then carrying out the test (Do), observing and learning from the consequences (Study), and determining what modifications should be made to the test (Act)” (IHI, 2011).
Another systems approach to the problem of medical errors is the Team Strategies and Tools to Enhance Performance and Patient Safety (TeamSTEPPS) approach. A key point is that, even though the delivery of care requires teamwork, members of these teams are rarely trained together and they often come from separate disciplines and diverse educational programs (King et al., 2008).
Given the interdisciplinary nature of healthcare and the necessity for cooperation among those who perform it, teamwork is critical to ensure patient safety. Teams make fewer mistakes than individuals, especially when each team member knows his or her responsibilities. Simply conducting training or installing a team structure does not ensure that the team will operate effectively (King et al., 2008).
There are three phases to the TeamSTEPPS approach: a pretraining assessment of organizational readiness; planning, training, and implementation of the program; and sustainment of the resulting changes.
Lean Six Sigma is the combination of two methodologies, Lean and Six Sigma. Lean attempts to eliminate waste within a process, and Six Sigma attempts to reduce variation and defects (AHRQ HIT, 2008). Six Sigma takes its name from the statistical goal of fitting six standard deviations between the process mean and the nearest specification limit, a quality level corresponding to only a few defects per million opportunities.
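The “sigma” of Six Sigma can be made concrete: a defect rate, expressed as defects per million opportunities (DPMO), maps to a sigma level through the normal distribution. A minimal sketch, assuming the convention of adding a 1.5-sigma shift for long-term process drift:

```python
from statistics import NormalDist

def sigma_level(dpmo: float) -> float:
    """Convert defects per million opportunities (DPMO) to a sigma level,
    applying the conventional 1.5-sigma long-term shift."""
    # Fraction of outcomes that are defect-free, then its normal quantile.
    yield_fraction = 1 - dpmo / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + 1.5

# Six Sigma quality is conventionally quoted as 3.4 DPMO.
print(round(sigma_level(3.4), 2))   # → 6.0
```

By this convention, a process producing 3.4 defects per million opportunities operates at six sigma, which is why the defect rate and the name are used interchangeably.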
Central to Lean Six Sigma is the Define, Measure, Analyze, Improve, and Control (DMAIC) lifecycle (Meliones et al., 2008).
Lean Six Sigma is the gold standard manufacturing system for many Fortune 500 companies; however, it is a large, complicated process requiring extensive training to implement.
Nevertheless, Lean Six Sigma can be used to decrease medical errors and improve outcomes. In one example, North Mississippi Medical Center reduced the number of prescription instruction errors in discharge documents by 50% using Lean Six Sigma, according to the American Society for Quality (ASQ, n.d.). Six Sigma programs incorporating some Lean principles have also shown positive results across a range of other healthcare processes and outcomes.
Published March 28, 2012