
FEATURE

Applying Human Factors Engineering to Increase Healthcare Safety


By Patrice D. Tremoulet, PhD


Modern healthcare relies on a variety of physical and digital assets and tools. Sophisticated medical devices help in diagnosing and treating disease. Advanced materials prevent the spread of bacteria. Diverse physical environments are used to deliver care and promote healing.

As valuable as these powerful resources are, they also increase the complexity of healthcare. Meanwhile, the healthcare industry hasn’t adequately addressed the impact of technology in risk assessments. Moreover, it has built inefficient clinical workflows that drive clinicians to create unsafe workarounds, while also establishing intricate processes that invite human error. Such human error can lead to patient harm, which can then give rise to medical professional liability (MPL) claims.

Human factors engineers specialize in understanding how complex work systems create opportunities for human error. Their mission is to design systems that support people in performing the work they need to accomplish. They view human error as an opportunity to find system weaknesses, with the goal of improving a work system's safety, efficiency, and effectiveness. Thus, human factors engineering, the subject of this article, can be a powerful tool in mitigating human error and subsequent MPL claims.

A Systems Perspective

Safety science research demonstrates that the most effective, safe, and sustainable system designs consider all components of a work system. Human factors engineers view healthcare as a complex system with five internal components surrounded by an external environment. Internal components include:

  • People
  • Tasks and processes
  • Tools and technology
  • Physical environment
  • Supporting organizational structure

The external environment introduces challenges and constraints that cannot be controlled but must be accounted for.

Problems in one part of a healthcare system, such as an unintuitive user interface or a poorly organized stock room, not only cause errors themselves but may also interact to create hazards in other parts of the system. If not addressed, problems in any part of the system can harm patients or clinicians.

When you view healthcare as a system, it becomes clear that problems are often more complicated than they first appear. But that means there are more ways to solve those problems than may be apparent at first.

Every healthcare worker is an extremely important part of how well the healthcare system works, and each person has the power to affect safety and quality in their facility.

The following case study illustrates how the interplay of people and internal and external system components can inadvertently cause patient harm.

Case Study in System Safety: Enabling Early Sepsis Treatment

The ECRI human factors engineering (HFE) team helped a large US health system tackle a persistent challenge in the delivery of care: identifying and treating sepsis.

Sepsis cases are closely tied to MPL claims, as delays in diagnosis or inadequate treatment can lead to severe patient harm or death, increasing the risk of legal action against healthcare providers.

The health system's solution for reducing sepsis mortality comprised three elements: a sepsis best practice alert, an electronic medical record tab called the Sepsis Navigator, and a new sepsis workflow. The best practice alert was designed to notify staff when a patient's vitals met specific criteria. The new workflow stipulated that, upon being presented with the alert, front-line nurses should immediately consult with a physician to determine whether sepsis treatment should begin. If the physician determined during the consult that sepsis was a concern, nurses were expected to click a link in the alert to open the Sepsis Navigator, which facilitated initiation of sepsis treatment: from a single screen, nurses could start a sepsis timer and request that fluids and antibiotics be ordered.
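The article does not specify which vitals criteria triggered the best practice alert. Many sepsis screens are built on SIRS (systemic inflammatory response syndrome) thresholds, so as a minimal sketch, assuming SIRS-style criteria and hypothetical field names, an alert trigger might look like this:

```python
from dataclasses import dataclass


@dataclass
class Vitals:
    """Hypothetical snapshot of the values a sepsis screen might read."""
    temp_c: float      # body temperature, degrees Celsius
    heart_rate: int    # beats per minute
    resp_rate: int     # breaths per minute
    wbc_k: float       # white blood cell count, thousands per microliter


def sirs_criteria_met(v: Vitals) -> int:
    """Count how many SIRS criteria the vitals meet."""
    count = 0
    if v.temp_c > 38.0 or v.temp_c < 36.0:
        count += 1
    if v.heart_rate > 90:
        count += 1
    if v.resp_rate > 20:
        count += 1
    if v.wbc_k > 12.0 or v.wbc_k < 4.0:
        count += 1
    return count


def should_fire_alert(v: Vitals) -> bool:
    # A common convention fires the screen when two or more criteria are met;
    # the actual thresholds in the case study are not disclosed.
    return sirs_criteria_met(v) >= 2
```

A real implementation would live inside the electronic medical record and weigh many more inputs; the point here is only that the alert is a simple threshold check, which is exactly why its false alarm rate depends so heavily on how the thresholds are tuned.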

When electronic medical record data revealed that nurses were not using the sepsis navigator as expected, the informatics team assumed that usability issues with the sepsis navigator and/or with the sepsis best practice alert were to blame. A closer look by ECRI’s human factors engineering team revealed that the sepsis workflow, which the best practice alert and sepsis navigator were designed to support, was also problematic.

Experienced nurses in emergency departments could tell when a diagnosis other than sepsis was more likely to be causing the vitals that triggered the best practice alert, but the new sepsis workflow required them to consult with a physician right away. These nurses found the best practice alert’s hard stop very disruptive. Similarly, nurses working in inpatient units found that it could be difficult to quickly reach the on-call physician when the best practice alert went off at night. Nurses in both settings learned that they could continue with their interrupted tasks by snoozing or dismissing the best practice alert. As a result, nurses frequently dismissed the best practice alert, and many did so even when it successfully induced them to consult with a physician.



In short, the new sepsis workflow was inefficient because it called for nurses to wait until they consulted with a physician before responding to the best practice alert. Front-line nurses perceived this to be due to a problem with the best practice alert, since it disrupted their work. Moreover, by dismissing the disruptive best practice alert, nurses lost the link inside it that could be used to open the sepsis navigator, so even when the best practice alert did enable earlier sepsis detection, the sepsis navigator was typically not being used to facilitate treatment.

What else was going on? For one thing, nurses were very busy, as nurses usually are, so they were trying to work quickly. That busyness was both an organizational and an external-environment problem: a nursing shortage left the hospitals short-staffed. In addition, some units had long hallways, and the larger emergency departments had many rows of treatment areas; in those settings, the physical environment made it difficult for nurses to quickly consult with a physician. Excluding nurses from the busiest emergency departments during solution development was an organizational problem, as was leadership's decision to let the informatics team deliver a tool with a very high false alarm rate in pursuit of the worthy goal of detecting every case of sepsis. Finally, there was a significant interaction between the best practice alert's high false alarm rate and its hard stop; together, these two issues greatly increased stress and workload for front-line nurses.
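The link between "detect every case" tuning and a high false alarm rate follows directly from the base rate: when true sepsis is rare among monitored patients, even a reasonably specific screen produces mostly false alarms. A back-of-the-envelope calculation makes this concrete (the prevalence, sensitivity, and specificity values below are illustrative assumptions, not figures from the case study):

```python
# Illustrative assumptions only -- not figures from the ECRI case study.
prevalence = 0.02    # 2% of monitored patients actually have sepsis
sensitivity = 0.99   # alert tuned to catch nearly every case
specificity = 0.85   # loose thresholds let many non-sepsis patients trigger it

# Fraction of all patients who generate true vs. false alerts
true_alerts = prevalence * sensitivity            # 0.0198
false_alerts = (1 - prevalence) * (1 - specificity)  # 0.147

# Fraction of fired alerts that are false alarms
false_alarm_rate = false_alerts / (true_alerts + false_alerts)
print(f"{false_alarm_rate:.0%} of alerts are false alarms")
```

Under these assumed numbers, roughly 88% of fired alerts are false alarms, even though the screen misses almost no true cases. Each of those false alarms still carried the hard stop, which is why the two design choices compounded each other.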

That is a long list of issues, and it is not exhaustive. Had the ECRI human factors engineering team focused solely on improving the usability of the sepsis best practice alert and the Sepsis Navigator, front-line staff could have burned out or become conditioned to automatically dismiss the alert, particularly when juggling other patient needs. Burnout and automatic dismissal would have reduced the alert's effectiveness, hampering the long-term goal of reducing sepsis mortality. Taking a systems approach allowed the team to provide actionable recommendations for modifying the health system's solution which, when implemented, enabled staff to maintain a significantly lower rate of sepsis mortality across several years.

Designing with Humans at the Center

The sepsis mortality reduction effort illustrates the complexities that shape healthcare system safety. By examining the unexpectedly low usage of the Sepsis Navigator from a system perspective, the team identified many issues that needed to be addressed to effectively improve patient safety.

Indeed, poor system design is healthcare’s most critical safety challenge. Healthcare takes place in complex sociotechnical environments that stress clinicians, leading to burnout and medical errors. Taking a system perspective enables us to:

  • Gain an accurate understanding of how things are working
  • Identify contributing factors during data collection
  • Craft an expanded solution set
  • Achieve effective and sustainable outcomes

For an organization to achieve total system safety, meaning that people can work efficiently and effectively while easily detecting and preventing potential harm, the system must be designed to support those safety and reliability goals.


Patrice D. Tremoulet, PhD, is Director, Human Factors Engineering at ECRI.