
Politics Are Key Factor in Policy Progress

As we approach the culmination of the biennial event known as “the most important election of our lifetime,” it is an opportune moment to assess what this election has in store for the medical professional liability community.

Mega Verdicts: Managing Jurors' Changing Attitudes

Members and partners are invited to join our April 2 webinar (2:00 p.m. ET): Verdicts that vastly exceed the case’s evidence or assessed value have spiked. Appellate law expert Derek Stikeleather provides some fascinating insights into these phenomena and offers advice on how defense counsel can better protect their healthcare clients.

MPL Association Announces Cooperative Agreement with APCIA

The MPL Association is pleased to announce a new cooperative agreement between the Association and the American Property Casualty Insurance Association to enhance both entities’ government relations efforts. Read more!

 

FEATURE

Top 10 Patient Safety Concerns



The newest report from ECRI and the Institute for Safe Medication Practices (ISMP), “Top 10 Patient Safety Concerns 2025,” highlights the 10 most pressing patient safety challenges facing the healthcare industry in 2025.

Leveraging ECRI and ISMP’s data-driven research and expert insight, the report discusses critical areas that healthcare leaders should consider as opportunities to minimize preventable harm. Some are emerging issues, while others are persistent yet unresolved. However, all represent areas where impactful change is possible.

This list serves as a strategic guide for implementing proactive, system-wide solutions aimed at reducing risk and improving patient outcomes across the healthcare spectrum.

The 2025 Top 10: A New Era of Patient Safety

As healthcare advances at an unprecedented pace, the landscape of patient safety is continually evolving. The year 2025 marks a pivotal moment in this ongoing journey, as we are now a quarter of a century removed from the Institute of Medicine’s landmark report, To Err Is Human.

We are currently facing challenges that seemed futuristic and improbable in 1999—the integration of artificial intelligence in a clinical setting, the growing threat of cyberattacks on health data, and the viral spread of medical misinformation on social media platforms. Our society has also become more conscious of widening health disparities, and a new movement is giving voice to those who have been “medically gaslit.”

And yet, we are still grappling with challenges that have plagued healthcare teams for years, such as missed diagnoses and healthcare-associated infections.

This new era of patient safety requires heightened vigilance, new and adaptive strategies, and a commitment to fostering a culture of safety with health-literate practices that ensure the well-being of patients in an increasingly digital, complex, and interconnected world.

Not all topics on the list will apply to all healthcare facilities and, of course, not all possible patient safety concerns made the Top 10; rather, experts determined that the topics listed here should receive greater attention and consideration in 2025.

Further, the omission of a topic that was included in a previous year’s list should not be interpreted to mean that the topic no longer deserves attention. Many of those concerns still persist, and healthcare organizations should continue taking action to minimize them.

Method for Selecting the List

The list reflects ECRI and ISMP’s broad patient safety and risk management expertise. The interdisciplinary staff includes experts in medicine, nursing, pharmacy, patient safety, quality, risk management, clinical evidence assessment, health technology, and many other fields.

This article highlights the three areas of greatest concern for medical professional liability. Please note that these are not the first three selected by ECRI; the complete list and full report are available on the ECRI website.

A Total Systems Approach to Safety

ECRI’s Total Systems Approach to Safety moves organizations away from reactive, disconnected interventions by codesigning and implementing a holistic, proactive, and sustainable safety system that achieves better results.

Total Systems Approach to Safety aligns leadership, governance, and cultural priorities with workforce safety and wellness, along with patient and family engagement. By redesigning safety system elements, healthcare providers can deliver care more reliably and resiliently. Rooted in advanced safety science, clinically informed human factors engineering, just culture, and health equity, Total Systems Approach to Safety aims to prevent error, reduce harm, improve staff well-being, and enhance overall care quality.

Prioritizing Strategies, Taking Action, and Measuring Improvement

No organization can tackle all 10 items immediately. Organizations must calculate each item’s risk score and conduct a gap analysis to evaluate their current practices against the report’s recommendations.
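The prioritization step described above can be sketched in code. The following is a hypothetical illustration, not a method from the ECRI report: each concern gets a classic risk-matrix score (likelihood × severity), weighted by a gap-analysis result, and the organization works from the top of the ranked list. All field names, scales, and example values are assumptions for illustration only.

```python
# Hypothetical sketch of risk scoring and prioritization; the weights,
# scales, and example data are illustrative, not from the ECRI report.
from dataclasses import dataclass


@dataclass
class Concern:
    name: str
    likelihood: int   # 1 (rare) .. 5 (frequent)
    severity: int     # 1 (minor) .. 5 (catastrophic)
    gap: float        # 0.0 (fully compliant) .. 1.0 (no controls in place)

    @property
    def risk_score(self) -> float:
        # Classic risk-matrix product, weighted by the gap analysis result
        return self.likelihood * self.severity * self.gap


def prioritize(concerns: list[Concern]) -> list[Concern]:
    """Return concerns ordered from highest to lowest risk score."""
    return sorted(concerns, key=lambda c: c.risk_score, reverse=True)


if __name__ == "__main__":
    items = [
        Concern("AI governance", likelihood=3, severity=4, gap=0.8),
        Concern("Cybersecurity breach", likelihood=4, severity=5, gap=0.5),
        Concern("Diagnostic error", likelihood=4, severity=5, gap=0.6),
    ]
    for c in prioritize(items):
        print(f"{c.name}: {c.risk_score:.1f}")
```

With these made-up inputs, diagnostic error ranks first because a high inherent risk is combined with a large gap between current practice and the report’s recommendations.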

To address each concern in this year’s list, readers can consider the action recommendations, which are framed around the four foundational drivers of safety—culture, leadership, and governance; patient and family engagement; workforce safety and wellness; and learning system. ECRI and ISMP developed these evidence-based recommendations through analysis of a wide range of data sources, offering strategies to support continuous improvement in healthcare. They also illustrate how systems can contribute to harm—or drive patient safety.

Healthcare leaders must be intentional about implementing solutions in their complex, unique organizations. Superficial attempts will not be enough to make meaningful changes in improving patient safety. Before implementing changes, leaders must establish systems and processes for measuring and analyzing improvements, and they should be ready to modify or discontinue specific strategies based on the results analysis.

Safety concerns can have clinical, cultural, efficiency, and financial impacts on an organization. Measuring the results of changes should be multimodal—with structural-, process-, and outcomes-related metrics. Sources of data may include event reports; medication-safety data; survey results, including results from culture of safety, employee-satisfaction, and patient-experience surveys; morbidity and mortality data; length-of-stay statistics; focus group discussions; and direct observation data. In addition, organizations should segment data to better understand inequities that may create disparities in both patient and workforce outcomes.
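The segmentation step in the paragraph above can be made concrete with a small sketch. This is a hypothetical illustration, not from the report: the same outcome metric (here, an adverse-event rate) is recomputed per patient group so that a disparity hidden in the aggregate number becomes visible. The group labels and event data are invented for the example.

```python
# Illustrative sketch of segmenting outcome data by patient group; the
# groups and event data here are hypothetical, not drawn from the report.
from collections import defaultdict


def event_rates_by_group(records):
    """records: iterable of (group, had_event) pairs -> {group: event rate}."""
    events = defaultdict(int)
    totals = defaultdict(int)
    for group, had_event in records:
        totals[group] += 1
        events[group] += int(had_event)
    return {g: events[g] / totals[g] for g in totals}


if __name__ == "__main__":
    data = [("A", True), ("A", False), ("A", False), ("A", False),
            ("B", True), ("B", True), ("B", False), ("B", False)]
    overall = sum(int(e) for _, e in data) / len(data)
    print(f"overall: {overall:.2f}")        # aggregate rate looks moderate
    for group, rate in sorted(event_rates_by_group(data).items()):
        print(f"group {group}: {rate:.2f}")  # per-group rates reveal a gap
```

In this toy data the aggregate rate is 0.38, but group B’s rate (0.50) is double group A’s (0.25)—exactly the kind of inequity that only segmented data can surface.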

Patient Safety Concern #1: Insufficient Governance of Artificial Intelligence in Healthcare

Although artificial intelligence (AI) has been present in healthcare for years, AI is being incorporated into an ever-growing array of healthcare applications, including imaging applications, clinical decision-making support tools, medical notes generation, and scheduling tools.

AI applications have many potential benefits, including improved clinical outcomes, reduced costs, and reduced healthcare worker burnout. However, common issues with AI technology—such as bias, transparency, and privacy and security concerns—can have unique and dangerous consequences in healthcare.

AI models are only as good as the algorithms they use and the data on which they are trained. When AI models are based on bad data, they can increase the chances of an adverse event.

Medical errors generated by AI could compromise patient safety and lead to misdiagnoses and inappropriate treatment decisions, which can cause injury or death. Staff may have difficulty determining when events are attributable to AI, making such errors harder to track.

Despite its widespread application and potential risks, few healthcare organizations have policies governing the use of AI. A 2023 survey of 31 hospital executives found that only 16% reported that their organization had a system-wide governance policy for AI usage and data access.

Reports have found that certain AI models demonstrate bias related to race, socioeconomic status, gender, or sexual orientation, which could exacerbate healthcare disparities. Despite this, one study found that academic medical centers inconsistently consider factors like inequality, racism, or bias in AI governance policies.

Both patients and healthcare workers have expressed concerns over the adoption of AI in healthcare.

In a December 2022 survey of 11,004 US adults, 60% said they would feel uncomfortable with their provider relying on AI for their medical care, and 75% were concerned that providers would adopt AI too fast.

A 2024 survey of more than 2,300 registered nurses found that 60% disagreed with the statement, “I trust my employer will implement AI with patient safety as the first priority.”

Failure to develop system-wide governance to evaluate, oversee, and monitor new and current AI applications may increase healthcare organizations’ liability risks. However, it can be challenging to establish policies that can adapt to rapidly changing AI technology.

Culture, Leadership, and Governance Action Recommendations:

  • Establish policies, define processes, and assign responsibilities for the governance, implementation, oversight, and monitoring of AI solutions.
  • Form a multidisciplinary committee to evaluate new technologies that incorporate AI and determine risks; include representatives from leadership, clinical services, human factors engineering, clinical engineering, patient safety, and risk management.
  • Ensure that organizational policies on the use of AI in medical technologies align with federal, state, and local laws and regulations. Monitor development in these areas and update policies as needed.
  • Train staff on the organization’s AI usage policy, including which AI applications are approved and prohibited for job-related activities, and whom to ask should confusion arise.
  • Regularly assess safety and clinical outcomes related to practices impacted by AI and any effects the AI solution may have on healthcare disparities.

Patient and Family Engagement Action Recommendations:

  • Disclose the use of AI to patients and obtain their informed consent beforehand if the organization uses generative AI (e.g., notetaking, guiding clinical decision-making) or if it uploads patient clinical data to an AI system (e.g., to help with diagnosis).
  • Solicit feedback from patients if the organization uses AI in patient-facing applications to determine whether the system is easy to use and meets their specific needs.
  • Engage patient and family advisory councils in the design of appropriate messaging and communication aimed at educating patients in the use of generative AI.

Workforce Safety and Wellness Action Recommendations:

  • Ensure that the organization performs human-factors-based assessments of clinical workflows when new technologies that incorporate AI are implemented to determine potential impacts.
  • Regularly assess the user experience of staff related to AI applications.
  • Take staff concerns related to the operation and use of AI applications seriously and take steps to investigate and address them.

Learning System Action Recommendations:

  • Implement a robust reporting system for AI-related medical incidents, errors, and adverse events, and/or review the organization’s current incident reporting system to ensure that it can capture AI-associated concerns.
  • Emphasize to staff that AI is a tool, and that they should defer to their own clinical judgment and seek second opinions when questioning clinical decisions or diagnoses aided by AI.
  • Educate staff on how to identify incidents, errors, or adverse events that can be attributed to AI functionality, including those related to privacy, accuracy, misdiagnosis, and potential bias. Encourage staff to report such errors as they would any other anomaly.

Patient Safety Concern #2: Medical Error and Delay in Care Resulting from Cybersecurity Breaches

Cybersecurity has become one of the most pervasive and persistent concerns in healthcare. In a survey of healthcare cybersecurity professionals, 88% reported that their organizations experienced cyberattacks in the past year, with an average of 40 attacks per organization.



In 2023, the US healthcare industry experienced 725 large security breaches, affecting more than 133 million medical records. Such breaches have increased nearly every year since 2009 and have continued through 2024, causing disruptions in patient care, delaying diagnostic testing results, and creating supply chain issues.

Cyberattacks can be costly, and healthcare remains the costliest industry for breaches. In the first half of 2024, the average cost for healthcare was $9.77 million per breach in the United States, and $4.88 million per breach globally. Cybersecurity breaches can cause widespread disruptions and have broad and devastating effects that impact patients, providers, healthcare organizations, and the surrounding community.

Culture, Leadership, and Governance Action Recommendations:

  • Devote adequate time and resources to cybersecurity concerns and build cybersecurity into organizational policies.
  • Use resources such as those highlighted in the National Institute of Standards and Technology (NIST) Cybersecurity Framework 2.0 to develop your organization’s governance over cybersecurity matters. Ensure that these policies and procedures include clearly defined roles and responsibilities.
  • Monitor compliance with cybersecurity policies and procedures.
  • Take an enterprise risk management approach to help the organization achieve a comprehensive understanding of cybersecurity risks, using guidance from NIST, the Joint Commission, and the US Department of Health and Human Services (HHS) for both small and medium/large healthcare organizations.
  • Include cyberattack response in the organization’s emergency preparedness plan and collaborate with local partners and healthcare organizations to ensure that there are established strategies for combatting cyberattacks. Refer to resources from FEMA.

Patient and Family Engagement Action Recommendations

  • Emphasize the importance of cybersecurity precautions with patients and families, especially those accessing the healthcare network or using healthcare-related apps (e.g., telehealth, patient portals). Tips and discussion topics are available from HHS.

Workforce Safety and Wellness Action Recommendations

  • Ensure that staff training on cybersecurity is effective, meets regulatory requirements, and reaches all appropriate audiences. Emphasize that such precautions protect not only the organization, but also patients and staff.
  • In the event of a breach, ensure staff have adequate resources to deliver patient care while systems are down. This may include enacting certain emergency response protocols (e.g., manual charting, surge protocols).
  • Ensure rotation and rest for staff when responding to a cyberattack, as long hours and extreme stress can result in poor decision-making. Consider offering mental health resources and counseling to affected employees.

Learning System Action Recommendations

  • Regularly assess cybersecurity risks and the organization’s adherence to cybersecurity best practices. This includes reviewing records to track access to data and detect security incidents; periodically evaluating the effectiveness of existing security measures; and regularly evaluating threats and vulnerabilities.
  • Practice responding to cybersecurity incidents to test the effectiveness of your organization’s incident response plan. Conduct drills using tabletop exercises, such as the Cybersecurity & Infrastructure Security Agency’s Tabletop Exercise Packages. Involve representatives from, at a minimum, information technology, clinical engineering, and risk management.

Patient Safety Concern #3: Diagnostic Error: The Big Three—Cancers, Major Vascular Events, and Infections

A diagnostic error, or misdiagnosis, can mean the difference between life and death for a patient, potentially causing delayed or improper treatment that can lead to permanent injury or loss of life. Diagnostic error represents a major public health problem and is recognized as a major source of preventable harm in US healthcare.

Each year in the United States, approximately 12 million adults experience diagnostic errors—about half of which may cause serious patient harm—and an estimated 795,000 Americans die or are permanently disabled due to misdiagnosis of dangerous diseases.

Research suggests that the most serious errors may be attributed to a surprisingly small number of conditions that are categorized as the “Big Three.” Misdiagnosis of cancers (37.8%), vascular events (22.8%), and infections (13.5%) account for the majority of high-severity harm, based on closed malpractice claims.

The most common conditions in each category are lung cancer, stroke, and sepsis, respectively. Other commonly misdiagnosed conditions include breast, colorectal, prostate, and skin cancers; heart attack; aortic aneurysm and dissection; meningitis and encephalitis; pneumonia; and endocarditis.

In primary care, missed cancer diagnosis represented 46% of claims, with the majority (76%) involving “errors in clinical judgment, such as a failure or delay in ordering a diagnostic test (51%) or failure or delay in obtaining a consult or referral (37%).” However, in the emergency department, “major vascular events (42%) and infections (23%) substantially outnumber cancers (8%).”

Diagnostic errors result from a wide array of system factors, ranging from cognitive and clinical assessment errors (e.g., inadequate knowledge and experience, lack of competency, poor critical thinking and clinical decision-making skills, problems in data gathering, failure to synthesize and communicate information) to breakdowns in the diagnostic process (e.g., failure to order, collect, and process diagnostic and laboratory tests; inadequate medical history and physical examination; failure to manage referrals and follow up with the patient).

Healthcare organizations and clinicians should focus their resources and interventions on improving the specific process and system vulnerabilities contributing to these diagnostic errors, and diseases associated with high diagnostic error rates should become targets for developing, implementing, and scaling systematic solutions.

Culture, Leadership, and Governance Action Recommendations

  • Assemble a multidisciplinary team accountable to senior leadership to promote diagnostic safety and quality.
  • Enhance access to diagnostic testing and services, including improved access to specialists and assistance with scheduling diagnostic testing before patients leave the office.
  • Ensure that providers adhere to current cancer screening and diagnostic guidelines from the US Preventive Services Task Force, the American Cancer Society, and other professional organizations.
  • Utilize clinically informed human factors engineering to analyze complex work systems that are at higher risk for diagnostic errors (e.g., laboratory testing).
  • Review diagnostic uncertainty or discrepancies at all handoffs and transitions in care, especially for complex cases.
  • Identify patients who face systemic social and health inequities who may be at higher risk for misdiagnosis.
  • Implement closed-loop processes for diagnostic test tracking, follow-up, and notification of results.
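The closed-loop tracking idea in the last bullet above can be sketched as a small state machine: every ordered test moves through explicit states, and anything that has not reached acknowledgment surfaces on a follow-up worklist. The states, class names, and test IDs below are hypothetical, chosen only to illustrate the pattern.

```python
# Minimal sketch of closed-loop diagnostic test tracking; the states and
# API here are hypothetical illustrations, not a real system's design.
from enum import Enum


class TestState(Enum):
    ORDERED = 1
    RESULTED = 2
    ACKNOWLEDGED = 3   # clinician reviewed the result and notified the patient


class TestTracker:
    def __init__(self):
        self._tests = {}  # test_id -> TestState

    def order(self, test_id):
        self._tests[test_id] = TestState.ORDERED

    def advance(self, test_id, new_state):
        # Enforce the loop's ordering: no state may be skipped
        current = self._tests[test_id]
        if new_state.value != current.value + 1:
            raise ValueError(f"{test_id}: cannot skip from {current} to {new_state}")
        self._tests[test_id] = new_state

    def open_loops(self):
        """Tests not yet acknowledged—the follow-up and notification worklist."""
        return [t for t, s in self._tests.items() if s is not TestState.ACKNOWLEDGED]


if __name__ == "__main__":
    tracker = TestTracker()
    tracker.order("CT-1001")
    tracker.order("LAB-2002")
    tracker.advance("CT-1001", TestState.RESULTED)
    tracker.advance("CT-1001", TestState.ACKNOWLEDGED)
    print(tracker.open_loops())  # the unresulted test still needs follow-up
```

The point of the design is that a test can never silently fall out of the process: until someone explicitly acknowledges the result, the test remains on the open-loop list.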

Patient and Family Engagement Action Recommendations

  • Assess all patients’ health literacy and ensure a full medical history is conducted, including family history of cancer or cardiovascular disease, and document this information in the patient’s medical record.
  • Empower patients and families to partner with providers in their care, prepare for visits, understand their diagnosis, and ask questions.
  • Provide patients with access to patient portals for electronic health records, including visit notes and diagnostic test results, to enhance communication and continuity of care.
  • Institute a communication and resolution program to transparently communicate, apologize, and offer resolution to patients and families when they experience an unanticipated outcome due to a diagnostic error.
  • Foster a safety culture that embraces nonpunitive reporting for laboratory system issues and errors that may be discovered during cancer screening orders and referrals.
  • Institute a peer support program (i.e., care for the caregiver) for clinicians involved in patient harm resulting from a diagnostic error.

Learning System Action Recommendations

  • Conduct diagnostic safety risk assessments, including recognition of common symptoms with broad differential diagnoses that may be susceptible to cognitive biases, certain laboratory and imaging studies, and other tests that are prone to misinterpretation or missed follow-up.
  • Utilize checklists and diagnostic timeouts.
  • Perform root cause analyses on all adverse events that result in significant harm or death related to a diagnostic error.
  • Encourage collaboration and debriefing among clinical staff to question assumptions, verify diagnosis, and solicit feedback from each other.
  • Provide training and education for all clinicians (e.g., nurses, pharmacists, allied health professionals) including the use of simulations to improve decision-making.
  • Implement quality improvement activities that focus on prompt diagnosis of cancers, vascular events, and infections, and measure key performance indicators to assess the impact of improvement efforts.

Access the entire report, Top 10 Patient Safety Concerns 2025, on the ECRI website.
