Historically, medical errors were viewed mainly as a problem of bad healthcare providers. That framing led to blaming and shaming individual providers, and it did little to improve patient safety.

For decades, researchers in medicine have studied the causes of medical errors.

Today, we understand that medical errors usually arise from human errors occurring within a flawed system or culture.

We also know that some amount of human error is inevitable. The key to improving patient safety is therefore twofold: design systems that reduce the likelihood of human error in the first place, and build checks that make it less likely for an error to reach and harm a patient when one does occur.

This requires identifying the root causes of medical error and the weaknesses or vulnerabilities in systems. It also requires creating and maintaining a culture of safety: a culture in which people are committed to constantly improving safety, and in which errors and near misses are reported so that similar failures can be prevented in the future.

Source support

The patient-safety sources collected for this piece support the same basic frame: medical error should not be treated only as a question of individual blame. The more useful question is often what system allowed preventable danger to reach the patient.

Medical error is not mainly a bad-apples problem

"Yet we now understand that the problem of medical errors is not fundamentally one of “bad apples” (though there are some), but rather one of competent providers working in a chaotic system that has not prioritized safety. ... Most errors are made by good but fallible people working in dysfunctional systems, which means that making care safer depends on buttressing the system to prevent or catch the inevitable lapses of mortals. This logical approach is common in other complex, high-tech industries, but it has been woefully ignored in medicine. Instead, we have steadfastly clung to the view that an error is a moral failure by an individual, a posture that has left patients feeling angry and ready to blame, and providers feeling guilty and demoralized."

Robert Wachter's Understanding Patient Safety frames modern patient-safety thinking as a move away from treating errors as proof of bad character. The broader passage explains that competent providers can still harm patients when they work inside chaotic systems that have not prioritized safety.

Citation: Robert M. Wachter, Understanding Patient Safety, 3rd ed. (New York: McGraw-Hill Education, 2018), Preface; quoting Robert M. Wachter and Kaveh G. Shojania, Internal Bleeding: The Truth Behind America's Terrifying Epidemic of Medical Mistakes (2004).

Safety improvement requires more than one solution

"A comprehensive approach to improving patient safety is needed. This approach cannot focus on a single solution since there is no "magic bullet" that will solve this problem, and indeed, no single recommendation in this report should be considered as the answer. Rather, large, complex problems require thoughtful, multifaceted responses. The combined goal of the recommendations is for the external environment to create sufficient pressure to make errors costly to health care organizations and providers, so they are compelled to take action to improve safety."

The Institute of Medicine's To Err Is Human describes patient safety as a systems problem that requires pressure, incentives, learning, and institutional change. That supports a view of malpractice accountability as one possible part of a broader safety environment, not as a substitute for good medicine.

Citation: Institute of Medicine, Committee on Quality of Health Care in America, To Err Is Human: Building a Safer Health System, ed. Linda T. Kohn, Janet M. Corrigan, and Molla S. Donaldson (Washington, DC: National Academy Press, 2000), Executive Summary.

Leadership is responsible for safety culture

"In any health care organization, leadership’s first priority is to be accountable for effective care while protecting the safety of patients, employees, and visitors. Competent and thoughtful leaders* contribute to improvements in safety and organizational culture. They understand that systemic flaws exist and each step in a care process has the potential for failure simply because humans make mistakes. James Reason compared these flaws – latent hazards and weaknesses – to holes in Swiss cheese. These latent hazards and weaknesses must be identified and solutions found to prevent errors from reaching the patient and causing harm. Examples of latent hazards and weaknesses include poor design, lack of supervision, and manufacturing or maintenance defects."

"Inadequate leadership can contribute to adverse events in various ways, including but not limited to these examples: • Insufficient support of patient safety event reporting, • Lack of feedback or response to staff and others who report safety vulnerabilities, • Allowing intimidation of staff who report events, • Refusing to consistently prioritize and implement safety recommendations, • Not addressing staff burnout."

"In essence, a leader who is committed to prioritizing and making patient safety visible through every day actions is a critical part of creating a true culture of safety. Leaders must commit to creating and maintaining a culture of safety; this commitment is just as critical as the time and resources devoted to revenue and financial stability, system integration, and productivity. Maintaining a safety culture requires leaders to consistently and visibly support and promote everyday safety measures. Culture is a product of what is done on a consistent daily basis. Hospital team members measure an organization’s commitment to culture by what leaders do, rather than what they say should be done."

The Joint Commission's safety-culture guidance identifies leadership failures that can contribute to adverse events, including weak support for safety reporting, lack of feedback to staff who report vulnerabilities, intimidation of staff who report events, failure to prioritize safety recommendations, and failure to address burnout.

Citation: The Joint Commission, "The Essential Role of Leadership in Developing a Safety Culture," Sentinel Event Alert, Issue 57 (March 1, 2017; revised June 18, 2021).

Honesty after harm is part of medical ethics

"Patients have a right to know their past and present medical status, including conditions that may have resulted from medical error. ... Concern regarding legal liability should not affect the physician’s honesty with the patient."

"[I]ndividual physicians who have been involved in a (possible) medical error should: (a) Disclose the occurrence of the error, explain the nature of the (potential) harm, and provide the information needed to enable the patient to make informed decisions about future medical care. (b) Acknowledge the error and express professional and compassionate concern toward patients who have been harmed in the context of health care. (c) Explain efforts that are being taken to prevent similar occurrences in the future."

The American Medical Association's patient-safety ethics guidance recognizes that patients have a right to know about conditions that may have resulted from medical error. It calls for disclosure, acknowledgment, compassionate concern, continuity of care, and explanation of efforts being taken to prevent similar occurrences in the future.

Citation: American Medical Association, Code of Medical Ethics (Chicago: American Medical Association, 2017), ch. 8, Opinion 8.6, "Promoting Patient Safety."

Why this matters in malpractice review

This does not mean every bad outcome is malpractice. It does not mean every mistake supports a lawsuit. It does not mean individual choices never matter. It means that responsible malpractice review should ask a better question than who can be blamed most quickly.

The better question is how the danger reached the patient. Was there a clear handoff? Was a critical test result communicated? Was there a reliable escalation path? Were nurses, physicians, technicians, and administrators working in a culture where safety concerns could be raised and acted on? Were known risks treated as operational problems to be fixed, or as background noise until someone was seriously hurt?

Good clinicians need good systems. When those systems fail, malpractice analysis should not stop with the last person in the room. It should ask whether the institution made preventable harm more likely than it should have been.