Just What Does Culture Have to Do With Patient Safety?

A Conversation With David Marx

Barbara L. Olson MS, RN-BC, FISMP


January 11, 2010


For the past 8 months, I've written a weekly Medscape Nurses blog, On Your Meds: Straight Talk about Medication Safety. It's a forum that invites reflection and analysis, a place where frontline nurses, educators, and administrators share experiences and concerns. I've noticed that the question of how to deal with clinicians who make mistakes is something nurses find interesting. And at times, answers to this question divide us.

In the past decade, we've seen the pendulum swing away from blaming individuals and near-certain punitive action and move instead toward system-based inquiries, processes that seek to identify root causes in hopes of building safer systems. But the "no blame" approach hasn't resonated with key stakeholders, including many nurses and nurse leaders, because of the perception that it means individuals are not accountable for poor decisions or bad actions. As organizations move away from the "no blame" approach to error analysis, it's important to define, with some certainty, what we're moving toward.

David Marx, Safety Expert

I thought it would be interesting to explore these issues with David Marx, one of the nation's foremost patient safety leaders and the individual who coined the term "just culture" to describe the relationship between accountability and safety.

Marx knows a lot about how culture influences safety and the roles leaders play in shaping culture. He has a BS in Mechanical Systems Engineering and a Juris Doctor. Beginning his career as a Boeing aircraft design analyst, Marx conducted failure modes and effects analyses and probabilistic risk assessments on Boeing aircraft. In his final years at Boeing, he organized a human factors and safety group that developed a human error investigation process used by air carriers around the world. His firm, Outcome Engineering, now spends 80% of its time helping healthcare providers, organizations, and regulators understand modern risk management methodologies. I first encountered Marx's thoughts about human error and the just treatment of those involved in errors when I was the Safe Medication Management fellow at the Institute for Safe Medication Practices (ISMP), a federally certified patient safety organization.

The Just Culture Algorithm™

Marx defines a series of pathways, collectively known as The Just Culture Algorithm™, to guide and standardize organizational response to errors. He brings an engineer's knowledge of human performance and fallibility to this work, differentiating "simple human error" (the mistakes people can be predicted to make when performing familiar tasks) from behavioral choices that are more volitional in nature. Engineers interested in preventing harm generally work to improve the systems people use to produce high-stakes work. This approach stems from knowledge that humans, no matter how well intended, will not perform perfectly.

To illustrate, let's look at a very serious medication error that has been reported over and over again: a clinician inadvertently administers an oral medication (prepared for administration in a parenteral syringe) via the intravenous (IV) route.[1] In most of these cases, clinicians do not make conscious choices to circumvent processes normally used to administer oral medications nor is there any intent to harm a patient. Rather, in the course of performing a series of routine steps, the individual erroneously connects the syringe containing the oral medication to the patient's IV tubing, delivering the intended medication via the wrong route.

What distinguishes the Just Culture Algorithm™ from other approaches to error investigation and corrective action is that individuals are not punished for simple human errors like these. Simple human errors (termed "slips, trips, and lapses") occur most often when competent people become distracted or experience sensory overload. Preventing another person from making the same error involves re-engineering processes so that barriers, redundancies, and opportunities to discover errors before harm occurs become part of clinicians' daily practice norms.

To prevent wrong-route administration of oral medications, the most robust, error-averse process relies on consistent use of a device constraint: oral syringes -- devices with tips that are not compatible with the Luer-lock connections found on parenteral lines and access ports. Meticulous auxiliary labeling of all lines, tubes, and drains near their ports of entry is also part of effective "wrong-route" error-prevention processes.[1]

At-Risk vs Reckless Behavior

Errors are not always rooted in the predictable slips, trips, and lapses that humans make. Some arise because people knowingly choose to circumvent defined safety practices. These choices either increase the likelihood of error or they remove downstream opportunities to catch errors. Marx places behavioral choices that increase risk into 2 broad categories: "at-risk behavior" or "reckless behavior."

A pharmacist who chooses to dispense an oral medication in a parenteral syringe increases the likelihood that a wrong-route administration error will occur. So does a nurse who chooses not to label the distal ends of a tube or line. In evaluating these choices, though, the Just Culture Algorithm™ considers system-level design and barriers to compliance as well as individual motivation. For example, a pharmacist who chooses to dispense oral medication in a parenteral syringe because he or she believes that nurses need to be "more careful" and shouldn't depend upon a device "crutch" is motivated differently than a pharmacist who uses parenteral syringes because oral syringes are not in stock. Seeing a substantial risk and choosing to ignore it is reckless, whereas deviating under the belief that the right thing is being done is an at-risk choice.

Distinctions like these are important, both for determining a just course of action for the individual involved in the safety violation and for strengthening the safety processes an organization defines and implements. Systems and circumstances promote at-risk behavior, as occurs when a nurse does not confirm 2 patient identifiers before giving a medication to someone who appears to be an obviously known patient. In contrast, a nurse engages in reckless behavior when choosing to ignore an obvious risk, such as signing for medications not given to the patient. The distinction is central to the Just Culture™ model: reckless behavior is both qualitatively and quantitatively different from human error and at-risk behavior.

What makes Marx's models most remarkable, I think, is that organizational response in the aftermath of an error focuses on what drove the adverse event (simple human error, at-risk behavior, or reckless behavior) and not upon the severity of the outcome. This means that a person exhibiting reckless behavior faces punishment, irrespective of whether grievous patient harm occurs. Conversely, an event that leads to substantial harm is not "charged" to the professional whose simple human error reaches a patient undetected. Staying true to his engineering roots, Marx closes the feedback loop in cases like this, holding leaders accountable for managing compliance barriers and, when necessary, redesigning faulty systems so that the likelihood of a simple human error causing grave harm is substantially reduced. In the safe practice example involving use of oral syringes, leaders are accountable for system design (defining the use of oral syringes as part of standard safety measures for dispensing oral liquids) and for ensuring that the equipment (oral syringes) is readily available to frontline clinicians.

But a Just Culture, as defined by Marx, has zero tolerance for reckless behavior, a characteristic that differentiates this philosophy from "no blame" approaches. People are held accountable -- subject to the disciplinary processes used in their organizations -- for patterns of at-risk behavior that jeopardize safety and for individual acts that violate organizational values and substantially endanger others, irrespective of whether harm occurs.

While I find Marx's work to be both authentic and satisfying, I recognize it may challenge beliefs held by seasoned healthcare professionals who, like me, came of age in an era when "no harm, no foul" ruled process improvement efforts. I've partnered with Dave and his team at Outcome Engineering on a few occasions and have seen his culture modeling welcomed by a wide array of patient safety stakeholders, including directors and managers of clinical service lines and departments. The soundness of Marx's logic, coupled with his rich storytelling abilities, seems to give healthcare leaders the transformational ("aha") moments needed to cultivate open reporting cultures while holding workers accountable for their behavioral choices. So I thought that a conversation with Dave might be interesting to others who are striving to balance these values.
