Proactive Patient Safety Initiatives: Just Culture

People make errors. Errors can cause accidents. In healthcare, errors and accidents result in morbidity and adverse outcomes and sometimes in mortality.

One organizational approach has been to seek out errors and identify the responsible individual. Individual punishment follows. This punitive approach does not solve the problem. People function within systems designed by an organization. An individual may be at fault, but frequently the system is also at fault. Punishing people without changing the system only perpetuates the problem rather than solving it.

A patient care system is obligated to collect productive investigative data that can be analyzed and acted upon to improve patient safety. This process is not possible unless members of the organization remain vigilant and mindful and maintain continuous surveillance. Similarly, people within the organization must believe that they are obligated to report errors. However, medical institutions cannot afford a blame-free culture: Some errors do warrant disciplinary action. Finding a balance between the extremes of punishment and blamelessness is the goal of developing a just culture.

A just culture balances the need for an open and honest reporting environment with the goal of a quality learning environment and culture. While the organization has a duty and responsibility to employees (and ultimately to patients), all employees are held responsible for the quality of their choices. Just culture requires a change in focus from errors and outcomes to system design and management of the behavioral choices of all employees.

These examples address an aspect of just culture that goes beyond ensuring that employees feel free to report errors. Highly reliable organizations and industries foster mindfulness in their workers. Weick and Sutcliffe describe mindfulness in terms of 5 components:

  1. A constant concern about the possibility of failure
  2. Deference to expertise regardless of rank or status
  3. Ability to adapt when the unexpected occurs
  4. Ability to concentrate on a task while having a sense of the big picture
  5. Ability to alter and flatten the hierarchy to fit a specific situation

Mindfulness throughout an organization considers, but moves beyond, events and occurrences. Everyone in the organization is continually learning, adjusting, and redesigning systems for safety and managing behavioral choices.


Functional issues must be addressed in a just culture.18,19 While encouraging personnel to report mistakes, identify the potential for error, and even stop work in acute situations, a just culture cannot be a blame-free enterprise. Competency is one issue, but even competent professionals make mistakes. In complex, dynamic, and inherently risky environments, society nonetheless expects a high standard of human performance.

After identifying an adverse event or near miss, the investigation technique becomes paramount (Figure 2). Two approaches have been successfully implemented in hospitals.

Figure 2.

Caregivers’ actions and recommended responses for analyzing risks constructed from outcome engineering principles and analysis of human factors. (From Leonard and Frankel19 with permission.)

Outcome Engineering Algorithm

Marx2 coined the term outcome engineering, an approach applicable to healthcare as well as other industries. His algorithm lists 3 basic duties:

  1. Duty to produce an outcome
  2. Duty to follow a procedural rule
  3. Duty to avoid unjustifiable risk

Consequently, with breach of duty, the mechanism of the breach is categorized into one of the following:

  1. Human error
  2. At-risk behavior (a conscious drift from safe behavior)
  3. Reckless behavior (conscious of conduct and risk)

Reason12 highlights the notion of intent when considering the nature of error. Slips (eg, Freudian slips) lack intention; that is, the actions are not carried out as intended or planned. Lapses are missed actions or omissions, with the perpetrator often conscious of the action and believing that it will not lead to harm.

Mistakes involve error, ie, faulty planning or intention; the individual involved believes the action to be correct. Corrective action and coaching, not punishment, are indicated for improving the system.

At-risk behavior includes both intention and the violation of rules, policies, and procedures and makes a system vulnerable, increasing risk. The individual should be coached to understand the risks resulting from his or her action.

Reckless behavior may be grounds for disciplinary action, and civil or criminal charges may be filed against the individual. Punishment, including termination, may be the appropriate consequence.
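The mapping from breach category to recommended response described above can be sketched as a small lookup; the category keys and response strings below paraphrase the text and are not Marx's exact terminology:

```python
# Illustrative sketch of the outcome-engineering response mapping.
# Category names and response wording are paraphrased from the text,
# not taken from Marx's published algorithm.

RESPONSES = {
    "human_error": "corrective action and coaching; improve the system",
    "at_risk": "coach the individual on the risks resulting from the behavior",
    "reckless": "disciplinary action, possibly termination or civil/criminal charges",
}

def recommended_response(behavior: str) -> str:
    """Return the recommended response for a categorized breach of duty."""
    try:
        return RESPONSES[behavior]
    except KeyError:
        raise ValueError(f"unknown behavior category: {behavior}")

print(recommended_response("at_risk"))
```

The point of the table is that the response follows from the nature of the behavioral choice, not from the severity of the outcome.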

Repetitive patient safety problems must be addressed, whether caused by individual error or system weakness.20

United Kingdom National Health Service Algorithm

The National Health Service in the United Kingdom and the National Patient Safety Agency published an incident decision tree and a guide for its use.21 The algorithm on which the decision tree is based identifies the role of the individual in a given specific outcome. The decision tree has 4 main elements:

  • The deliberate harm test: a conscious and deliberate breach of duty resulting in patient harm. The goal of the institution or system is to establish or refute this violation immediately as a first step.
  • The physical/mental health test: a provider is impaired for any reason, including substance abuse. The impact of the impairment on the patient outcome must be established.
  • The foresight test: once the deliberate intent to harm and physical/mental health tests have been discounted, this analysis establishes whether protocols, policies, and procedures have been followed.
  • The substitution test: this test asks the question, “Would another provider put in the same circumstances in the same systems environment make the same error?”
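The four tests above are applied in order, ruling out deliberate harm first. As a hedged sketch (the field names and outcome strings are invented for illustration; the actual NHS guide is more nuanced), the sequence might look like:

```python
# Hypothetical sketch of the NHS incident decision tree's four tests,
# applied in the order the guide prescribes. All names and outcome
# strings are illustrative assumptions, not the guide's wording.

from dataclasses import dataclass

@dataclass
class Incident:
    deliberate_harm: bool      # deliberate harm test
    provider_impaired: bool    # physical/mental health test
    protocols_followed: bool   # foresight test
    peer_would_err: bool       # substitution test

def incident_decision(incident: Incident) -> str:
    # Step 1: establish or refute deliberate harm immediately.
    if incident.deliberate_harm:
        return "deliberate harm: refer for disciplinary/legal action"
    # Step 2: assess impairment and its impact on the outcome.
    if incident.provider_impaired:
        return "impairment: assess provider health and its impact on the outcome"
    # Step 3: foresight test on protocols, policies, and procedures.
    if not incident.protocols_followed:
        return "foresight failure: examine the protocol violation"
    # Step 4: substitution test.
    if incident.peer_would_err:
        return "system issue: a comparable provider would make the same error"
    return "individual performance issue: coaching or further review"

print(incident_decision(Incident(False, False, True, True)))
```

The early-return structure mirrors the guide's instruction to discount deliberate intent and impairment before analyzing protocols and peers.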

Combination of Models

Because both algorithms have merits, Leonard and Frankel19 schematically integrated them into a single format that involves a 3-step process. The first step analyzes the individual caregiver’s actions via 5 measures: impaired judgment, malicious action, reckless action, risky action, and unintentional error. The second step determines if other caregivers with similar skills and knowledge would react the same way in similar circumstances. The final step is the important determination of whether the present system supports reckless or risky behavior and thus requires redesign.
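The 3-step integrated review can be sketched as follows; the five action labels come from the text, but the control flow and return values are assumptions made for illustration:

```python
# Sketch of Leonard and Frankel's combined 3-step review as described
# above. The action labels are from the text; the flow is an assumption.

ACTION_CLASSES = {"impaired judgment", "malicious action",
                  "reckless action", "risky action", "unintentional error"}

def review_event(action: str, peers_would_act_same: bool,
                 system_supports_risk: bool) -> list:
    """Return the findings produced by the three analysis steps."""
    if action not in ACTION_CLASSES:
        raise ValueError(f"unknown action class: {action}")
    findings = []
    # Step 1: classify the individual caregiver's action.
    findings.append(f"action classified as: {action}")
    # Step 2: would similarly skilled caregivers react the same way?
    if peers_would_act_same:
        findings.append("peers would act the same: points toward the system")
    # Step 3: does the present system support reckless or risky behavior?
    if system_supports_risk:
        findings.append("system supports risky behavior: redesign required")
    return findings

for line in review_event("risky action", True, True):
    print(line)
```

Note that steps 2 and 3 can both implicate the system even when step 1 identifies an individual lapse, which is the integration the combined model is meant to capture.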


A fair and just culture improves patient safety by empowering employees to proactively monitor the workplace and participate in safety efforts in the work environment. This approach reduces risk through its focus on managing human behavior (or helping others to manage their own behavior) and redesigning systems. In a just culture, employees are not only accountable for their actions and choices, but they are also accountable to each other, which may help some overcome the inherent resistance to dealing with impaired or incompetent colleagues.22

Secondary benefits of a just culture include the ability to develop a positive patient safety profile to respond to outside auditors such as The Joint Commission.23 When implemented, a just culture fosters innovation and cross-departmental communication. An example is the opportunity to revitalize the morbidity and mortality conference to cross specialty lines and develop a patient-centered focus.24

In a just culture, both the organization and its people are held accountable while focusing on risk, systems design, human behavior, and patient safety.


