Just Culture and Its Critical Link to Patient Safety (Part II)
In our May 17, 2012 newsletter, we published Part I of a feature on Just Culture in which we shared key questions to help organizations assess their progress toward creating a Just Culture. We chose this topic primarily for two reasons: 1) an organization’s culture is a primary determinant of its success or failure with patient safety, and 2) hasty affirmations of a Just Culture without visible proof cause us to worry that “Just Culture” has simply become a popular catchphrase, used by many who do not fully understand its key tenets and nuances or its crucial link to patient safety.
In Part I, we shared probing questions associated with the organization’s values, fairness to the workforce, and reduction of at-risk behaviors. In Part II, presented here, we cover components associated with establishing an effective safety information system and learning environment. When applicable, we have included selected results from the 2012 report on the Agency for Healthcare Research and Quality (AHRQ) Hospital Survey on Patient Safety Culture (Table 1) to provide a national snapshot of where hospitals stand regarding certain aspects of a Just Culture. Although not all-inclusive, nor sufficient alone to presume a Just Culture, the questions in Part I and those that follow can help you assess your progress on this journey. We plan to present the final component of a Just Culture, designing reliable systems, in Part III of this feature in a future newsletter issue.
Safety Information System and Learning
Is there an effective patient safety information system that collects and analyzes information about hazards, at-risk behaviors, close calls (i.e., good catches), and errors both within the organization and externally? An organization operating under a Just Culture has established a well-functioning safety information system to address system and behavioral risks both before and after events occur. This safety information system forms the nucleus of a learning culture in which people convert the lessons they learn into actions that improve safety.
One essential part of the safety information system is a user-friendly reporting system. This part is integral because understanding the types of risks, errors, and patient injuries and their causes is key to developing effective preventive measures. The reporting system must gather the right kind of data about a hazard or error, including information about causal factors that may be included in the report or gathered during follow-up. Without information about the underlying causes of risk and errors, a report holds little value for improving safety. Responses to the AHRQ culture survey point to another problem with many reporting systems: they fail to capture information about hazards and close calls. Respondents said that mistakes caught and corrected, and mistakes without the potential to cause harm, are reported just 57% and 59% of the time, respectively.
Another crucial component of a safety information system is the gathering and use of information about risks, errors, and patient injuries from external sources. Experience has shown that an error reported in one organization is likely to occur in another, given enough time and similar circumstances. Much knowledge can be gained when organizations examine external errors to learn from the experiences of others. Unfortunately, too many in healthcare feel that, if it hasn’t happened to them, the adverse experiences of others do not apply. Insights regarding patient safety are not possible in an organization with only an internal focus. Knowledge from the outside provides organizations with the lens they need to examine what they are doing, suggestions for what might be done differently, and a roadmap for improvement. Organizations should not wait until an event happens in their own facilities before reacting.
Additional components of a safety information system observed in organizations operating under a Just Culture comprise alternative ways to learn about risks, errors, and outcomes that lead to patient harm. Examples include self-assessments, focus groups, surveys, concurrent or retrospective monitoring of triggers that may signal risk or errors, concurrent or retrospective audits, data from technology reports, targeted observations, check processes that lead to interventions, failure modes and effects analysis, and root cause analysis.
Are staff committed to safety and willing to report hazards, risks, close calls, and errors, thus arming the organization with an accessible body of safety information? A learning culture depends on the willingness of staff to report mistakes and observed risks, even when doing so is interpersonally or interdepartmentally uncomfortable. An organization cannot learn if staff are uncomfortable discussing and analyzing mistakes. Thus, organizations that operate within a Just Culture have created an open reporting and learning environment in which staff are comfortable raising their hands when they have observed a hazard, cut a corner to achieve an organizational goal, or made a mistake. In fact, self-reporting is a hallmark of a Just Culture.
Unfortunately, many respondents who participated in the AHRQ culture survey expressed discomfort with reporting risks and errors. Half feel that event reports are used to write up the person rather than the problem, and that mistakes are held against them. As a result, more than half of respondents admitted that they had not reported a single hazard, close call, or error during the past year, and another quarter reported only one or two. Low reporting reduces an organization’s opportunity to improve safety.
Another reason staff may not report a risk or error is a belief that nothing will change. Staff may abandon efforts to report problems or change practices because past attempts have continually been fruitless. Over time, people simply grow less willing to speak up. The power of observation is suppressed, and problems may go unnoticed. When problems are noticed, they are often reasoned away rather than pursued. Because people want to feel safe, they look for evidence of safety, not hazards, and they interpret a decline in reported errors as a positive sign.
Does the patient safety information system provide staff with knowledge of the current risks, errors, and prevention strategies necessary to improve safety? An organization operating under a Just Culture is willing to talk to its employees about risks and errors. The organization understands that one of the most fundamental strategies for creating change is to share compelling stories about risks, errors, and impactful change strategies, thus drawing attention to problems and encouraging people to act. Improvement strategies are most effective within a culture that ensures any changes are well understood, embraced, and sustained; employees must clearly see the risks in front of them in order to change. There is no better way to inspire and sustain change than through the simple craft of telling factual stories, ideally within the context of actual errors, that move listeners to action. Stories are an efficient vehicle for getting people to understand, remember, and accept new information. One thing is certain: lessons without stories rarely lead to learning and change. It is the contextual details and the exposed humanity in stories that serve as the catalyst for change.
The primary barriers to sharing stories about risk and errors in healthcare, particularly if a patient has been harmed, are legal and social (public disclosure) concerns. Organizations may be hesitant to tell their stories outside the confines of internal peer review/quality improvement processes, so staff feedback about risk and errors is narrowly focused on involved units and individuals. Respondents to the AHRQ culture survey suggest this is a widespread problem; only 56% said they are given feedback about changes put into place based on event reports, and only 65% said they are informed about errors that happen in their unit. This cloak of secrecy makes it virtually impossible for the entire organization to learn from its mistakes.
Does the organization seek long-term system remedies to safety problems? An organization operating under a Just Culture learns from both big and small failures and seeks remedies that can sustain safety rather than short-term fixes. It is not a culture in which workarounds are the dominant response to problems. Workarounds are symptomatic of first-order problem solving (short-term fixes) rather than second-order problem solving (long-term fixes). Whereas short-term remedies patch problems temporarily so work can be accomplished, long-term remedies change the underlying systems and processes, thus preventing recurrence.
In first-order problem solving, staff compensate for a problem by using any means possible to complete a task but do not discover or address its underlying causes. Thus, the problem reappears, and the workarounds continue, risking patient safety. First-order problem solving is rampant in healthcare. One study showed that nurses used first-order problem solving to implement short-term fixes for 93% of the failures they encountered (1); for example, they removed medications from an automated dispensing cabinet (ADC) for an entire shift when faced with congestion at the ADC. The nurses were often gratified that they could handle problems quickly and meet the daily challenges of patient care. However, problems were not reported to those who could investigate their causes and remedy them; nurses in the study reported problems to managers just 7% of the time. Second-order problem solving, as seen in organizations operating under a Just Culture, involves both handling the unexpected problem and then taking steps to address its underlying causes. It is likely that many other healthcare professionals, including physicians and pharmacists, behave similarly, managing the unexpected with a short-term fix but never reporting the problem.
Does the organization possess the willingness and competence to draw responsible conclusions from its safety information system so it can make substantial changes when necessary? Developing safety information systems to identify and track hazards and errors is a first step, but it is not enough. Safety information systems have value only if they lead to changes that reduce risk and improve patient care. Thus, organizations operating under a Just Culture do not simply collect safety information; they devote sufficient resources and maintain highly skilled staff to analyze the data so they can understand the underlying behavioral and system causes of risk and errors. Reporting events and investigating events are very different tasks, and the assumption that a good reporting culture always translates into learning is unfounded. Organizations will not necessarily learn from reports unless staff, including frontline managers, have been properly trained and given adequate time to uncover why human error, at-risk behaviors, and other actions leading to adverse events occur. The failure to expertly analyze and appropriately act upon reported information may account for why 38% of respondents to the AHRQ survey reported that it is just by chance that serious mistakes don’t happen in their organization, and why only 64% said mistakes have led to positive changes.
Part III
A learning culture must be capable of acting. Without action at both the behavioral and system levels, learning is meaningless. In Part III (in a future newsletter), we will discuss the final component of a Just Culture: designing reliable systems. We hope organizations will consider the questions posed in this three-part feature and take a hard look at how their culture compares to a Just Culture. “Just Culture” is so much more than a trendy label for what was previously called a “non-punitive” or “blame-free” culture. It is a robust set of values, beliefs, and actions that provides solid guidance on how an organization can best manage safety.
Reference:
1. Tucker AL, Edmondson AC. Why hospitals don’t learn from failures: organizational and psychological dynamics that inhibit system change. Calif Manage Rev. 2003;45(2):55-72.