The Reporting Paradox: What a Quiet System Is Actually Telling You
Edmondson's 1999 study of nursing teams produced a finding that still surprises healthcare leaders: the best-performing teams reported more errors, not fewer. A quiet system is rarely a safe one.
When Amy Edmondson published her 1999 study of nursing teams in eight hospital units, she expected to find that the highest-performing teams would report the fewest medication errors. The result was the opposite, and it inverted a generation of assumptions about safety. The best teams reported more errors, not fewer. The underlying error rate, as far as it could be measured, was roughly the same across teams. What differed was the willingness to disclose.
Two and a half decades later, this finding still has not fully landed in healthcare. Many leaders continue to read low incident report numbers as a sign that the unit is running well. The research is unambiguous on what those numbers more often mean. A quiet system is, with troubling frequency, a frightened one.
What gets metabolized when voice is not safe
When a clinician notices something wrong and does not say it out loud, the problem does not vanish. It gets metabolized. It becomes a private workaround that adds friction to the next shift. It becomes a quiet resentment that compounds across months. It becomes the staffing pattern everyone knows is unsafe and no one is willing to name. It becomes, eventually, the harm event that surprises leadership because nothing on the dashboard predicted it.
This is the cost of suppressed voice, and it is not abstract. It shows up in higher attrition among the people most attentive to safety. It shows up in delayed identification of system failures. It shows up in the gap between what the patient experiences at the bedside and what the quality committee believes is happening on the floor. The committee is not lying. It is reading inputs that have been filtered through a culture in which speaking up is not safe enough or worth enough to risk.
The compassion translation
The compassion literature and the patient safety literature have been telling the same story from different angles. Compassion, properly understood, is a posture toward the suffering of another person that combines accurate perception with willingness to act (Singer & Klimecki, 2014). Psychological safety is the climate that makes that perception and that willingness institutionally possible. When clinicians cannot tell the truth about what is happening to their patients, to their colleagues, and to themselves, the perception is blocked at its source. Compassion cannot operate on information it does not receive.
This is why the conviction that culture beats statements is not a slogan. An organization can adopt every compassion framework in the literature and continue to suppress the voice that would let it act on what those frameworks describe. The same is true in reverse. An organization that has built genuine psychological safety will produce inputs that any reasonable framework can act on, because the truth is being told in a form leaders can hear.
Why this is counterintuitive for leaders
Most leaders rose through a system that rewarded the appearance of order. A unit with few complaints, little talk of turnover, and a quiet incident log was assumed to be functioning well, and the leader running it was assumed to be effective. The trouble is that this signal pattern is exactly what fear produces.
A genuinely high-functioning unit, in the sense that the patient safety literature now describes, looks somewhat different from the outside. There is more visible disagreement, because people are willing to surface it. There are more reported near misses, because people are willing to name them before they become events. There are more conversations about staffing, workload, and the pressures of the day, because people believe those conversations might lead somewhere. The leader running such a unit is often described by their staff as someone who can be told the truth.
The shift in mindset required of healthcare leaders is uncomfortable. It asks them to be suspicious of quiet, and to read a rising incident report rate as evidence that the culture is improving rather than deteriorating. This is not intuitive. It is, however, what the research consistently supports (Edmondson & Lei, 2014; O'Donovan & McAuliffe, 2020).
What this looks like on a Tuesday
The compassionate leader who internalizes the reporting paradox does small, observable things differently. They thank the person who reports the near miss before they ask any procedural questions. They distinguish, in front of the team, between the person who erred and the system that produced the conditions for the error. They name, out loud and on the record, the times they themselves have made mistakes and what those mistakes taught them. They treat reporting rates as an indicator of trust rather than an indicator of risk, and they say this to their workforce in language the workforce can repeat back.
These behaviors are not therapeutic gestures. They are structural inputs into the climate that determines whether the next clinician who notices something wrong will say it. The cumulative effect, over months, is a unit that is genuinely safer because it is genuinely audible.
The metric that actually matters
If your incident reports are going down and your patient outcomes are not improving, your culture is not getting safer. Your reporting is. This is the leadership question that most healthcare organizations are not yet asking, and it is the question that distinguishes a unit running on fear from a unit running on trust.
The instruments to measure psychological safety exist and are validated for clinical settings (Edmondson, 1999; Edmondson, 2018). The data that pairs psychological safety scores with incident reporting patterns over time is increasingly available. The leadership move is to put both in front of the team, treat them as the leading indicators they actually are, and stop confusing silence for safety.
Where this connects
This is the same argument the compassion literature has been making about empathy and detachment. A clinician who has armored themselves against the suffering of their patients will appear, by some measures, to be coping. They will produce fewer obvious markers of distress. They will not say uncomfortable things in team meetings. The cost of that armor, in the form of disengagement, depersonalization, and eventual exit from the workforce, is paid later and elsewhere (Singer & Klimecki, 2014). The unit that has armored itself against its own truth produces the same pattern at the organizational level.
Compassion at the systems level is the willingness to receive information the system would prefer not to have. Psychological safety is the climate that lets that information arrive. Voice is the act that produces the information. They are not three different conversations. They are one conversation, viewed from three angles.
The work, for healthcare leaders who take this seriously, is to stop treating quiet as success and start treating it as a question. The question is whether the unit is genuinely safe enough that nothing needs to be said, or whether it has simply become unsafe to say anything. The answer determines almost everything else that follows.
Care differently, not less.
References
- Edmondson, A. C. (1999). Psychological safety and learning behavior in work teams. Administrative Science Quarterly, 44(2), 350-383. https://doi.org/10.2307/2666999
- Edmondson, A. C. (2018). The fearless organization: Creating psychological safety in the workplace for learning, innovation, and growth. Wiley.
- Edmondson, A. C., & Lei, Z. (2014). Psychological safety: The history, renaissance, and future of an interpersonal construct. Annual Review of Organizational Psychology and Organizational Behavior, 1, 23-43. https://doi.org/10.1146/annurev-orgpsych-031413-091305
- O'Donovan, R., & McAuliffe, E. (2020). A systematic review exploring the content and outcomes of interventions to improve psychological safety, speaking up, and voice behaviour. BMC Health Services Research, 20, 101. https://doi.org/10.1186/s12913-020-4931-2
- Singer, T., & Klimecki, O. M. (2014). Empathy and compassion. Current Biology, 24(18), R875-R878. https://doi.org/10.1016/j.cub.2014.06.054