Learning From Mistakes Before You Make Them

Recent years have seen a significant shift in attitudes towards mistakes. A common belief now holds that running the risk of making mistakes is crucial for innovation, as it is only when you push the boundaries of what you can do that you make progress.

What’s more, it’s learning from what went wrong that helps individuals, teams, and organizations improve, and that makes the whole process worthwhile.

Nonetheless, it’s equally fair to say that few workplaces have an entirely relaxed attitude to mistakes. You still sense that many would be far happier if no mistakes were made at all, even if that meant the boundaries were pushed far less.

Learning from near misses

A new paper from Harvard Business School explores whether it might be possible to generate the learning we would gain from making mistakes, without actually having to make them.

The research, led by Amy Edmondson, the doyenne of psychological safety, examined how people respond to close shaves in a healthcare setting. It suggests that when people feel comfortable speaking up, even events that are averted can spark growth and improvement.

The authors argue that in the working world, we tend to focus almost exclusively on things that either worked as planned or went wrong in some way. They believe that near misses provide invaluable opportunities for learning and development, without the pain of failure itself.

The spectrum of close calls

The study took place in the radiation oncology department of a hospital, and the aim was to understand how likely staff were to report any near misses they encountered, and the factors that underpin such a decision.

“What’s interesting about a near miss is that it can be thought of as a failure, where people say, ‘Oh, we almost made a huge mistake,’” the researchers say. “That interpretation highlights a vulnerability in the care-delivery processes. But it can also be thought of as a success, where they say, ‘Whew, we caught the error and delivered great care,’ which highlights resilience of care delivery systems.”

The researchers quizzed 78 radiation oncology staff, first about their perceived psychological safety at work, before presenting them with a number of hypothetical near-miss scenarios that were based on real-life practice.

The survey found that staff generally felt accountable to each other, which helped to support a culture of speaking up. This varied by position, however, with higher-ranking employees far more comfortable speaking up than their lower-ranked peers. This pattern is common across analyses of psychological safety and can inhibit frank disclosure, especially from low-ranking employees towards high-ranking ones.

The participants were asked to rank the near-miss scenarios in order of the likelihood that they would report them, with each scenario becoming gradually more threatening to the patient.

Causing harm

The results revealed that the closer a scenario came to actually causing harm to the patient, the more important psychological safety became in determining whether employees spoke up about the near miss.

“With near misses that we characterize as ‘could have happened,’ where the chance event is far from patient harm, and therefore highlights resilience, we find that the role of psychological safety on people’s willingness to report is almost negligible,” the researchers say. “But for near misses that we characterize as ‘nearly happened,’ which highlight vulnerability, we find there’s a huge effect of psychological safety on people’s willingness to report.”

The authors argue that these near misses are the most potent because they highlight frightening events that, had they actually gone wrong, would have had significant repercussions. They’re not the kind of near miss that could be passed off as a success.

By framing such events as learning opportunities, the authors believe organizations can increase the likelihood that employees will report them, especially when this is allied with concerted efforts to improve psychological safety.

“If you are a leader and you are framing good catches as examples of vigilance and resilience, and telling people, ‘It’s great when we speak up and catch problems, because nobody’s perfect; things do go wrong,’ then you’re more likely to hear about them,” they conclude. “But if you’re framing them as failures and screw-ups, you’re less likely to hear about them, because everybody knows, if you’re the person associated with screw-ups, you get in trouble.”
