Leading Blog






03.12.18

Meltdown: Why Our Systems Fail and What We Can Do About It


WE ARE IN the golden age of meltdowns, write Chris Clearfield and Andras Tilcsik. “More and more of our systems are in the danger zone, but our ability to manage them hasn’t quite caught up. The result: things fall apart.”

As systems become more complex, we are more vulnerable to unexpected system failures. In Meltdown, the authors examine a fatal D.C. Metro train accident, the Three Mile Island disaster, the collapse of Enron, the 2012 meltdown of Knight Capital, the Flint water crisis, and the 2017 Oscars mix-up, among other meltdowns, and discover that while these failures stem from very different problems, their underlying causes are surprisingly similar. The stories told here offer a compelling look behind the scenes at why failures occur in so many of today’s complex systems.

The authors build on sociologist Charles Perrow’s theory that as a system’s complexity and “tight coupling” (a lack of slack, or margin, between different parts) increase, so does the chance of a meltdown. In other words, these failures are driven by “the connections between the different parts, rather than the parts themselves.”

Some systems are linear, and in these systems the source of a breakdown is obvious. But as systems become complex, as at a nuclear power plant, the parts of the system interact in hidden and unexpected ways. Because these systems are more like a web, when they break down, it is difficult to figure out exactly what is wrong. Worse still, it is almost impossible to predict where things will go wrong, or to foresee all of the possible consequences of even a small failure somewhere in the system.

As more and more of our systems become complex and tightly coupled, what can we do? How do we keep up with our increasingly complex systems?

Oddly enough, safety features are not the answer. They become part of the system and thereby add to the complexity. And when something goes wrong, we like to add even more safety features into the system. “It’s like the old fable: cry wolf every eight minutes, and soon people will tune you out. Worse, when something does happen, constant alerts make it hard to sort out the important from the trivial.”

There are ways to make complex systems more transparent. One is the premortem: imagine that, at some point in the future, your project has failed, and write down all of the reasons why you think it happened. A 1989 study showed that premortems, or “prospective hindsight,” boost our ability to identify reasons why an outcome might occur, so we can deal with potential problems before they arise.

We also should encourage feedback and sharing of failures and near-misses. “By openly sharing stories of failures and near failures—without blame or revenge—we can create a culture in which people view errors as an opportunity to learn rather than as the impetus for a witch hunt.”

Encourage dissent with a more open leadership style. People in power tend to dismiss others’ opinions, so leaders should speak last. You have to work on the culture. Ironically, the authors note, the very need to introduce anonymous feedback highlights the dangers of speaking up.

Bring in outsiders and add diversity of thought. Outsiders will see things we don’t and are more willing to ask uncomfortable questions. In a more diverse environment, we also tend to be more vigilant and question more. When we are around people just like us, we tend to trust their judgment, which can lead to too much conformity. “Diversity is like a speed bump. It’s a nuisance, but it snaps us out of our comfort zone and makes it hard to barrel ahead without thinking. It saves us from ourselves.”

Transparent design matters. We need to see what is going on under the hood. Being able to see the state of a system by simply looking at it can be an important safeguard.

These are just a sampling of the ways we can learn to manage complex systems. This doesn’t mean we should take fewer risks. On the contrary, these solutions—structured decision tools, diverse teams, and norms that encourage healthy skepticism and dissent—“tend to fuel, rather than squelch, innovation and productivity. Adopting these solutions is a win-win.”

We can make our systems more forgiving of our mistakes by thinking critically and clearly about our own systems. How many things have to go right at the same time for this to work? Can we simplify it? How can we add margin?




Posted by Michael McKinney at 06:54 PM
This post is about Problem Solving
