Black Box Thinking

Black Box Thinking: Marginal Gains and the Secrets of High Performance by Matthew Syed

🚀 The Book in 3 Sentences

  1. We have a tendency to view failure as bad and to ignore it as best we can.

  2. However, by redefining failure, we can see how much upside and potential it holds for all of our lives.

  3. By changing our views about failure, we can harness the growth, learning and progress it offers to improve our lives.

🎨 Impressions

This is a book about how success happens - a riveting, page-turning read that gets you to rethink so much of what you think you know.

The case studies and examples explore the most pioneering and innovative organisations in the world, including Google, Team Sky, Pixar and the Mercedes Formula One team as well as exceptional individuals like the basketball player Michael Jordan, the inventor James Dyson, and the football star David Beckham.

How I Discovered It

I read Rebel Ideas about 6 months earlier and really enjoyed it, so I was keen to read more of Matthew Syed's work.

Who Should Read It?

If you want to understand the power of your mistakes, harness the growth that comes from doing things wrong and realise the potential of failure - this is the book. A great book covering a wide range of areas for improving your thinking and creating a better culture for high-performance teams.

✍️ My Top 3 Quotes

Learn from the mistakes of others. You can’t live long enough to make them all yourself.

It is partly because we are so willing to blame others for their mistakes that we are so keen to conceal our own.

Learning from failure has the status of a cliché. But it turns out that, for reasons both prosaic and profound, a failure to learn from mistakes has been one of the single greatest obstacles to human progress.

📒 Summary + Notes

Black Box Thinking

Aviation was once plagued by accidents and crashes. Today, however, things are very different. In 2013, there were 36.4 million commercial flights worldwide carrying more than 3 billion passengers, according to the International Air Transport Association. Only 210 people died. For every one million flights on western-built jets there were 0.41 accidents – a rate of one accident per 2.4 million flights.

However, the same improvements have not been seen in the healthcare industry. Peter Pronovost, a professor at the Johns Hopkins University School of Medicine and one of the most respected clinicians in the world, has pointed out that the number of preventable fatalities is the equivalent of two jumbo jets falling out of the sky every twenty-four hours.

In the UK healthcare system the numbers are also alarming. A report by the National Audit Office in 2005 estimated that up to 34,000 people are killed per year due to human error, and put the overall number of patient incidents (fatal and non-fatal) at 974,000. A study into acute care in hospitals found that one in every ten patients is killed or injured as a consequence of medical error or institutional shortcomings. A study of French healthcare put the rate even higher, at 14 per cent.

There is also something deeper and more subtle at work, something that has little to do with resources and everything to do with culture. It turns out that many of the errors committed in hospitals (and in other areas of life) have particular trajectories, subtle but predictable patterns. Learning from failure has the status of a cliché, but it turns out that, for reasons both prosaic and profound, a failure to learn from mistakes has been one of the single greatest obstacles to human progress.

The Logic of Failure

This book examines how we respond to failure, as individuals, as businesses, as societies. How do we deal with it, and learn from it? How do we react when something has gone wrong, whether because of a slip, a lapse, an error of commission or omission, or a collective failure?

We have a deep instinct to find scapegoats. It is partly because we are so willing to blame others for their mistakes that we are so keen to conceal our own. We anticipate, with remarkable clarity, how people will react, how they will point the finger, how little time they will take to put themselves in the tough, high-pressure situation in which the error occurred. The net effect is simple: it obliterates openness and spawns cover-ups. It destroys the vital information we need in order to learn.

We cover up mistakes, not only to protect ourselves from others, but to protect us from ourselves. This basic perspective – that failure is profoundly negative, something to be ashamed of in ourselves, and judgemental about in others – has deep cultural and psychological roots.

Therefore, we need to redefine our relationship with failure, as individuals, as organisations, and as societies. Only by redefining failure will we unleash progress, creativity and resilience.

A closed loop is where failure doesn’t lead to progress because information on errors and weaknesses is misinterpreted or ignored; an open loop does lead to progress because the feedback is rationally acted upon.

‘Learn from the mistakes of others. You can’t live long enough to make them all yourself.’

Black Box Thinking: For organisations beyond aviation, it is not about creating a literal black box; rather, it is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It is about creating systems and cultures that enable organisations to learn from errors, rather than being threatened by them.

The more we can fail in practice, the more we can learn, enabling us to succeed when it really matters.

Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died. We have purchased, at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and having to relearn them.

In other areas we often don't learn from our mistakes, and the reason is not usually laziness or unwillingness. More often, the necessary knowledge has not been translated into a simple, usable and systematic form.

Cognitive Dissonance

We cannot learn if we close our eyes to inconvenient truths, but we will see that this is precisely what the human mind is wired up to do.

When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.

The problem is not the strength of the evidence, which is often overwhelming, it is the psychological difficulty in accepting it.

This is the domino effect of cognitive dissonance: the reframing process takes on a life of its own. In the United States there are grave consequences when an airplane falls from the sky. Serious inquiries are made: what went wrong? Was it a systemic breakdown? An individual's mistake? Was there official misconduct? Can anything be done to prevent it from happening again? But America keeps virtually no records when a conviction is vacated based on new evidence of innocence. Judges typically write one-line orders, not official opinions, meaning that they don't analyse what went wrong. Neither does anyone else.

If we edit out failure, if we reframe our mistakes, we are effectively destroying one of the most precious learning opportunities that exists.

The relationship between the ambiguity of our failures and cognitive dissonance: When a plane has crashed, it’s difficult to pretend the system worked just fine. The failure is too stark, too dramatic. This is what engineers call a red flag: a feature of the physical world that says ‘you are going wrong’. It is like driving to a friend’s house, taking a wrong turn, and hitting a dead end. You have to turn around. Most failure is not like that. Most failure can be given a makeover. You can latch on to any number of justifications: ‘it was a one-off’, ‘it was a unique case’, ‘we did everything we could’. You can selectively cite statistics that justify your case, while ignoring the statistics that don’t. You can find new justifications that did not even occur to you at the time, and which you would probably have dismissed until they – thankfully, conveniently – came to your rescue.

Cognitive dissonance concerns how our culture's stigmatising attitude towards error undermines our capacity to see evidence in a clear-eyed way. It is about big decisions and small judgements: indeed, anything that threatens one's self-esteem.

And this takes us back to perhaps the most paradoxical aspect of cognitive dissonance. It is precisely those thinkers who are most renowned, who are famous for their brilliant minds, who have the most to lose from mistakes. And that is why it is often the most influential people, those who ought to be in the best position to help the world learn from new evidence, who have the greatest incentive to reframe it. And these are also the kinds of people (or institutions) who often have the capacity to employ expensive PR firms to bolster their post hoc justifications. They have the financial means, in addition to a powerful subconscious urge, to bridge the gap between beliefs and evidence, not by learning, but by spinning.

Ironically, the more famous the expert, the less accurate his or her predictions tended to be. Why is this? Cognitive dissonance gives us the answer. It is those who are the most publicly associated with their predictions, whose livelihoods and egos are bound up with their expertise, who are most likely to reframe their mistakes – and who are thus the least likely to learn from them.

Also, memory is so malleable that it may lead us astray when it comes to recollection. But it could also play a crucial role in imagining and anticipating future events. We try to make the memory fit with what we now know rather than what we once saw.

Summary Sentence: Cognitive dissonance occurs when mistakes are too threatening to admit to, so they are reframed or ignored. This can be thought of as the internal fear of failure: how we struggle to admit mistakes to ourselves.

Confronting Complexity

We should test ideas in practice in order to learn from failure. Theoretical knowledge is not worthless. Quite the reverse. A conceptual framework is vital even for the most practical men going about their business. In many circumstances, new theories have led to direct technological breakthroughs (such as the atom bomb emerging from the Theory of Relativity). The real issue here is speed. Theoretical change is itself driven by a feedback mechanism: science learns from failure. But when a theory fails, as when the Unilever mathematicians failed in their attempt to create an efficient nozzle design, it takes time to come up with a new, all-encompassing theory. To gain practical knowledge, however, you just need to try a different-sized aperture. Tinkering, tweaking, learning from practical mistakes: all have speed on their side. Theoretical leaps, while prodigious, are far less frequent.

In general, we are hardwired to think that the world is simpler than it really is. And if the world is simple, why bother to conduct tests? If we already have the answers, why would we feel inclined to challenge them?

That is the power of the narrative fallacy. We are so eager to impose patterns upon what we see, so hardwired to provide explanations, that we are capable of ‘explaining’ opposite outcomes with the same cause without noticing the inconsistency.

Closed loops are often perpetuated by people covering up mistakes. They are also kept in place when people spin their mistakes, rather than confronting them head on. But there is a third way that closed loops are sustained over time: through skewed interpretation.

‘The Randomised Control Trial is one of the greatest inventions of modern science.’ People were convinced of the success of Scared Straight because it seemed so intuitive. People loved the idea that kids could be turned around through a tough session with a group of lifers. But crime turns out to be more complex than that. Children commit offences for many different, often subtle reasons. With hindsight, a three-hour visit to prison was unlikely to solve the problem. The intentions of the inmates were genuine: they really wanted the kids to go straight. But the programme was having unintended consequences. The experience of being shouted at seemed to be brutalising the youngsters. Many seemed to be going out and committing crime just to prove to themselves and their peers that they weren’t really scared.

The glitzy narrative was far more seductive than the boring old data.

Policy, almost across the board, is run on narrative, hunch, untested ideology and observational data skewed to fit predetermined conclusions, not on controlled experiments and facts.

Small Steps and Giant Leaps

Marginal gains is not about making small changes and hoping they fly. Rather, it is about breaking down a big problem into small parts in order to rigorously establish what works and what doesn’t. Every error, every flaw, every failure, however small, is a marginal gain in disguise. This information is regarded not as a threat but as an opportunity.

But a willingness to test assumptions is ultimately about a mindset. Marginal gains is a strategy of local optimisation: it takes you to the summit of the first hill. But once you are there, taking little steps, however well tested, runs out of traction. Blockbuster, for example, could not have stayed ahead of the competition through incremental tweaks alone; it would have needed to move into an entirely new space, leveraging new technology and fresh insights.

Removing failure from innovation is like removing oxygen from a fire. Think back to Dyson and his Hoover. It was the flaw in the existing technology that forced Dyson to think about cleaning in a new way. The blockage in the filter wasn’t something to hide away from, or pretend wasn’t there. Rather, the blockage, the failure, was a gilt-edged invitation to re-imagine vacuum-cleaning. Imagination is not fragile. It feeds off flaws, difficulties and problems. Insulating ourselves from failures – whether via brainstorming guidelines, the familiar cultural taboo on criticism or the influence of cognitive dissonance – is to rob one of our most valuable mental faculties of fuel.

If insight is about the big picture, development is about the small picture. The trick is to sustain both perspectives at the same time.

We learn not just by being correct, but also by being wrong. It is when we fail that we learn new things, push the boundaries, and become more creative.

We live in a world of experts. There is nothing particularly wrong with that. The expertise we have developed is crucial for all of us. But when we are trying to solve new problems, in business or technology, we need to reach beyond our current expertise. We do not want to know how to apply the rules; we want to break the rules. We do that by failing – and learning.

The Blame Game

Blame tends to be a subversion of the narrative fallacy: an oversimplification driven by biases in the human brain. It has subtle but measurable consequences, undermining our capacity to learn.

If our first reaction is to assume that the person closest to a mistake has been negligent or malign, then blame will flow freely and the anticipation of blame will cause people to cover up their mistakes. But if our first reaction is to regard error as a learning opportunity, then we will be motivated to investigate what really happened.

But trying to increase discipline and accountability in the absence of a just culture has precisely the opposite effect. It destroys morale, increases defensiveness and drives vital information deep underground.

Creating a Growth Culture

It is striking how often successful people have a counter-intuitive perspective on failure. They strive to succeed, like everyone else, but they are intimately aware of how indispensable failure is to the overall process. And they embrace, rather than shy away from, this part of the journey.

It is when a culture has an unhealthy attitude to mistakes that blame is common, cover-ups are normal and people fear to take sensible risks. When this attitude flips, blame is less likely to be pre-emptive, openness is fostered, and cover-ups are seen for what they are: blatant self-sabotage. If we drop out when we hit problems, progress is scuppered, no matter how talented we are. If we interpret difficulties as indictments of who we are, rather than as pathways to progress, we will run a mile from failure.

When we see failure without its stigma, the point is not that we commit to futile tasks, but that we become more capable of meaningful adaptation: whether that means quitting and trying something else, or sticking – and growing.

Conclusions

If we wish to fulfil our potential as individuals and organisations, we must redefine failure.

At the level of the brain, the individual, the organisation and the system, failure is a means – sometimes the only means – of learning, progressing and becoming more creative. This is a hallmark of science, where errors point to how theories can be reformed; of sport, where practice could be defined as the willingness to clock up well-calibrated mistakes; of aviation, where every accident is harnessed as a means of driving system safety.

Again and again, differences in mindset explain why some individuals and organisations grow faster than others. Evolution is driven by failure. But if we give up when we fail, or if we edit out our mistakes, we halt our progress no matter how smart we are. It is the Growth Mindset fused with an enlightened evolutionary system that helps to unlock our potential; it is the framework that drives personal and organisational adaptation.

But self-handicapping is more sophisticated. This is where the excuse is not cobbled together after the event, but actively engineered beforehand. It is, in effect, a pre-emptive dissonance-reducing strategy. For example, if students decided to party the night before a crucial exam and then flunked it, they could say: 'It wasn't me who messed up, it was the booze!' The excuse serves another purpose, too: if they did pass the exam, they could still point to the alcohol as the reason they didn't get an even higher grade.

It is precisely because the project or idea really matters that failure is so threatening – and why people so desperately need an alternative explanation for messing up.

When we are fearful of being wrong, when the desire to protect the status quo is particularly strong, mistakes can persist in plain sight almost indefinitely.

Religion was fixed in its thinking about the natural world. Knowledge was revealed from above rather than discovered through a process of learning from mistakes. That is why progress was so slow for not merely decades, but centuries.

Failure is a natural part of life and we should look to learn from it; the desire to avoid it only leads to stagnation.
