The Silence Of The Losers
During World War II, the US Army Air Forces were desperate to solve an urgent problem: how to improve the odds of a bomber making it home. A crew's chances of survival were slim, not much better than a coin toss. So the military brought the problem to statisticians and mathematicians, probably because of the growing wartime interest in operations research.
Military engineers explained to the scientists—one of them being the brilliant mathematician Abraham Wald—that they knew their bombers needed more armor, but they couldn’t just cover the planes like tanks, not if they wanted them to get off the ground. So—engineers thought—the key was to figure out the best places to add what little protection they could.
For this, the military provided data they had collected from bombers that had returned from enemy territory, recording where those planes had taken the most damage. The data seemed to show that the bullet holes tended to accumulate along the wings, around the tail gunner, and down the center of the body. Wings. Body. Tail gunner.
Considering this information, the commanders wanted to put the thicker protection where they could clearly see the most frequent damage, where the holes clustered. But Wald pointed out that this would be precisely the wrong decision. The mistake, which Wald saw instantly, was that the holes showed where the planes were strongest. The holes showed where a bomber could be shot and still survive the flight home. After all, here they were, holes and all. It was the planes that didn't make it home that needed extra protection, and they had needed it in the places where the surviving planes were unharmed. The holes in the survivors actually revealed the locations that needed the least additional armor. Look at where the survivors are unharmed, he said, and that's where these bombers are most vulnerable; that's where the planes that didn't make it back were hit. In short, the returning bombers offered little information on how to increase survivability. Sadly, the planes that never came home not only left families devastated but also took invaluable information down with them.
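If you want to see Wald's logic play out in numbers, here is a minimal simulation sketch in Python. Everything in it is made up for illustration: the section names and the per-hit lethality figures are assumptions, not historical data. Hits land uniformly across the airframe, yet the holes tallied on returning planes pile up on the least vulnerable sections, because planes hit in the engine or cockpit rarely come home to be counted.

```python
import random
from collections import Counter

# Hypothetical per-hit odds that a hit to each section downs the plane.
# These numbers are illustrative assumptions, not historical data.
LETHALITY = {"engine": 0.8, "cockpit": 0.6, "fuselage": 0.2, "wings": 0.1, "tail": 0.1}
SECTIONS = list(LETHALITY)

def fly_sortie(n_hits=3):
    """Return the hit locations if the plane survives, else None."""
    hits = [random.choice(SECTIONS) for _ in range(n_hits)]  # hits land uniformly
    for section in hits:
        if random.random() < LETHALITY[section]:
            return None  # plane lost; its damage record never comes home
    return hits

survivor_holes = Counter()
lost = 0
for _ in range(100_000):
    hits = fly_sortie()
    if hits is None:
        lost += 1
    else:
        survivor_holes.update(hits)  # only survivors get inspected

print(f"planes lost: {lost}")
for section, count in survivor_holes.most_common():
    print(f"{section:>8}: {count} holes seen on returning planes")
```

Run it and the engine, the deadliest section in this toy model, shows the fewest holes in the survivor tally; exactly the inversion Wald warned the commanders about.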
Survivorship bias is our tendency to treat the evidence of success as the primary guide when planning for future success. Survivorship bias pulls you toward bestselling diet gurus, celebrity CEOs, and superstar athletes. It's an unavoidable tic, the desire to deconstruct success like a thieving magpie and pull away the shimmering bits. You look to the successful for clues about the hidden, about how to better live your life, about how you too can survive similar forces against which you too struggle. Colleges and conferences prefer speakers who shine as examples of making it through adversity, of struggling against the odds and winning. The problem here is that you rarely take away from these inspirational figures advice on what not to do, on what you should avoid, and that's because they don't know. Information like that is lost along with the people who don't make it out of bad situations or who don't make it onto the cover of business magazines. In short, the advice business is a monopoly run by survivors.
As the psychologist Daniel Kahneman writes in his book Thinking, Fast and Slow, "A stupid decision that works out well becomes a brilliant decision in hindsight." The things successful companies like Microsoft or Google or Apple did right are like the planes with bullet holes in the wings. The companies that burned all the way to the ground after taking massive damage fade from memory. Kahneman adds: "If you group successes together and look for what makes them similar, the only real answer will be luck."
(Adapted from here)