Why we can’t predict our failures (and how to solve it)

In mid-2001, Dean Kamen sat at his desk, excitedly thumbing through papers. After about a decade of tinkering, he was about to revolutionize one of the biggest problems of modern society: transportation. For years he had promised his doubters that his invention would change the world, and that by 2002 humans would have a completely different view of the word 'transportation'. With the revolution itself figured out, all that remained was logistics. Money was not a problem; Kamen had built a sizable fortune from inventing the drug infusion pump used in dialysis. He leased a 77,000-square-foot factory near his home and began hiring at a rapid pace, preparing for the onslaught of customers. And then, in December of 2001, the Segway was introduced, and a collective eyebrow was raised across the world. The flop was epic, and Kamen didn't see it coming.

Why are we as humans so bad at predicting our failures? Psychologists have surfaced two cognitive biases that are at the heart of why we can be both such analytical creatures and also so naive about our own predictions.

Congruence and Decisions

The first principle at play is congruence bias, the 'tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses'. Summed up: once we feel we are on to a solution, our brain narrows its focus to testing that solution and doesn't look back.

Even the scientific community is not immune to this bias. By the time a hypothesis is developed and approved, a great deal of time and effort has been spent on grant writing, proposals, team-building, and so on. This leads to a certain amount of 'decision fatigue' – even when exposed to a new, better hypothesis, a weary researcher is unlikely to be willing to go through the vetting process all over again. Better to stick with this hypothesis and test it, he or she would probably rationalize. And therein lies the problem. If this cognitive bias can breach the gold standard of the scientific method, what chance do we non-scientists have? It turns out a perspective shift can fix this. But first, the other bias at play.

Availability Cascades

In the 1999 edition of the Stanford Law Review, two researchers put forth a bold claim: a little-known cognitive bias was the cause of everything from public food scares to harmful health regulations. The culprit was the availability heuristic, which fuels a phenomenon they coined availability cascades. Stripped of the fancy jargon, an availability cascade is how humans create a snowball effect out of other people's perceived knowledge – a self-reinforcing process within all of us whereby, if something is repeated often enough, its perceived probability of being true increases.

While the paper focused on how large groups in society spread false information, the same effects come into play when smaller groups are tasked with a decision.

When a group is given a problem and tasked with finding its solution, its first tendency is to lay out the already-known information in order to close in on the unknowns. This is where opinions and untrue information enter the arena, and, according to the availability cascade, where cognitive blind spots appear and are most dangerous.


Let's go back to our friend Dean Kamen. To his credit, the Segway is not a complete failure. As a resident of Washington, DC, I've nearly been hit by Segway tours near the Capitol a handful of times. While the actual meetings that planned the launch of the Segway were top-secret, we can speculate about how availability cascades in Kamen's initial discussions could have led to the product's failure.

It could be argued that the largest generalization Kamen's team made was over-estimating how much humans dislike walking. You can imagine this being one of the first points brought up, illustrated with graphs of America's obesity problem and followed by a series of collective head-nods. And that's all it takes. The team working on the project had already made its fatal mistake as the availability cascade took over.

Fixing the biases

We can use what these biases have in common to counteract them. One of the most effective techniques is the one that places the largest emphasis on failure. The idea of the 'premortem' was published in Harvard Business Review and made its way to the boardrooms of Silicon Valley as an efficient way to prevent a project's failure. A postmortem, by definition, is the examination and analysis of a dead body to determine the cause of death; in the business world, it's used to figure out where a decision went wrong. The brilliance of the premortem is simple: analyze how a project failed before it actually launches.

To back this idea up, a 1989 study at Wharton found that imagining that an event has already occurred, also called prospective hindsight, "increased the ability to correctly identify reasons for future outcomes by 30%".

The setup may be a bit strange for some teams to get used to, but it's fairly easy. The project manager starts by gathering the team in a room and stating that the project they are working on has failed miserably. The team then breaks into groups to list all the possible reasons the project failed. The insights uncovered can then be used to make last-minute changes that improve the project's chances of success.

What about when you aren't making a decision with a group? Individual decisions can still fall prey to the same biases. I recall an uncomfortable conversation I once had with a nutritionist who pointed out the flaws in my diet. It turns out the availability cascade had slowly invaded my thoughts with the notion that I would be infinitely healthier and more energetic without any carbohydrates in my diet. It had happened over the course of a few years, which is why the cascades are so dangerous. I most likely picked up a sentence here from an article about why beans are bad for you, and then a sentence there about why rice is bad for you. Over time I built up this bias against carbohydrates when, in reality, they are an essential source of energy. I recall my face turning an interesting shade of red during that conversation.

Jonathan Baron, a professor of psychology at the University of Pennsylvania, suggests that you can avoid the congruence bias as an individual decision maker by figuring out the right tests: "Try to think of alternative hypotheses, then choose a test most likely to distinguish them – a test that will probably give different results depending on which is true."

In the entrepreneurial world, the concepts of immediate idea validation and customer development were popularized by Eric Ries's book "The Lean Startup". Part of the reason this approach of surveying potential customers before starting to build a company works so well is that it avoids the assumptions that lead to an availability cascade, also known as 'building something nobody wants'.

Had Kamen's team gone out into the streets and asked regular people if they would be willing to pay $5,000 for a strange-looking scooter, they might have altered their strategy just a bit.
