
Evidence-based decision-making’s limitations


Evidence has its limits.

Regular readers, good friends, casual acquaintances and just about anyone else I can trick into a conversation on the subject know I’m a strong proponent of evidence-based decision-making. Most of the time, compared to most of the alternatives, evidence should be a tool of choice when making an important decision.

But … (please don’t snicker) it’s a big but … evidence does have limitations. It isn’t always the right tool for the job:

  • Evidence here but not there: Sometimes you have to choose among alternatives when reliable evidence is only available for one of them.

Insistence on evidence is why large enterprises so often favor cost-cutting over revenue enhancement. Cut the cost of a business process and it’s [relatively] easy to follow the savings to the bottom line. Invest in revenue enhancement, on the other hand, and there’s rarely a reliable and trustworthy way to provably connect investment and results.

This is one of the biggest challenges with the M in so-called SMART goals (specific, measurable, achievable, realistic, and time-bound). SMART causes organizations to prefer goals that are measurable … for which it’s possible to collect numerical evidence … over goals that are important.

  • Of course that’s Gus: As Daniel Kahneman explained in his don’t-even-think-about-not-reading-it book, Thinking, Fast and Slow, when you run into your friend Gus, you don’t need evidence that it’s Gus. Unless you’re a schmuck or trapped in a horror movie, you see his face and that’s that.

It’s people who don’t know Gus who might ask to see his driver’s license.

  • The future: Those who plan for the future have definitely chosen the best period of time to plan for. Planning for the past is way too late; meanwhile, the present turns into the past before you’ve finished planning for it.

When you plan (for the future), evidence should be something you take into account, but cautiously. Your evidence is about the past, after all. When the future turns out to be like the past only more so, evidence is just the ticket. When the future turns out some other way, the evidence will have pointed you in the wrong direction.

Will the future look like the past? Good luck finding evidence to help you figure that out.

  • Confirmation bias: This is the formal term for when they accept without question evidence that supports what they want to be true while nitpicking to death evidence that supports your position.

They’re using evidence for ammunition, that is, not for illumination.

What’s harder is knowing when we are they. Here’s one clue: If you find yourself memorizing a point so as to win a future argument with one of them, you’re probably succumbing to confirmation bias. More to the point: If you’re reading about a subject and your goal isn’t to understand it more thoroughly, you’re turning into one of them.

  • Citing other people’s opinions: In the Internet age, someone else’s opinion often counts as evidence, and if not, someone else’s opinion about someone else’s opinion does.

An expert’s opinion is useful when it’s based on original research, or on research the expert has carefully reviewed.

But often a “consensus of the experts” is little more than aggregating a bunch of dumb looks. The averaged opinion of people who are experts in other subjects isn’t evidence. It’s just what a bunch of folks who know something about something else think.

  • Survey monkeys: Some surveys provide useful evidence. The rest just tell you what a bunch of anonymous respondents say they think about a subject.

We’re talking about people whose qualifications you don’t know and most likely the surveyors don’t know either, other than the “qualification” that they’re willing to take the time to respond to a survey.

Even for honest practitioners, opinion research is a complex field fraught with evidentiary landmines. And they’re a vanishing breed compared to a growing population of push-pollers who do everything possible to get the results their clients have asked for. So accept survey results with caution.

Are these cautionary limitations enough to persuade you evidence-based decision-making is a bad idea? I hope not, because they shouldn’t be. When evidence that passes these tests is available, take maximum advantage of it when making an important decision.

But don’t pretend when it isn’t. Sometimes the best you can do is make explicit assumptions and apply careful logic to them. When that’s the situation, do your best with it.

And don’t worry that it’s the best you can do.

Comments (8)

  • Sometimes, you may not know why something is right, but it is. 24 years ago, I got hired for my first IT programming job by a manager who very much believed in merit hiring, even though I had come in second on the programming test he gave the three finalists. To this day, he says he still doesn’t know why he picked me.

    Yet, I’d like to think it turned out very well for both of us and the companies we worked for. It turns out that we had great synergy and were a very productive team.

    Looking back at things, I’d say we had very similar values, but complementary work styles and temperaments, such that we were very honest with each other, but rarely rubbed each other the wrong way. But I don’t really know how you would collect evidence for these qualities.

  • Excellent point about the M in SMART goals!

  • All good points and more support for that old saw: “lies, damned lies, and statistics”. Kind of makes one wonder how a quality decision ever gets made…or is “quality decision” an oxymoron.

    Hmmm…

    Perhaps this is where one considers past accomplishments and experiences more important than a flood of facts. Maybe this is one of those cases where past performance IS indicative of future trends?

  • Bob:
    Very well done. This is a “Prime Rib” article that needs to be read and understood by today’s tethered society. Most of them have turned up the volume, but can’t see or hear the signal through the noise.

    Empirical data rules . . . again, well done and well said.

  • If you do not have any evidence, or you do not have sufficient or adequate evidence, it is appropriate to expend (invest?) significant energy in gathering evidence.

  • Nice job. Another point — sometimes a “consensus of the experts” is merely one expert’s opinion repeated by all the others.

  • @Michal: That’s basically a tautological statement. Don’t make a decision until you know enough to make a decision? Well … yeah. But the key is what counts as “enough”.

    Decision-making in the face of uncertainty (that’s pretty much every real-world action) requires an assessment of “coherence” — a measure of how tightly bounded the range of potential outcomes from a particular decision is. The decision between eating a ham sandwich and a chicken sandwich has high coherence — you shouldn’t worry about performing a detailed evidence-based study on relative enjoyment of sandwiches for white males aged 35-49.

    On the other hand, if you’re making a passenger airliner: yeah, you’ll want some good evidence on whether it will keep flying or not before you put it into the sky…

    If you can reasonably establish a worst-case scenario, then sometimes that’s the only evidence you need.

  • Albert Einstein on metrics: “Not everything that can be counted counts, and not everything that counts can be counted.”
