Pop quiz!

Question #1: In the past 20 years, the proportion of the world population living in extreme poverty has (A) almost doubled; (B) remained more or less the same; (C) almost halved.

Question #2: Worldwide, 30-year-old men have spent 10 years in school. How many years have women of the same age spent in school? (A) 9 years; (B) 6 years; (C) 3 years.

The correct answers are C and A. If you got them wrong, you have a lot of company. Across a wide variety of groups worldwide, faced with these and many more questions with factual answers, people do far worse than they would by choosing responses at random.

Which brings us to the next addition to your KJR bookshelf: Factfulness: Ten Reasons We’re Wrong About the World — and Why Things Are Better Than You Think (Hans Rosling with Ola Rosling and Anna Rosling Rönnlund, Flatiron Books 2018). Unlike books that rely on cognitive science to explain why we’re all so illogical so often, Rosling focuses on the how of it. Factfulness is about the mistakes we make when data are available to guide us but, for one reason or another, we don’t consult them to form our opinions. Viewed through this lens, it appears we’re all prone to these ten bad mental habits:

  1. Gaps: We expect to find chasms separating one group from another. Most of the time the data show a continuum. Our category boundaries are arbitrary.
  2. Negativity: We expect news, and especially trends, to be bad.
  3. Extrapolation: We expect trend lines to be straight. Most real-world trends are S-shaped, asymptotic, or exponential.
  4. Fear: What we’re afraid of and what the most important risks actually are often don’t line up.
  5. Size: We often fall for numbers that seem alarmingly big or small, but for which we’re given no scale. Especially, we fall for quantities that are better expressed as ratios.
  6. Generalization: We often use categories to lump unlike things together inappropriately, or fail to lump like things together. Likewise, we imagine an anecdote or individual is representative of a category we’ve more or less arbitrarily assigned them to, when it would be just as reasonable to consider them a member of an entirely different group.
  7. Destiny: It’s easy to think people are in the circumstances they’re in because it’s inevitable. In KJR-land we’ve called this the Assumption of the Present.
  8. Single Perspective: Beware the hammer and nail error, although right-thinking KJR members know the correct formulation is “If all you have are thumbs, every hammer looks like a problem.” Rosling’s advice: Make sure you have a toolbox, not just one tool.
  9. Blame: For most people, most of the time, assigning it is our favorite form of root-cause analysis.
  10. Urgency: The sales rep’s favorite. In most situations we have time to think, if we’d only have the presence of mind to use it. While analysis paralysis can certainly be deadly, mistaking reasonable due diligence for analysis paralysis is at least as problematic.
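The fifth habit — falling for a big number with no scale — is easy to demonstrate with a few lines of arithmetic. Here is a minimal sketch using made-up numbers purely for illustration (nothing below comes from the book): a raw count looks alarming until it’s divided by the relevant denominator.

```python
# Illustrative only: the counts and populations below are invented,
# not real data. The point is the arithmetic, not the numbers.

def per_capita(count, population):
    """Express an absolute count as a rate per 100,000 people."""
    return count / population * 100_000

# Hypothetical: country A reports far more total incidents than country B...
incidents = {"A": (9_000, 50_000_000), "B": (1_200, 3_000_000)}

for name, (count, pop) in incidents.items():
    rate = per_capita(count, pop)
    print(f"{name}: {count:,} incidents, {rate:.1f} per 100,000 people")
# ...but B's per-capita rate (40.0) is more than double A's (18.0).
```

Which is exactly Rosling’s prescription: before reacting to a number, ask “9,000 out of how many?”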

The book certainly isn’t perfect. There were times that, adopting my Mr. Yeahbut persona, I wanted to strangle the author, or at least have the opportunity for a heated argument. Example:

Question #3: In 1996, tigers, giant pandas, and black rhinos were all listed as endangered. How many of these three species are more critically endangered today? (A) Two of them; (B) One of them; (C) None of them.

The answer is C — none are more critically endangered, which might lead an unwary reader to conclude we’re making progress on mass species extinction. It made me wonder why Rosling chose these three species and not, say, hawksbill sea turtles, Sumatran orangutans, and African elephants, all of which are more endangered than they were twenty years ago.

Yeahbut, this seems like a deliberate generalization error to me, especially as, in contrast to the book’s many data-supported trends, it provides no species loss trend analysis.

But enough griping. Factfulness is worth reading just because it’s interesting, and surprisingly engaging given how hard it is to write about statistical trends without a soporific result.

It also illustrates well why big data, analytics, and business intelligence matter, providing cautionary tales of the mistakes we make when we don’t rely on data to inform our opinions.

I’ll finish with a Factfulness suggestion that would substantially improve our world, if only everyone would adopt it: In the absence of data it’s downright relaxing to not form, let alone express, strongly held opinions.

Not having to listen to them? Even more relaxing.

Comments (12)

  • Looking at question #3: it seems the problem here is related to the problem you’ve often discussed – getting what you measure.

    Seems to me a more relevant and useful set of questions would be: “1. For vertebrates (mammals, birds, reptiles, amphibians, and fishes) that were endangered in X year, how many species are more endangered, less endangered or in about the same status in 2019?” I use vertebrates because I believe there’s more information available about them, but you could expand that to chordates or even to the entire animal kingdom, in theory.

    “2. Of the vertebrate (or chordate, or animal) species known to exist in 1996, how many were endangered then, and how many are endangered or extinct today?”

    “3. What percentage of known vertebrates (or chordates, or animals) for each point in time (1996 and 2019) were classified as endangered?”

    All of those tell you a lot more about the status of endangered species than simply measuring three chosen examples.

  • Factfulness takes all the fun out of making assumptions. LOL

  • Great antidote to “fake thinking”. Good stuff.

  • “In the absence of data it’s downright relaxing to not form, let alone express, strongly held opinions.

    Not having to listen to them? Even more relaxing.”

    In my current situation, the LAST thing I need is relaxation. What I really need is frenzied activity — the more useless and wasteful the better!

    In line with that oh-so-useful syllogism… “Something MUST be done, THIS is SOMETHING, therefore THIS must be done”…

    What I really need right now is lots and lots of exercise in the form of LEAPING TO CONCLUSIONS! Up! And down! And up! And down! Wheeeee!

    And NOT forming opinions, in the absence of data to inform them, would interfere with this goal (and it certainly isn’t a mere STRATEGY towards a goal; wastefulness is the actual goal itself).

    /s (on the off chance anybody was wondering)

  • Factoids are another class of error.

    Read the Washington Post today: meteorologists were complaining that folks don’t even know what county they are in, “thereby putting themselves in danger from tornadoes.” They referred to a study showing 85% of Americans could not place themselves within 50 miles on a map showing only state lines, with no other identifying features such as roads or rivers.

    That makes us seem dumb.

    1. What percentage of Americans in the 1920s or 1950s could find themselves on a map? Probably the same or fewer.

    2. Even if everybody could find themselves on a map and _knew_ fer sure the bad weather was coming to their county and not somebody else’s county, (the problem statement), how many lives would actually be saved? I’m thinking zero.

    But it makes for a good article about how dumb people are and how they “put themselves in danger” from tornadoes by incorrectly identifying the county they live in.

    • Huh. I have to wonder what value a map with no lines and labels has in the first place. On the other hand, knowing what county you live in could be important, as severe weather warnings are usually reported on a county-by-county basis.

      • The article did say the warnings are given out county by county.
        So what? A county is huge. If I’m at home, I don’t go out if the weather looks nasty. If it doesn’t look nasty, then maybe the tornado is at the other end of the county.

        Assume _everyone_ knows what county he or she is in at the time a warning is issued. How many lives are now saved? I’m still thinking it’s closer to zero than to ten.

        It’s a factoid that implies far more than is warranted.

  • Where on the list are the criteria for news reporting, which is where most of us get our information? Or do all 10 contribute to making the news, which heightens the effect on us news-consumers? Things that stay the same aren’t news. Things that get better are less fearful and urgent than things that are getting worse. Humans are primed to react quickly to potentially harmful events as a survival mechanism.

    On the other hand, humans are also drawn to positive things. I wonder what the sales would be for a news source that emphasized positive news? Would people flock to it as a change and relief from the negativity? Or would they think that it is mostly not relevant to their lives?

    In response to the article’s first question, I knew that global poverty rates were down. I also know that income inequality in the US has increased in the last 20-40 years (I don’t know about absolute poverty rates in the US). Which is more relevant to me: global poverty or my country’s (current) trend toward income inequality? The less an issue immediately concerns me, the less I am likely to know any facts about it. “Relevance” isn’t one of the factors. Possibly because that doesn’t count as a “bad” mental habit?

    • Rosling devotes quite a lot of space to “the media,” though it’s more rant than analysis. His primary complaint seems to be that the news media report the news … events that are interesting and out of the ordinary … which means the news doesn’t report long-term trends but does report events that run counter to the trend.

      I didn’t mention this on the grounds that critiquing his critique of the media would have taken more space than I could devote to it in KJR’s admittedly arbitrary length limit.

      But for whatever it’s worth, it seems to me that complaining about the news media failing to cover subjects-that-aren’t-news is a bit like complaining that a dog can’t pull a plow.

  • I love so many things about Hans Rosling, including my favorite TED talk by him, about the magic washing machine.

    It’s not directly applicable to the IT world, but I find that his approach helps a lot when talking to people about climate change and the tools we have to fix at least some of it. The big one is nuclear energy, but even that is just one tool in the box. So many people want to leave it out because “what about the waste,” “it’s so expensive,” or “it’s too risky/dangerous,” all of which are prime examples of opinions formed without the help of the (plentiful) applicable data.

    Also, this. https://twitter.com/prchovanec/status/718886774152540161

    • I’m not against nuclear power as one of the tools in the toolbox. I am concerned that its proponents, and I include Rosling in this, fail to understand a core principle of risk management, which is that successful prevention is indistinguishable from absence of risk. Nuclear power is safe precisely because of the very public concerns about it.

      I’d say the problems forecast for inadequate nuclear waste storage strategies also fall into this category: They’re solvable but not solved because the potential damage is gradual and cumulative, but the measures needed for prevention require funding right now.
