“When we ask advice, we are usually looking for an accomplice.” – Marquis de la Grange
Think fast! Even better, don’t.
From Daniel Kahneman’s Thinking, Fast and Slow:
“The following is a personality sketch of Tom W, written during Tom’s senior year of high school by a psychologist, on the basis of psychological tests of uncertain validity:
“Tom W is of high intelligence, although lacking in true creativity. He has a need for order and clarity, and for neat and tidy systems in which every detail finds its appropriate place. His writing is rather dull and mechanical, occasionally enlivened by somewhat corny puns and flashes of imagination of the sci-fi type. He has a strong drive for competence. He seems to have little feel and little sympathy for other people, and does not enjoy interacting with others. Self-centered, he nonetheless has a deep moral sense.”
Rank the following programs in order of the likelihood that Tom W is enrolled as a graduate student in each:
- Business administration
- Computer science
- Engineering
- Humanities and education
- Law
- Medicine
- Library science
- Physical and life sciences
- Social science and social work
Done? Now rank the same fields in order of popularity — in order, that is, of how many graduate students you’d expect to be enrolled in each of them.
If you’re like most people, you ranked computer science and engineering as Tom W’s most likely fields, even though you probably figured business administration and humanities and education are more popular overall.
It’s a beautiful example of just how easily we fool ourselves (me too, when I went through the exercise). We know Tom W’s psychological profile is unreliable, because this was explained to us before we read it. Even so, our reliance on stereotypes created a consistent narrative that overpowered our knowledge of basic statistics: If there are ten MBA enrollees for every grad student in computer science, the odds would be ten times higher that business administration is where we’d find Tom.
They’d still be higher even if Tom’s psychological profile were accurate. When choosing a graduate program, he, like everyone else, would take into account more than his stereotypical personality traits … his aptitudes, what he enjoys doing, and where he would expect the best career, for example.
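To see how hard the base rate bites, here’s a minimal back-of-the-envelope sketch. The enrollment shares and “fits the profile” percentages are made up for illustration, not drawn from Kahneman or from real data; the only point is that a large enrollment edge swamps a stereotype unless the profile is extremely diagnostic.

```python
# Minimal base-rate sketch (all numbers below are made up for illustration).
# A field's posterior weight is proportional to P(profile | field) * P(field).

fields = {
    # field: (share of all grad students, chance a student there "fits" Tom's profile)
    "business administration": (0.20, 0.05),
    "computer science": (0.02, 0.30),
}

# Unnormalized weight for each field, then normalize to get probabilities
weights = {name: base_rate * fit for name, (base_rate, fit) in fields.items()}
total = sum(weights.values())

for name, weight in weights.items():
    print(f"P({name} | profile) is roughly {weight / total:.2f}")
# Even though the made-up numbers say CS students fit the profile six times as often,
# business administration still wins (roughly 60/40) on its ten-to-one enrollment edge.
```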
I’ve spoken to quite a few groups by now on the importance of a culture of honest inquiry and the hazards of “intellectual relativism” and trusting your gut. My emphasis has been on factors that allow others to manipulate us, like:
- Tribalism – our tendency to divide the world into “us” and “them,” which encourages us to discount and disparage anything “they” have to say.
- Anger and fear, which make us stupid and gullible.
- Hyperlinks (aka footnotes), which work like this: If I say something in KJR and ask you to accept it simply because I say it’s true, you’ll likely be skeptical and evaluate it carefully. If, on the other hand, I say something is true and that you should accept it because this other person, who’s an expert in the subject, says so (the hyperlink), you’ll be far more likely to accept it at face value, even though you have no more reason to accept my assertion that the other feller is an expert than you had to accept my original proposition.
In any event, having read Kahneman’s book, I can tell you there’s a lot more to creating a culture of honest inquiry than avoiding overt manipulation. Here’s another obstacle, which you can demonstrate at your next all-hands meeting.
Divide the participants into two groups. Set up a flip chart for each of them, so neither group can see what the other is looking at. On the first group’s flip chart, write “25,000” in the middle of the page. On the second group’s chart, write “125,000.”
Give everyone a piece of paper and tell them to first write “yes” if they think the average salary in your department is higher than the number on the chart, or “no” if they think it’s lower. Then have them write their best guess as to the actual average salary, without comparing notes.
Assuming you have enough employees in each group to provide a reasonable statistical sample, it’s just about certain that the first group’s average estimate will come in much lower than the second group’s. The effect is called “anchoring,” and it’s remarkably difficult to overcome.
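If you want to tally the results afterward, here’s a minimal sketch, assuming you type each group’s salary guesses into two lists. The numbers shown are placeholders, not data from any real run of the exercise.

```python
# Minimal tally sketch for the flip-chart exercise.
# The lists below are placeholders; replace them with the guesses you actually collect.
saw_25k = [38000, 42000, 45000, 40000]     # group whose chart said "25,000"
saw_125k = [72000, 65000, 80000, 70000]    # group whose chart said "125,000"

def average(guesses):
    return sum(guesses) / len(guesses)

print(f"Average guess, 25,000 chart:  {average(saw_25k):,.0f}")
print(f"Average guess, 125,000 chart: {average(saw_125k):,.0f}")
# If anchoring is at work, the second average lands well above the first,
# even though both groups were estimating the same department's average salary.
```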
There’s an irony here: Many KJR subscribers will take my word for what Thinking, Fast and Slow has to say, instead of verifying it for themselves. It’s footnoting in action.
None of us has much choice. We’re all too busy to read everything we should, and so we rely on others to summarize for us. That’s okay.
Just make sure those you rely on are reliable, and not someone you trust because they’re a member of the right tribe.