ManagementSpeak: While in principle I agree with you …
Translation: Let’s do it my way anyway.
Earl Nolan joins the KJR Club by proving himself a principled observer of the management lexicon.
Al Gore was right.
Oh, don’t be like that. When a man is right, he’s right whether you like him, hate him, or feel intense apathy about him. A couple of years ago Gore published The Assault on Reason, and every day brings another example of People We’re Supposed to Take Seriously swinging another baseball bat at Reason’s head.
Take, for example, Jonah Lehrer’s “Trials and Errors: Why Science Is Failing Us,” in the January 2012 edition of Wired. Lehrer strings together some high-profile examples of scientific theories turning out to be wrong, adds erudite-sounding invocations of philosophers like David Hume, and concludes that all scientific accounts of causation are just empty storytelling.
As an antidote, read the Wikipedia entry on Karl Popper. He’s the father of modern scientific epistemology, but somehow didn’t rate even a mention in Lehrer’s article. What you’ll learn is that in our search for truth, the best we can do is assemble a collection of ideas nobody has yet managed to prove false.
That’s what science is — causal “stories” (they’re called theories). Some are new and have been subjected to just a few challenges. Others have been battle-tested over a span of decades or even centuries by large numbers of researchers, some brighter than anyone reading these words (not to mention the person writing them); a few brighter than everyone reading these words put together.
Fail to prove an idea wrong enough times, in enough different ways (fail to falsify it, in Popper’s terminology), and scientists start to feel confident they’re on the right track.
Not certain, but confident, because it isn’t a certain process. Even when scientists get something right, “right” can turn out to be a special case of a theory that covers more ground. That’s how it turned out for Newton’s theory of gravity. It’s useful enough when you need to build, say, a building that doesn’t fall down, but not sufficient for something more complicated, like a global positioning system, whose satellite clocks drift out of sync unless you correct for relativistic effects Newton’s theory doesn’t predict.
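If you want to see the scale of the problem, here’s a back-of-envelope sketch in Python, using rounded textbook constants (the orbital radius, gravitational parameter, and so on are standard published values, not anything from GPS engineering documents). It estimates how far a GPS satellite’s clock drifts each day if you stop at Newton, and how big a ranging error that drift would cause:

```python
# Back-of-envelope estimate (rounded textbook constants) of why Newtonian
# physics isn't enough for GPS: satellite clocks run fast relative to
# ground clocks, and the error compounds into kilometers per day.
import math

C = 2.998e8          # speed of light, m/s
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # Earth's radius, m
R_ORBIT = 2.656e7    # GPS orbital radius, m
DAY = 86_400         # seconds per day

v = math.sqrt(GM / R_ORBIT)                        # orbital speed, ~3.9 km/s

# Special relativity: a moving clock runs slow by roughly v^2 / (2 c^2).
sr_per_day = -(v ** 2) / (2 * C ** 2) * DAY

# General relativity: a clock higher in Earth's gravity well runs fast.
gr_per_day = GM * (1 / R_EARTH - 1 / R_ORBIT) / C ** 2 * DAY

net = sr_per_day + gr_per_day
print(f"Net clock drift: {net * 1e6:+.1f} microseconds/day")
print(f"Uncorrected ranging error: ~{C * net / 1000:.0f} km/day")
```

The two effects don’t cancel; the net drift is on the order of tens of microseconds per day, which multiplied by the speed of light works out to kilometers of position error every day. Newton’s “right” is a special case; GPS lives outside it.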
Citing science’s limitations is easy, which has led the easily fooled to the foolish conclusion that we should ignore what the scientific method tells us — foolish because no one has offered an alternative that works anywhere near as well, let alone better.
It’s New Year’s Resolutions time and I have one for you. (Yes, I have some for myself too, but suggesting ways for other people to improve themselves is so much more fun …) It’s to foster a culture of honest inquiry … in the business culture you influence (KJR’s proper scope) and also in your social circles and family, if that isn’t too much to ask.
It’s harder than you might think, for two interconnected reasons: (1) All culture change starts with changes to your own behavior; and (2) we’re all wired to reach the wrong conclusion under a scarily wide variety of circumstances. For example:
Did you know that in the United States, the highest incidences of kidney cancer are found in small towns?
It’s true. What conclusion do you draw from this? Probably not that it has to be this way as a matter of pure, random chance, but that’s the actual explanation. Don’t believe me? Here’s another, equally true statement: The lowest incidences of kidney cancer are found in small towns.
The way randomness works is that small samples exhibit more variation than large samples. So large metropolitan areas … big samples of the U.S. population … will all have incidences of kidney cancer very close to the overall national mean. Small towns, each a small sample, will vary more widely, so some will have an incidence much lower than the national mean while others will have an incidence much higher.
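You can watch this happen in a few lines of simulation. The numbers below are made up purely for illustration — every “town” and “city” is assigned the same true incidence rate — yet the highest and lowest observed rates both come from the small towns, simply because small samples bounce around more:

```python
# Every place below has the SAME true incidence rate. The extremes -- both
# high and low -- still show up in the small towns, because small samples
# vary more than large ones.
import numpy as np

rng = np.random.default_rng(42)
TRUE_RATE = 1e-4                              # identical everywhere

pops = np.concatenate([
    np.full(500, 2_000),                      # many small towns
    np.full(20, 2_000_000),                   # a few big cities
])
labels = ["small town"] * 500 + ["big city"] * 20

cases = rng.binomial(pops, TRUE_RATE)         # cases are pure chance
rates = cases / pops

order = np.argsort(rates)
print("Lowest observed rates: ",
      [(labels[i], f"{rates[i]:.4%}") for i in order[:3]])
print("Highest observed rates:",
      [(labels[i], f"{rates[i]:.4%}") for i in order[-3:]])
```

Run it and the big cities all land close to the true rate, while the small towns scatter from zero to several times the national figure — no carcinogens in the water required.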
Even professional statisticians get this sort of thing wrong if they aren’t on their guard for it, as is documented, along with about a zillion other places where our ability to draw the correct conclusion falls by the wayside, in your must-read book for 2012 — Daniel Kahneman’s Thinking, Fast and Slow.
The title comes from the author’s well-tested and thus-far not falsified theory that humans rely on two separate decision-making systems. One is quick, heuristic, error-prone, and so effortless we’re often unaware it’s even operating. The other is slow, requires significant effort and concentration, and takes proper stock of evidence and logic.
Do you want to encourage a culture of honest inquiry? It means engaging the slow, effortful system as much as you can. That requires an environment that provides enough time, and sufficiently few distractions, to allow it.
It will be an uphill battle, but one eminently worth fighting.