“A waist is a terrible thing to mind.” – David Axelrod
Should changing minds change your mind?
What should we do when the experts change their minds?
Last week, KJR talked about NIST changing (or is it “updating”?) its longstanding advice to change passwords frequently.
The question of the hour is, does NIST changing its recommendation make it a more trustworthy source of expertise, or less?
The two obvious and most popular answers boil down to:
More worthwhile: I’d rather take advice from someone who’s constantly learning more about their field than from someone who learned something once and decided that’s all they need to know.
Less worthwhile: Why should I rely on advice that’s constantly changing? I’d rather rely on positions that don’t change with the time of day, phase of the moon, and the sun’s position in the zodiac.
Before continuing down this path on the information security front, let’s explore a better-known subject of ongoing controversy — the role of dietary fat in personal health.
There’s been a lot written on all sides of this question, so much so that it’s easy to figure that with no medical consensus, what the hell, I’m in the mood for a cheeseburger!
Me, I take a different position: I’m in the mood for a cheeseburger! Isn’t that what pills are for?
No, say the skeptics. There’s published research showing that statins don’t provide much medical benefit and, for that matter, that saturated fats aren’t at all toxic.
As my pre-statin LDLs were way out of whack, I have a personal stake in this, and so here are my personal guidelines for making sense of personal health, information security, or pretty much any other highly technical subject:
Ignore the divisive. Divisive language is easy to spot. Phrases like “the x crowd,” with x = a position you disagree with (“the First Amendment crowd,” or, adding 1, “the Second Amendment crowd”) are easy examples.
This sort of ridicule might be fun (strike that — it is fun) but it isn’t illuminating. Quite the opposite, it’s one of the many ways of dividing the world into us and them, and defining the “right answer” as the one “we” endorse.
Fools vs the informed vs experts. Fools believe what’s convenient. The informed read widely. Experts read original sources.
Fools … perhaps a better designation would be “the easily fooled” … have made confirmation bias a lifestyle choice. Faced with two opposing points of view they’ll accept without question the one they find agreeable while nitpicking the opposing perspective to death.
Those of us who try to remain informed read widely. We choose sources that don’t display obvious and extreme biases; that go beyond quoting experts to explain the evidence and logic those experts cite; and that provide links or citations to the original sources they draw on.
Especially, we deliberately counter our own confirmation biases by looking skeptically at any material that tells us what we want to believe.
Experts? They don’t form opinions from secondary sources. They read and evaluate the original works to understand their quality and reliability in detail.
There’s always an expert. Want to believe the earth is flat? There’s an “expert” out there with impressive credentials who will attest to it. Likewise the proposition that cigarettes are good for you, and, for that matter, that Wisconsin has jurisdiction over the moon on the grounds that the moon is made of cheese.
Someone’s ability to cite a lone expert is no reason to accept nonsense … see “confirmation bias,” above.
Preliminary studies are interesting, not definitive. For research purposes, statistical significance at the .05 level is generally sufficient for publication. But that threshold means that even when there’s no real effect, about one test in every 20 will come out “significant” through random chance alone.
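If you want to see that one-in-20 figure for yourself, here’s a minimal Python sketch. The group sizes and the choice of a two-sample t-test are my own illustrative assumptions, not anything drawn from the studies above; it just runs thousands of tests on pure noise and counts how many clear the .05 bar.

```python
# Simulate how often "significant" findings appear when there is no real effect.
# Illustrative assumptions: two groups of 30 drawn from the same normal
# distribution, compared with an ordinary two-sample t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_studies = 10_000
false_positives = 0

for _ in range(n_studies):
    group_a = rng.normal(loc=0.0, scale=1.0, size=30)
    group_b = rng.normal(loc=0.0, scale=1.0, size=30)  # same distribution, so no true effect
    _, p_value = stats.ttest_ind(group_a, group_b)
    if p_value < 0.05:
        false_positives += 1

print(f"'Significant' results from pure noise: {false_positives / n_studies:.1%}")
# Expect roughly 5% -- about one null study in every 20 clears the bar by chance.
```

Run it a few times with different seeds and the rate hovers around 5%, which is the whole point: a single preliminary “significant” result is worth following up, not settled.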
Desire to learn vs fondness for squirrels. Ignoring new ideas and information is a sign of ossification, not expertise. But being distracted by every squirrel — changing metaphors, jumping on every new bandwagon because it’s new and exciting — isn’t all that smart either. Automatic rejection and bandwagoning have a lot in common, especially when the rejection or bandwagon appeals to your … yes, you know what’s coming … confirmation bias.
Ignoring changing conditions. No matter what opinion you hold and what policies you advocate, they’re contextual. Situations change, and when they do, the answers we worked so hard to master can turn out to be wrong.
The world has no shortage of people who refuse to acknowledge change for exactly that reason. But relying on answers designed for the world as it used to be leads to the classic military mistake of “fighting the last war.”
Except that nobody ever fights the last war. They prepare to fight the last war. That’s why they lose the next war.
These are my guidelines. Use them as you like, but please remember:
I’m no expert.