Politics, according to Larry Hardiman, whoever he is, comes from the Greek “poly,” meaning many, and “tick,” meaning blood-sucking parasite.

It’s hardly a fair characterization. Politics is the art of finding a path forward for a collection of people who disagree … often strongly … about even such matters as where forward is.

But (news flash!) politics can get ugly. One of the many ways it gets ugly is politicizing topics that aren’t intrinsically political.

Take, for example, information security (you thought I was going to dive into climate change or evolution by natural selection, didn’t you?).

In the unpoliticized world, information security is (to oversimplify things more than just a bit) a matter of making rational choices about protecting an organization’s data and applications portfolio from intruders wanting to steal the former or alter the latter.

What makes these decisions interesting is that with very few exceptions, every additional increment of protection raises not just the direct cost of security, but also barriers that impede the flow of work, making an organization just that much less nimble than it would otherwise be.

Information security should be, that is, a collection of deliberately chosen trade-offs between risk and cost on one side and effectiveness on the other.

It’s how information security works in companies where the need to avoid blame hasn’t irretrievably politicized it. Which is to say, out of the 100,000 or so U.S. businesses with 100 or more employees, it’s how roughly 142 practice the discipline. For all the remaining 99,858 businesses, information security is a politicized mess. (I arrived at this number by typing three keys on the numeric keypad with my eyes closed. I challenge you to arrive at a more accurate estimate.)

Because the driving force is blame avoidance, the way it plays out is that instead of making trade-offs between cost and risk on the one side and running an effective business on the other, InfoSec goes into full prevent mode: Plug every hole, address every risk, and, most of all, require every password to be at least 42 characters long and chock-full of punctuation marks, numbers, and both capital and lowercase letters. In this world, password strength is measured by memorizability: If an employee can remember a password, it isn’t strong enough.
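To see what that mindset looks like by the time it reaches a keyboard, here’s a minimal sketch in Python. The 42-character rule comes straight from the paragraph above; the function name, the other thresholds, and the sample passwords are invented for illustration, not taken from any real policy.

```python
import string

# A hypothetical "prevent everything" password policy, as satirized above.
# The rule names and thresholds are invented for illustration only.
MIN_LENGTH = 42

def satisfies_policy(password: str) -> bool:
    """Return True only if the password clears every checkbox.

    Note what is never checked: whether anyone could remember the result,
    or whether it resists real attacks better than a long passphrase would.
    """
    return (
        len(password) >= MIN_LENGTH
        and any(c in string.punctuation for c in password)
        and any(c.isdigit() for c in password)
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
    )

if __name__ == "__main__":
    # A memorable passphrase fails; line noise passes.
    print(satisfies_policy("correct horse battery staple rides again"))      # False
    print(satisfies_policy("Xq7!mZ2@rP9#kL4$wN8%vB3^hT6&yJ1*dF5(gS0)uC42"))  # True
```

Every check measures inconvenience; none measures what an attacker would actually have to overcome.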

All of which leads to frustration. I’m not referring to the frustration business users feel, although that is generally intense. I’m referring to the frustration InfoSec experiences because it never has enough budget or authority to prevent all possible mishaps from occurring, even though it will always be blamed for anything that goes wrong.

And things will go wrong, not only in spite of InfoSec’s efforts, but also because of them. The reason is simple and predictable: Raise barriers enough, and employees stop seeing them as protections they should respect and start seeing them as impediments they should work around. Writing passwords on Post-it notes is just the most visible example. Dropbox, jump drives, email attachments, and all the other ways employees manage to take files with them so they can get work done are probably more significant.

It would all work much better if InfoSec collaborated with employees to find secure ways to do what they need to do … maybe not perfectly secure, but secure enough; certainly more secure than the work-arounds.

But InfoSec can’t, because when blame avoidance is the primary goal, secure enough will never be secure enough. Far better to experience a breach and be able to say, “They violated our password policy,” than to have to respond to an outside security audit that reports a theoretical vulnerability in a solution InfoSec instituted deliberately.

The point of this column isn’t limited to information security … a subject about which I know just barely enough to raise the points made above.

No, the point is the hidden costs of a culture of blame. They’re enormous.

When a company has a culture of blame, employees expend quite a lot of their time and energy, and cost center managers expend quite a lot of their budget, time, and energy, doing whatever they can to make sure the Finger That Points points at someone else if something goes wrong.

A small part of that time, energy, and money goes into making the business more effective. Far more goes to CYGM (cover your gluteus maximus) activities that do little other than provide documentation that whatever went wrong is Someone Else’s Fault.

It’s a problem worth solving. How? That will have to wait until next week.

Al Gore was right.

Oh, don’t be like that. When a man is right, he’s right whether you like him, hate him, or feel intense apathy about him. A couple of years ago Gore published The Assault on Reason, and every day brings another example of People We’re Supposed to Take Seriously swinging another baseball bat at Reason’s head.

Take, for example, Jonah Lehrer’s “Trials and Errors: Why Science Is Failing Us,” in the January 2012 edition of Wired. Lehrer strings together some high-profile examples of scientific theories turning out to be wrong, adds erudite-sounding invocations of philosophers like David Hume, and concludes that all scientific accounts of causation are just empty storytelling.

As an antidote, read the Wikipedia entry on Karl Popper. He’s the father of modern scientific epistemology, but somehow didn’t rate even a mention in Lehrer’s article. What you’ll learn is that in our search for truth, the best we’re able to manage is a collection of ideas nobody has managed to prove false.

That’s what science is — causal “stories” (they’re called theories). Some are new and have been subjected to just a few challenges. Others have been battle-tested over a span of decades or even centuries by large numbers of researchers, some brighter than anyone reading these words (not to mention the person writing them); a few brighter than everyone reading these words put together.

Fail to prove an idea wrong enough different times and in enough different ways (fail to falsify it, in Popper’s terminology), and scientists start to feel confident they’re on the right track.

Not certain, but confident, because it isn’t a certain process. Even when scientists get something right, “right” can turn out to be a special case of a theory that covers more ground. That’s how it turned out for Newton’s theory of gravity. It’s useful enough when you need to build, say, a building that doesn’t fall down, but not sufficient for something more complicated, like a global positioning system, whose satellite clocks drift measurably unless they’re corrected for relativistic effects Newton’s theory doesn’t predict.
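How much more complicated? Here’s a rough back-of-the-envelope sketch in Python, using rounded textbook constants, a circular orbit, and a point-mass Earth (my simplifications, not anyone’s official GPS math), to estimate the two relativistic effects on a GPS satellite’s clock.

```python
import math

# Back-of-the-envelope estimate of why GPS needs more than Newtonian gravity.
# Constants are rounded textbook values; the orbit is treated as circular and
# the Earth as a point mass, so treat the output as an approximation.
GM = 3.986004e14          # Earth's gravitational parameter, m^3/s^2
C = 2.99792458e8          # speed of light, m/s
R_EARTH = 6.371e6         # mean Earth radius, m
R_SAT = 2.6571e7          # approximate GPS orbital radius, m
SECONDS_PER_DAY = 86400

# Special relativity: the satellite moves fast, so its clock runs slow.
v = math.sqrt(GM / R_SAT)                             # circular orbital speed
slow_per_day = (v**2 / (2 * C**2)) * SECONDS_PER_DAY

# General relativity: the satellite sits higher in Earth's gravity well,
# so its clock runs fast relative to clocks on the ground.
fast_per_day = (GM * (1 / R_EARTH - 1 / R_SAT) / C**2) * SECONDS_PER_DAY

net = fast_per_day - slow_per_day
print(f"velocity effect:      -{slow_per_day * 1e6:.1f} microseconds/day")
print(f"gravitational effect: +{fast_per_day * 1e6:.1f} microseconds/day")
print(f"net clock drift:      +{net * 1e6:.1f} microseconds/day")
# Roughly +38 microseconds/day; left uncorrected, that's kilometers of
# position error per day, which is why GPS clocks are adjusted for relativity.
```

Newton gets the orbit roughly right; it’s the clocks that give him away.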

Citing science’s limitations is easy, which has led the easily fooled to the foolish conclusion that we should ignore what the scientific method tells us — foolish because no one has offered an alternative that works anywhere near as well, let alone better.

It’s New Year’s Resolutions time and I have one for you. (Yes, I have some for myself too, but suggesting ways for other people to improve themselves is so much more fun …) It’s to foster a culture of honest inquiry … in the business culture you influence (KJR’s proper scope) and also in your social circles and family, if that isn’t too much to ask.

It’s harder than you might think, for two interconnected reasons: (1) All culture change starts with changes to your own behavior; and (2) we’re all wired to reach the wrong conclusion under a scarily wide variety of circumstances. For example:

Did you know that in the United States, the highest incidences of kidney cancer are found in small towns?

It’s true. What conclusion do you draw from this? Probably not that it has to be this way as a matter of pure, random chance, but that’s the actual explanation. Don’t believe me? Here’s another, equally true statement: The lowest incidences of kidney cancer are found in small towns.

The way randomness works is that small samples exhibit more variation than large samples. So large metropolitan areas … big samples of the U.S. population … will all have incidences of kidney cancer very close to the overall national mean. Small towns, each a small sample, will vary more widely, so some will have an incidence much lower than the national mean while others will have an incidence much higher.
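Don’t take my word for it; simulate it. The sketch below (Python; the incidence rate and the town and city populations are made up purely for illustration) gives every town and every city the identical underlying risk and then reports the extremes each group produces.

```python
import random

# Simulate kidney-cancer incidence in towns of different sizes.
# Every town has the SAME underlying risk; only sample size differs.
# The rate and the populations are invented for illustration.
TRUE_RATE = 0.0002
random.seed(42)

def observed_rate(population: int) -> float:
    """Observed incidence per capita in one simulated town."""
    cases = sum(random.random() < TRUE_RATE for _ in range(population))
    return cases / population

small_towns = [observed_rate(2_000) for _ in range(500)]
big_cities = [observed_rate(1_000_000) for _ in range(10)]

print("small towns:  min %.5f  max %.5f" % (min(small_towns), max(small_towns)))
print("big cities:   min %.5f  max %.5f" % (min(big_cities), max(big_cities)))
# The small towns produce both the lowest and the highest observed rates;
# the big cities all land very close to the true rate of 0.0002.
```

The only thing that differs between the two groups is sample size, which is the whole point.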

Even professional statisticians get this sort of thing wrong if they aren’t on their guard for it. That lapse is documented, along with about a zillion other places where our ability to draw the correct conclusion falls by the wayside, in your must-read book for 2012: Daniel Kahneman’s Thinking, Fast and Slow.

The title comes from the author’s well-tested and thus-far not falsified theory that humans rely on two separate decision-making systems. One is quick, heuristic, error-prone, and so effortless we’re often unaware it’s even operating. The other is slow, requires significant effort and concentration, and takes proper stock of evidence and logic.

Do you want to encourage a culture of honest inquiry? It means engaging the slow, effortful system as much as you can. That requires an environment that provides enough time, and sufficiently few distractions, to allow it.

It will be an uphill battle, but one eminently worth fighting.