Corporate values have no value.

It isn’t the values themselves. It’s the printed statements that purport to represent the corporation’s values that are worthless.

An HR Magazine piece titled “Evaluating Values” (Kathryn Tyler, 4/1/2011) disagrees. It endorses formal values statements while contending they aren’t sufficient. Companies should also require managers to explain them and, even more important, should “hold employees accountable” for them as part of the performance appraisal process.

It provides an example: Eastern Idaho Regional Medical Center (EIRMC), whose values are Accountability, “EIRMC and I CARE” (don’t worry about it), Integrity, Respect, Quality, Loyalty, and Enjoyment.

It’s an information-free list. It couldn’t be anything else, unless you think a company might instead list values like Keeping your head down, Deviousness, Disrespect, Sloppy work, Apathy, and Surliness.

It’s information free because it provides no guidance for making difficult choices.

Take quality. Here’s EIRMC’s definition:

Anticipates the needs of those served; craves new knowledge and new experience; delivers very best every day such that work makes a difference; when identifies a problem, also identifies potential solutions; constantly looks for ways to turn “good enough” into “even better.”

EIRMC is a hospital, so let’s look at a real-world challenge hospitals are facing right now: anesthesiologists and nurse anesthetists diverting anesthetics from their patients for their own use. For example, here in Minnesota, Sarah May Casareto, a nurse anesthetist, allegedly stole a patient’s fentanyl to feed her dependency, leaving him under-anesthetized for his kidney-stone surgery. (I say “allegedly” because she entered an “Alford plea,” which means she claims innocence, agrees the evidence is sufficient to convict, and, after three years of probation, has her criminal record wiped clean.)

The surgeon removed the kidney stones anyway, while a technician held the screaming, writhing patient down. After the surgery, the technician, who had spotted two syringes in Casareto’s pocket and, along with the surgeon, had noticed her erratic behavior, reported her, leading to her eventual termination.

All hospitals are vulnerable to impaired caregivers, EIRMC included. And yet, its definition of “quality” doesn’t even hint that surgery patients shouldn’t writhe and scream. Like most Values Statements I’ve seen, it covers feel-good topics while ignoring what matters most — in this case, that any employee spotting a serious lapse in care should escalate the matter immediately, to whatever extent necessary, with full protection from consequences and without regard to whose name is on the problem. (The plaintext version: If an employee notices that a surgeon is drunk, that employee should do something about it, without fear of repercussion.)

We in IT rarely face situations this potentially grave. If a systems administrator arrives for work with a crippling hangover, no patient will have the wrong kidney removed because of it. That doesn’t let you off the hook. Even in IT, staff can find themselves in ethically questionable situations.

There was, for example, the company that instructed programmers to write one-time patch programs that posted $1 billion a month in unaudited transactions.

No, not illegitimate. Unaudited. This was a relatively high-integrity firm, as these things go. A series of decisions made over a ten-year span, each of which seemed to make sense at the time, made this improvisation necessary.

The assignment wasn’t illegal. It was the response to a difficult situation in which fraud played no part. Refusing it would have been career-limiting at best. But several years later, the company restated its balance sheet to the tune of several billion dollars.

To be fair, I don’t know whether this monthly practice contributed to the problem. Probably no one does. What I do know is that the company did have a formal values statement, for all the good it did, which was none. It didn’t help the developers make the right decision, because it provided no guidance as to how the company defined “right” in circumstances like this.

Want to run an organization with strong values? My old employer, Perot Systems, showed me how it’s done. Its leaders created a “gray zone” training program, and every employee was expected to participate. It consisted of realistic, ethically confounding situations. Employees role-played them (yes, role-play; it has its place), then followed up with serious, in-depth discussions about what had just happened and what should have happened.

It was a significant investment in establishing values. It did more than make the values clear. More important, it established that to the company’s leaders, values mattered, and they recognized that the right ethical choice isn’t always simple and obvious. We all got the message, for a simple reason: The program was expensive.

When a company puts its money where its mouth is, that tells employees it takes the subject seriously.

Al Gore was right.

Oh, don’t be like that. When a man is right, he’s right whether you like him, hate him, or feel intense apathy about him. A few years ago Gore published The Assault on Reason, and every day brings another example of People We’re Supposed to Take Seriously swinging another baseball bat at Reason’s head.

Take, for example, Jonah Lehrer’s “Trials and Errors: Why Science Is Failing Us,” in the January 2012 edition of Wired. Lehrer strings together some high-profile examples of scientific theories turning out to be wrong, adds erudite-sounding invocations of philosophers like David Hume, and concludes that all scientific accounts of causation are just empty storytelling.

As an antidote, read the Wikipedia entry on Karl Popper. He’s the father of modern scientific epistemology, but somehow didn’t rate even a mention in Lehrer’s article. What you’ll learn is that in our search for truth, the best we’re able to manage is a collection of ideas nobody has managed to prove false.

That’s what science is — causal “stories” (they’re called theories). Some are new and have been subjected to just a few challenges. Others have been battle-tested over a span of decades or even centuries by large numbers of researchers, some brighter than anyone reading these words (not to mention the person writing them); a few brighter than everyone reading these words put together.

Fail to prove an idea wrong enough different times in enough different ways (fail to “falsify” it, in Popper’s terminology), and scientists start to feel confident they’re on the right track.

Not certain, but confident, because it isn’t a certain process. Even when scientists get something right, “right” can turn out to be a special case of a theory that covers more ground. That’s how it turned out for Newton’s theory of gravity. It’s useful enough when you need to build, say, a building that doesn’t fall down, but not sufficient for something more complicated, like, say, a global positioning system.
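
To put a rough number on that GPS point, here’s a back-of-the-envelope sketch in Python. It uses standard textbook constants and an idealized circular orbit; the specific figures are mine, not anything from the column or from Lehrer’s article, and the calculation is deliberately crude.

```python
import math

# Standard textbook constants (approximate values).
C = 299_792_458.0            # speed of light, m/s
GM_EARTH = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6            # mean Earth radius, m
R_ORBIT = 2.656e7            # GPS orbital radius (~20,200 km altitude), m
SECONDS_PER_DAY = 86_400

# General relativity: a clock higher in Earth's gravity well runs faster.
grav_rate = (GM_EARTH / C**2) * (1.0 / R_EARTH - 1.0 / R_ORBIT)

# Special relativity: a clock moving at orbital speed runs slower.
v_orbit = math.sqrt(GM_EARTH / R_ORBIT)       # ~3.9 km/s for a circular orbit
vel_rate = v_orbit**2 / (2.0 * C**2)

net_rate = grav_rate - vel_rate               # net fractional clock drift
drift_per_day = net_rate * SECONDS_PER_DAY    # seconds of drift per day
range_error_per_day = drift_per_day * C       # timing error becomes ranging error

print(f"Net satellite clock drift: {drift_per_day * 1e6:.1f} microseconds per day")
print(f"Uncorrected ranging error: {range_error_per_day / 1000:.1f} km per day")
```

The drift works out to roughly 38 microseconds a day, which at the speed of light is on the order of 11 kilometers of accumulated error per day. Newton’s version of gravity predicts none of it; a receiver that ignored relativity would be useless within hours.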

Citing science’s limitations is easy, which has led the easily fooled to the foolish conclusion that we should ignore what the scientific method tells us — foolish because no one has offered an alternative that works anywhere near as well, let alone better.

It’s New Year’s Resolutions time and I have one for you. (Yes, I have some for myself too, but suggesting ways for other people to improve themselves is so much more fun …) It’s to foster a culture of honest inquiry … in the business culture you influence (KJR’s proper scope) and also in your social circles and family, if that isn’t too much to ask.

It’s harder than you might think, for two interconnected reasons: (1) all culture change starts with changes to your own behavior, and (2) we’re all wired to reach the wrong conclusion under a scarily wide variety of circumstances. For example:

Did you know that in the United States, the highest incidences of kidney cancer are found in small towns?

It’s true. What conclusion do you draw from this? Probably not that it has to be this way as a matter of pure, random chance, but that’s the actual explanation. Don’t believe me? Here’s another, equally true statement: The lowest incidences of kidney cancer are found in small towns.

The way randomness works is that small samples exhibit more variation than large samples. So large metropolitan areas … big samples of the U.S. population … will all have incidences of kidney cancer very close to the overall national mean. Small towns, each a small sample, will vary more widely, so some will have an incidence much lower than the national mean while others will have an incidence much higher.
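
If you’d rather see it than take my word for it, here’s a minimal simulation sketch in Python. The town sizes, the “true” incidence rate, and the random seed are made-up illustration values, not real epidemiology; every simulated resident faces exactly the same risk, and only the sample size differs.

```python
import numpy as np

rng = np.random.default_rng(42)      # fixed seed so the run is repeatable
TRUE_RATE = 0.0002                   # hypothetical incidence; identical for everyone

# 1,000 small towns of 2,000 people each, and 20 metro areas of 2,000,000 each.
small_town_cases = rng.binomial(n=2_000, p=TRUE_RATE, size=1_000)
metro_cases = rng.binomial(n=2_000_000, p=TRUE_RATE, size=20)

small_town_rates = small_town_cases / 2_000
metro_rates = metro_cases / 2_000_000

print(f"True rate:    {TRUE_RATE:.4%}")
print(f"Small towns:  lowest {small_town_rates.min():.4%}, highest {small_town_rates.max():.4%}")
print(f"Metro areas:  lowest {metro_rates.min():.4%}, highest {metro_rates.max():.4%}")
```

Run it and the extremes, both the zero-incidence towns and the alarmingly high ones, all come from the small samples, while the metro areas cluster tightly around the true rate. No cause required; just arithmetic.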

Even professional statisticians get this sort of thing wrong if they aren’t on their guard for it. That’s documented, along with about a zillion other places where our ability to draw the correct conclusion falls by the wayside, in your must-read book for 2012: Daniel Kahneman’s Thinking, Fast and Slow.

The title comes from the author’s well-tested and thus-far not falsified theory that humans rely on two separate decision-making systems. One is quick, heuristic, error-prone, and so effortless we’re often unaware it’s even operating. The other is slow, requires significant effort and concentration, and takes proper stock of evidence and logic.

Do you want to encourage a culture of honest inquiry? It means engaging the slow, effortful system as much as you can. That requires an environment that provides enough time, and sufficiently few distractions, to allow it.

It will be an uphill battle, but one eminently worth fighting.