I might owe Tom Peters an apology.

Last week I mentioned a Fast Company article in which Peters said he’d faked his data. According to a follow-up in Business Week shared by reader Ed Kimball, it appears Peters (and invisible co-author Robert Waterman) did nothing of the kind.

Peters reviewed and approved the Fast Company article, but now claims Fast Company’s writer, Alan Webber, invented the quote. Webber politely and obliquely disagreed. So either Tom Peters faked faking his data, or Fast Company faked his faking it for him.

Reliable evidence, and a willingness to rely on it, are vital to making smart decisions in business. In Good to Great, Jim Collins quotes Lyle Everingham, CEO of Kroger during its transition from muddling through to 25 years of outstanding performance: “Once we looked at the facts, there was really no question about what we had to do.” A&P, its lackluster competitor, only pretended to look. It created a new store concept, The Golden Key, to test ideas. Its executives didn’t like what the evidence told them, so they closed it.

Kroger had a culture of honest inquiry: one in which executives, managers, and employees did their best to use trustworthy evidence to drive decision-making. Creating a culture like this takes work, persistence, and sometimes political dexterity. Here are some specific measures you can take to foster a culture of honest inquiry in your own workplace:

  • It starts with wanting to know what’s really going on out there. Enron and WorldCom happened, in part, because their executives were so busy trying to make their companies look good that they obscured what was really going on, even from themselves. Your dashboards, financial reports, and other forms of organizational listening exist to make you smarter. If they don’t, don’t bother with them.
  • Confidence comes from doubt. Certainty, in contrast, comes from arrogance. If an employee is confident and can explain why, wonderful. If that employee’s certainty pre-empts everyone else’s ability to make their case, the employee is on the wrong side of things.
  • Start every decision by creating a decision process. You don’t have to be in charge to encourage this habit. Just ask the question, “How will we make this decision?” That shifts the discussion away from who wins and toward how to create confidence in the outcome. The results: a better decision, a stronger consensus, and a few more employees who see the benefit of honest inquiry.
  • Don’t create disincentives for honesty. If you ask for honest data and then use it to “hold people accountable,” you won’t get honest data. Why would anyone give it to you? The superior alternative is to employ people who take responsibility without external enforcement, and to create incentives for that kind of behavior. It works much better and takes less effort.
  • The “view from 50,000 feet” is for illustration, not persuasion. A high-level strategic view is essential for focusing the efforts of the organization. High-level logic, in contrast, is oxymoronic: detailed evidence and analysis are what determine whether the high-level view makes sense or just looks good in PowerPoint.
  • Evidence too far removed from the original source is suspect. Don’t trust summaries of summaries of summaries, especially if they tell you what you want to hear. Even with the best of intentions, the game of telephone is in play. And many of those on the side of intellectual relativism don’t have the best of intentions.
  • Be skeptical of those with a financial stake in the decision. But don’t ignore them. A conflict of interest suggests bias, but doesn’t automatically make someone wrong. Be wary and dig into their evidence, especially if their evidence is a summary of a summary of a summary. But if you demonstrate to your satisfaction that they’ve cooked the evidence … go ahead and ignore them from now on. They’ve earned it.
  • Beware of anecdotes and metaphors. They’re useful … for illustrating a point, or for demonstrating that something is possible. For anything else you need statistically valid evidence. Yes, Disraeli said there are three kinds of lies: lies, damned lies, and statistics. He miscounted: argument by anecdote is far more pernicious than argument by statistics, and argument by metaphor is even worse. Yes, you do have to understand statistics well enough to evaluate the evidence (see the sketch that follows this list). Sorry. That’s part of your toolkit.
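
Since that last point leans on statistics, here’s a minimal sketch of the idea in Python. It isn’t from the column and the data are invented: it simulates a process change that, by construction, has no real effect, then shows that a few glowing “success stories” surface anyway, while a comparison of the full samples shows nothing but noise.

    # Hypothetical illustration: why cherry-picked anecdotes aren't evidence.
    import random
    import statistics

    random.seed(42)

    # Invented data: task completion times (hours) before and after a change
    # that has no real effect; both samples come from the same distribution.
    before = [random.gauss(10.0, 2.0) for _ in range(200)]
    after = [random.gauss(10.0, 2.0) for _ in range(200)]

    # Argument by anecdote: pick the three most flattering "after" results.
    print("Success stories (hours):", [round(x, 1) for x in sorted(after)[:3]])

    # Statistically valid evidence: compare the full samples.
    mean_before, mean_after = statistics.mean(before), statistics.mean(after)
    std_err = (statistics.variance(before) / len(before)
               + statistics.variance(after) / len(after)) ** 0.5
    t_stat = (mean_after - mean_before) / std_err  # rough two-sample t statistic
    print(f"Before: {mean_before:.2f}  After: {mean_after:.2f}  t ~ {t_stat:.2f}")
    # A |t| well under about 2 means the "improvement" is indistinguishable
    # from noise, however persuasive the anecdotes sound.

The point isn’t the code; it’s the habit: before believing the success stories, ask what the whole sample says.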

If you work in a business culture infected with intellectual relativism, it will take time and patience to build the habit of rationalism. You won’t do so by preaching and lecturing about the general principle.

The way to build a culture of honest inquiry is one decision at a time.

Fast Company interviewed Tom Peters on the twentieth anniversary of In Search of Excellence. In the interview, Peters casually mentioned that he’d faked his data. And neither he nor anyone else thought anything of it!

Smart managers rely on evidence to evaluate the validity of their ideas, but, as Peters’ remark illustrates, they face a serious disadvantage compared to the scientific community. To avoid intellectual relativism, a mindset that considers all ideas equally valid, you need trustworthy evidence. Scientists understand this. It’s why articles submitted to scientific journals receive peer review, and why no research is fully accepted until it has been confirmed independently. It’s also why discussions among scientists are blunt, sloppy thinking is publicly derided, and anyone caught faking data is expelled from the profession. Scientists require integrity the way professional golfers require good manners.

The sources business people rely on, external and internal, are intrinsically less reliable than the peer-reviewed journals scientists read, and they’re getting worse as intellectual relativism takes increasing hold. Here’s another example:

In the olden days we had access to independent test labs, and trade publications routinely published product performance comparisons. We no longer do, because technology vendors prohibit publication of performance data in their EULAs. This might be so vendors can control what we’re told about their products. It also might be that vendors are concerned that “independent” testing labs and IT research firms have been paid to reach a predefined conclusion.

The concern isn’t unreasonable. Quite a few vendors have complained to me privately about “shakedowns” by well-known IT research firms. Nobody will go on the record — for fear of retaliation, they say, although sour grapes is another possible explanation. The closest I’ve come to direct evidence is a marketing director who happily burbled, off the record, that in exchange for her buying services from one of them, they would “… help us with our marketing efforts.”

The information these firms sell might very well be excellent, but you have no way to evaluate it, or them. The raw data is confidential, their processes are opaque and unaudited, and the potential conflicts of interest are significant.

This is what happens when there’s no culture of honest inquiry and no process to enforce honest research: you lose your ability to trust the evidence. That’s true of the big IT research firms. It’s equally true of the sources of information you rely on inside your own company.

In small businesses, much of your information comes from direct observation. You know what happened (although even then, interpreting what you see isn’t always straightforward). The bigger the organization, the more you learn indirectly and in summarized form. In very large companies, figuring out what’s actually going on out there is close to impossible.

Many leaders respond by limiting their information sources. They might trust one or two executives in the chain of command. Their administrative assistants might share what’s in the gossip mill. They might rely on a weekly production report that provides “key performance indicators.” But the number of sources is always small, because to these executives every additional source of information seems to add confusion, not clarity. This is so much a way of life for many of them that they miss the thoroughly obvious implication. Which is:

Every additional source of information should improve your understanding, adding depth, color and detail. If that isn’t the case, your organization is guided, not by a culture of honest inquiry, but by the selling of personal agendas. Managers sift through evidence searching for ammunition, instead of letting the evidence tell the story. It’s a hallmark of intellectual relativism — starting with the “right” position on an issue, then accepting and discarding evidence depending on whether it supports that position.

Fixing a culture like this is neither quick nor easy. To start, broaden your range of sources. Meet with a lot of people. Open your door. Convene “user groups.” Invite groups of employees to luncheon roundtables. Start collecting performance measures (yes, metrics are a listening channel: one of many ways of finding out what’s going on out there).

Initially, you will become increasingly confused. Stick with it. As you continue to listen to more and more sources, a picture will come into focus.

It comes down to this: Some leaders want to find out what’s really going on out there. They want high-quality evidence that tells an accurate, nuanced (if you’ll forgive the over-used word) story. Other leaders start by telling the story and insist you give them evidence that proves them right.

Both get what they ask for.