Among the many books I’d like to write but probably won’t, Selling Strong Engineering To Executives is high on the list. I probably won’t write it because even the best techniques aren’t reliable enough to be worth codifying. But you have to try.

Regular correspondent Jon Lee pointed out the relevance of the Challenger disaster to this topic. Richard Feynman, the Nobel Prize-winning physicist, was on the Rogers Commission that investigated the tragedy. His personal observations are worth reading in their entirety. A very short version goes like this: NASA’s engineers assessed flight risks to be about 1,000 times greater than did the managers who decided whether missions should fly (roughly 1 in 100 per mission versus management’s 1 in 100,000).

I’ve coined a term for this mathematical transformation: Direct Reduction In the Value of Engineering Likelihoods (DRIVEL).

NASA’s managers were, and probably still are, under tremendous political pressure to get the shuttle in the air. People being what they are, they persuaded themselves the engineers were exaggerating the risks. They rationalized their DRIVEL with this “logic”: “It hasn’t happened yet.”

Publicly held corporations justify bad decisions with DRIVEL too, and they do it all the time. Every professional project manager knows this, because every one has been put in charge of at least one project whose launch has been delayed due to executive dithering but whose deadline, scope and budget remain fixed.

If it’s an internal project, the executives rationalize the DRIVEL by telling themselves the estimators padded the original schedule (guaranteeing, of course, that they’ll pad the next one). If it’s an outsourced project, they rationalize it by telling themselves it’s someone else’s problem — the company contracting to perform the work.

The contracting company applies similar DRIVEL to its computations, because it gets the revenue now. By the time the future gets here anything can happen — everyone in a position to blow the whistle can find a way to get promoted out of harm’s way or hired by a competitor. If the project manager or account manager protests, you know what happens. That’s right — the senior decision maker says (all together now, with feeling): “If you can’t do it, I’ll find someone who can.”

It’s always the same equation: Sound career management trumps sound engineering.

You’d think those responsible for the health of the organization would take steps to keep DRIVEL out of the decision process. You’d be wrong, because … and this insight is crucial to understanding the situation … in most large organizations the owners don’t want them to.

That isn’t the case in companies with a single owner or whose ownership is closely held. Those owners run the company, make the decisions, and will be around to be damaged by the fallout of bad engineering. Privately held corporations have their flaws, but DRIVEL-driven decision-making isn’t often among them.

Government certainly isn’t like that. We citizens have come to think of ourselves as government’s customers, not as its owners. We want the best deal, that’s all. When it comes to the public sector, many have given up entirely on the concept of a healthy organization. Few consider government’s health to be their responsibility.

In this, they are exactly the same as the owners of most publicly held corporations. The shareholders don’t think like owners either. Their stock is just a commodity, to be bought low and sold high. The board of directors and chief executive officer, who run the company on their behalf, are their agents, obliged to subordinate personal preference to the shareholders’ will. And we wonder why many are paid enough in one year to retire. (It would be interesting to compare the level of DRIVEL in companies whose shareholders care more about dividends than growth. I’d guess it would be lower, but I have no evidence either way.)

The situation isn’t entirely hopeless. Many Wall Street analysts do pay attention to planning and decisions that pay off in the longer term. They have to. Today’s profits and the next-quarter forecast are already reflected in the stock price. It’s longer-term planning that will cause a share of stock purchased today to be worth more in the future.

It is possible to sell superior engineering to executives. It isn’t easy, but you just have to remember: The engineering itself doesn’t matter to them. In the end they care about only four issues: Increasing revenue, decreasing cost, reducing risk, and how achieving one or more of these goals will affect them personally.

That’s how they define strong engineering. Your job is to help them recognize it when they see it.

I might owe Tom Peters an apology.

Last week I mentioned a Fast Company article in which Peters said he’d faked his data. According to a follow-up in Business Week shared by reader Ed Kimball, it appears Peters (and invisible co-author Robert Waterman) did nothing of the kind.

Peters reviewed and approved the Fast Company article, but now claims Fast Company’s writer, Alan Webber, invented the quote. Webber politely and obliquely disagreed. So either Tom Peters faked faking his data, or Fast Company faked his faking it for him.

Reliable evidence, and a willingness to rely on it, are vital to making smart decisions in business. In Good to Great, Jim Collins quotes Lyle Everingham, CEO of Kroger during its transition from muddling through to 25 years of outstanding performance: “Once we looked at the facts, there was really no question about what we had to do.” A&P, its lackluster competitor, only pretended to look at the facts. It created a new store concept, The Golden Key, to test ideas, but when its executives didn’t like what the evidence told them, they closed it.

Kroger had a culture of honest inquiry: one in which executives, managers, and employees do their best to use trustworthy evidence to drive decision-making. Creating a culture like this takes work, persistence, and sometimes political dexterity. Here are some specific measures you can take to foster one in your workplace:

  • It starts with wanting to know what’s really going on out there. Enron and WorldCom happened, in part, because their executives were so busy trying to make their companies look good that they obscured what was really going on even from themselves. Your dashboards, financial reports, and other forms of organizational listening are there to make you smarter. If they don’t, don’t bother with them.
  • Confidence comes from doubt. Certainty, in contrast, comes from arrogance. If an employee is confident and can explain why, wonderful. If that employee’s certainty pre-empts everyone else’s ability to make their case, the employee is on the wrong side of things.
  • Start every decision by creating a decision process. You don’t have to be in charge to encourage this habit. Just ask the question, “How will we make this decision?” That shifts the discussion from who wins to how to create confidence in the outcome. The results: A better decision, a stronger consensus, and a few more employees who see the benefit of honest inquiry.
  • Don’t create disincentives for honesty. If you ask for honest data, and use it to “hold people accountable,” you won’t get honest data. Why would you? The superior alternative is to employ people who take responsibility without external enforcement, and to create incentives for that kind of behavior. This works much better, and takes less effort.
  • The “view from 50,000 feet” is for illustration, not persuasion. A high-level strategic view is essential for focusing the efforts of the organization. High-level logic, in contrast, is oxymoronic: Detailed evidence and analysis are what determine whether the high-level view makes sense, or just looks good in the PowerPoint.
  • Evidence too far removed from the original source is suspect. Don’t trust summaries of summaries of summaries, especially if they tell you what you want to hear. Even with the best of intentions the game of telephone is in play. And many of those on the side of intellectual relativism don’t have the best of intentions.
  • Be skeptical of those with a financial stake in the decision. But don’t ignore them. A conflict of interest suggests bias, but doesn’t automatically make someone wrong. Be wary and dig into their evidence, especially if their evidence is a summary of a summary of a summary. But if you demonstrate to your satisfaction that they’ve cooked the evidence … go ahead and ignore them from now on. They’ve earned it.
  • Beware of anecdotes and metaphors. They’re useful … for illustrating a point, or for demonstrating that something is possible. For anything else you need statistically valid evidence. Yes, Disraeli said there are three types of lie. He miscounted — argument by anecdote is far more pernicious than argument by statistics, and argument by metaphor is even worse. Yes — you do have to understand statistics well enough to evaluate the evidence. Sorry. That’s part of your toolkit.

If you work in a business culture infected with intellectual relativism, it will take time and patience to build the habit of rationalism. You won’t do so by preaching and lecturing about the general principle.

The way to build a culture of honest inquiry is one decision at a time.