Bad metrics continue to be worse than no metrics because, as Mark Twain famously said, “It ain’t what you don’t know that gets you into trouble. It’s what you do know that ain’t so.”

Which brings us to Deloitte, its Center for the Edge’s Shift Index, and the “Big Shift” it concludes is happening to our economy.

I quote: “The Shift Index highlights a core performance challenge that has been playing out for decades: return on assets (ROA) for U.S. companies that has steadily fallen to almost one quarter of 1965 levels …”

It’s a shocking statistic, strongly suggesting that economic collapse is imminent (oh, wait …), even though, as the report continues, “… while labor productivity has continued to improve,” which hints at some redeeming virtues.

(Note: The full report runs 142 pages and has far more virtues and faults than I can do justice to here. It gives the ROA trend great prominence as a symptom, which is why I’m focusing on it here.)

You’ll recall that the KJR Manifesto specifies consistency as one of the six characteristics of a good metric. Consistency means the metric must always go one way when the item being measured improves and the other way when it gets worse. ROA fails this test. Here’s why:

On the surface, this 45-year private-sector-wide decline seems to reflect an across-the-board failure of management to do its job. It’s a tempting perspective, as it satisfies our shared need to find a group of people who aren’t “we” to blame for whatever we’re unhappy about.

Too bad it doesn’t stand up to close scrutiny.

ROA is a dubious measure, even for assessing the performance of individual companies. It’s too easy to manipulate, and fails the consistency test.
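To see how easily ROA can be pushed around, here's a toy sketch (all figures invented, and lease expense ignored to keep the arithmetic minimal): two firms with identical income report very different ROAs simply because one leases its assets instead of owning them.

```python
# Hypothetical illustration: same business, same income, different ROA.
# Figures are invented; lease expense is ignored for simplicity.

def roa(net_income, total_assets):
    """Return on assets: net income divided by total assets."""
    return net_income / total_assets

owns = roa(10, 200)    # owns its $200 of equipment outright
leases = roa(10, 50)   # sale-and-leaseback shrinks the asset base

print(owns)    # 0.05
print(leases)  # 0.2 -- "better" ROA, identical underlying business
```

Nothing about the second firm's operations improved; only the denominator moved. That is what failing the consistency test looks like.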

But that doesn’t matter. Big Shifts are a macroeconomic matter, so the question is whether ROA, aggregated across the whole economy, is a useful way of looking at things.

It isn’t.

Something investors know well is that different industries have radically different asset requirements. Comparing ROA across industries doesn’t work.

And yet, in 1965 the U.S. economy depended heavily on manufacturing. Since that time, as you might have heard, we decided manufacturing belongs in China. Our economy now relies much more on finance, service, and entertainment.

Interestingly enough, finance, service, and entertainment seem to have far lower ROAs than manufacturing. Might this be the Big Shift that has caused the huge fall in economy-wide ROA and not a colossal failure of management?
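The mix-shift arithmetic is easy to sketch with made-up numbers: hold each sector's ROA constant, flip the asset mix from manufacturing-heavy to service-heavy, and the economy-wide average falls anyway, with no company anywhere performing worse.

```python
# Invented numbers illustrating the mix-shift effect: neither sector's
# ROA changes, yet the aggregate economy-wide ROA falls.

def aggregate_roa(sectors):
    """Economy-wide ROA: total income divided by total assets.

    Each sector is a (total_assets, roa) pair; income = assets * roa.
    """
    income = sum(assets * sector_roa for assets, sector_roa in sectors)
    assets = sum(assets for assets, _ in sectors)
    return income / assets

# (assets, ROA) per sector -- all figures hypothetical.
economy_1965 = [(80, 0.10),  # manufacturing-heavy, higher ROA
                (20, 0.03)]  # finance / service / entertainment
economy_2010 = [(20, 0.10),  # same per-sector ROAs ...
                (80, 0.03)]  # ... but the asset mix has flipped

print(round(aggregate_roa(economy_1965), 3))  # 0.086
print(round(aggregate_roa(economy_2010), 3))  # 0.044
```

Aggregate ROA falls by roughly half even though every sector's ROA is unchanged: a composition effect, not a management failure.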

Nor should we conclude that building an economy on low-ROA industries is a bad idea, because why would we? GDP grew from $719 billion to $14.5 trillion over the same period, after all, and GDP growth also has some value as a measure of aggregate economic health.

Here’s what’s unfortunate: I strongly support one of Deloitte’s conclusions, namely, that “… the gap between potential and realized performance is steadily widening as productivity grows at a rate far slower than the underlying performance increases of the digital infrastructure” (although it isn’t just a nitpick to complain that their assessment of the “digital infrastructure” has more to do with amount than with sophistication and capabilities).

Why might this be? Here are two likely explanations, both regular themes in this space.

The first: Businesses don’t integrate IT into their functioning – either the technology itself or the organization. Instead, IT delivers software that’s supposed to “meet requirements,” leaving it up to its “internal customers” to figure out what to do with it. That’s in contrast to IT collaborating with the rest of the business to design, plan, and implement business changes and improvements … a more enlightened model, but one relatively few companies have embraced.

The second: Far too many companies are equipped with 21st century tools but a 20th century workforce. We have SharePoint. We have web conferencing. We have internal blogs, wikis, and all manner of other tools that can help employees be more effective, both individually and as they collaborate in teams.

And few companies make mastering those tools even a data point in assessing employee performance.

These two factors matter greatly, both to your company’s success, and to our success as a world economic power.

It’s too bad I can’t cite Deloitte’s analysis as supporting evidence.

* * *

Disclaimer: The folks at Deloitte are smart enough to have thought all this through, and know the subject matter better than I do. I’ve forwarded this column to them, and will publish their reply next week if they choose to provide one.

Can you stand one more Steve Jobs retrospective?

I’m not sure I can either. But since so much of what I’ve read seems to have missed not just the point but both of the points (there are two) … well, what the heck. And so, let’s try to answer the question: why did Steve Jobs matter?

What Answer #1 isn’t: He started it all. He didn’t. Not even he and Steve Wozniac together get the credit.

Jobs and Wozniac (note to Apple: Aren’t you embarrassed that “Wozniac” doesn’t pass an iPad spell-check?) (10/11/2011 note to readers: Yes, this was my mistake – “Wozniak” passes the spellcheck just fine! – Bob) were part of a thriving community of people fascinated by the potential for personal computers. They built, not the first PC, but the first commercial PC … the first that could do something useful for mere mortals right out of the box.

Why this doesn’t matter all that much is that, had they not done so, someone else would have before much more time had passed. They got there first, and deserve credit for it. But it isn’t why Steve Jobs matters.

What Answer #1 is: He made the user interface important. As you have to know, Jobs had nothing to do with the invention of the graphical user interface. Douglas Engelbart did most of that work, with a team at Xerox PARC finishing things up.

But Engelbart lacked influence, and Xerox lacked interest. Jobs saw the result and had the key insight: The user interface, all by itself, mattered, independent of the specific use to which anyone put it. This led to the Macintosh, which led to Windows, making the graphical user interface ubiquitous.

The notion that making computers easy for humans to figure out should be the centerpiece of the design effort was uniquely his.

Or, if you’re more cynical, Steve Jobs was partially responsible for the ongoing dumbing down of the American public. The downside of an immensely friendly user interface is that the user has less learning … and thinking … to do.

Think of the GUI as the automatic transmission of computing, encouraging more drivers, including worse drivers, to share the road.

What Answer #2 isn’t: Solving hard problems. Don’t get me wrong – smartphones and tablets matter, and while Jobs no more invented these than he invented the PC, without him they’d be a whole lot less interesting and transformational. But nothing about them was hard.

Apple has nothing that corresponds to the old Bell Labs or IBM’s Thomas J. Watson Research Center. Apple won’t invent the next transistor or scanning tunneling microscope, nor will it discover anything as consequential as the cosmic background radiation.

Heck, iOS doesn’t even provide e-inking as an operating-system-level service, let alone something as important, relevant, and difficult as handwriting recognition.

Apple takes on the easy problems. This isn’t an indictment. Figuring out new and interesting things you can do with what has already been solved is tremendously useful. It shouldn’t be confused with doing something hard, though.

What Answer #2 is: Providing a leadership model lots of companies should follow. From what we know, Jobs didn’t lead as CEOs should lead. He was an autocratic micromanager, unable to delegate and deeply involved in the details of product design. Let’s hope, for Hewlett-Packard’s sake, that Meg Whitman doesn’t follow suit.

But, and these are crucial, Steve Jobs focused on potential, and he insisted on excellence.

Nothing about the iPod, iTunes store, iPhone, or iPad was safe. Jobs focused on upside potential, not downside risk. Like the great generals in history he preferred offense to defense.

As for excellence, he insisted on it in the technical as well as general meaning of the word. Technically, “excellence” refers to the presence of desirable features people want to buy, as opposed to “quality,” which refers to the absence of defects … a characteristic Jobs didn’t particularly care about.

Under Steve Jobs, every Apple product had to be so desirable that its customers were willing to pay premium prices for its designs while forgiving it for serious lapses in quality.

So pay attention, because this is why Steve Jobs mattered most: He had no obvious interest in “maximizing shareholder value” or increasing Apple’s profits. He was far from the highest-paid CEO in America and never seemed to worry about that, either.

Along with Bill Gates and almost nobody else at the helm of a corporate behemoth, he personally loved his company’s products.

Like Bill Gates, Steve Jobs demonstrated that if a company’s CEO focuses on building products people want to buy, the rest will happen.

Imagine if General Motors had been run like that.