Long-time readers of this column might imagine I don’t like business metrics. But if you imagine that, you’re wrong. It’s true that my columns on the subject are more cautionary than laudatory. But I love good business metrics, and treasure the rare occasions on which I encounter them.

Most business metrics I run across are shallow; in plenty of organizations they're absent entirely. Mostly, managers adopt industry-standard metrics. But it's okay, because companies are adopting industry-standard business processes as well in their ongoing attempts to become generic.

Last week’s column was an attempt to point the way toward one important goal of a well-designed system of business metrics: the establishment of an organizational dashboard. The critical point, you’ll remember, is to start by defining a conceptual hierarchy of performance goals and sub-goals for the organization as a whole, pointed at the factors required for achieving organizational effectiveness. Metrics are crafted to assess how well you’re progressing toward your goals, not to roll up from individual performance to workgroup, workgroup to department, and so on.

And especially, the metrics must not be divided according to the organizational chart so as to Hold People Accountable (HPA). When you build a dashboard of key organizational performance metrics, do the opposite: Construct them so as to make the organization’s leaders interdependent. Otherwise, leaders will make their numbers at each other’s expense, the organization will fragment into silos, and nobody will understand why the system collapsed, because all of the metrics looked good.

Still, the absence of HPA leads to its own problems. And without reasonably accurate ways to assess individual and group performance, HPA is an exercise in authoritarian futility: You declare the quality of someone’s performance with nothing to back it up; those reporting to you accept your authority to do so without accepting your assessment if it disagrees with their own.

The resolution to this apparent paradox, to the extent it’s resolvable, starts with the recognition that there are two separate, equally valid but entirely different views of the organization. One — the dashboard view — dissects the organization into the factors that drive performance. These cross all boundaries, which is why the organization’s leaders are interdependent when dealing with them.

The other view is functional, divvying up responsibility for who delivers what. It’s the orgchart view — the perspective that leads to HPA.

The organizational chart is a form of delegation. Done right, it assigns responsibility and commensurate authority. Doing so without creating organizational silos is one of the most difficult tasks business leaders face. Dividing the work of the organization into defined responsibilities and delegating them to various managers, so everyone knows who is supposed to do what, is, after all, the only known alternative to organizational chaos. Once you’ve done it, your managers should establish clear internal goals and metrics of their own, for the same reason you need to do so for the organization as a whole: to understand the health of their organizations.

You can’t entirely avoid the creation of organizational silos, because dividing up responsibility, which is what creates them, is the whole purpose of the exercise. To illustrate the issue, imagine you have the simplest organizational chart in the IT world: Applications and Operations, and nothing else.

They’re natural enemies. The fastest and most efficient way to code, after all, is sloppily. But this drives up the cost of building and managing the IT infrastructure while increasing the likelihood of system crashes. Meanwhile, the easiest way to maintain stability and performance is to never upgrade or change anything.

Beyond holding managers collectively accountable for dashboard results, you have one more tool at your disposal: remembering the point of it all. In a word, the mission (which is distinct from, and usually independent of, the dreaded Mission Statement).

The performance goals, dashboards, organizational chart, and all the rest are tools. They rest on the recognition that, with few exceptions, the most effective way to fulfill any mission is to build a high-performance organization capable of fulfilling it.

But in the end it’s the mission — the goal — that’s the point. To avoid the creation of barriers between functional areas of your IT organization, keep the point in front of all managers, always. And establish, clearly, that making the numbers at the expense of the point isn’t making the numbers at all.

It’s inverting means and ends.

Where exactly did the term “silo” come from, anyway?

Organizational silos are Bad Things. They create barriers to getting work done. But why silo? What does a building used to store grain or a nuclear missile have to do with branches of the organizational chart? But then, I wonder how agribusiness executives respond to the idea of breaking down their silos. For that matter, I worry about using “lowest common denominator” when we should be saying “greatest common factor,” so what’s that tell you?

Speaking of the relationship between mathematics and organizational silos, let’s talk about dashboards, business metrics, and how they should and shouldn’t fit together.

“Metric” is ConsultantSpeak for “measure.” Some use “metric” to refer to the formula and “measure” to the result of each act of measurement. Since the ontological battle against “metric” is long since lost, let’s agree to accept this distinction.

Good metrics start with goals and end with fine-tuning. They don’t start, as some consultants lazily suggest, with industry benchmarks or other forms of meaningless tradition. Figure out what you’re trying to achieve, develop an observational way to determine whether you’re making progress toward it (preferably in numeric terms), and then make sure there’s no way to manipulate the measures so they improve while the actual situation deteriorates.

A popular addition to metrics lore is the dashboard. Unlike “silo,” “dashboard” is a useful metaphor, which is to say it nicely illustrates the point. When you want to know how your car is doing, the dashboard tells you, at a glance, whether your car is healthy and progressing at the right speed toward your destination. A well-designed business dashboard helps you understand how healthy your organization is, and whether it’s progressing at a fast enough pace toward its destination.

So far, so good. But when designing dashboards, most consultants, and as a result many managers, fall into a trap: They think the point of metrics is to Hold People Accountable, and so they design a business dashboard whose gauges are each tied to a branch of the organizational chart, or whose individual-manager results roll up to the whole organization’s results. Since you get what you measure, the inevitable happens: each manager does whatever it takes to move his or her measures, almost always hampering the ability of other managers to meet their metrical responsibilities. Bad dashboards, that is, cause silos (which, if you have a bizarre and twisted mind, means the bad use of a good metaphor causes a bad metaphor).

Criticizing other consultants is fun, and possibly good for business, but knowing how to build a bad dashboard doesn’t do much to help you build a good one.

When our company works with clients on this subject, we start with our handy-dandy IT Effectiveness Framework, which enumerates four major factors that drive IT organizational effectiveness: Business Alignment, Process Maturity, Technical Architecture, and Human Performance. The CIO establishes a clear goal for each, based on what most needs to be addressed in his or her particular situation, and formulates one or more metrics for each goal, so as to assess progress.

Each of these factors has a number of sub-factors that make it work. For example, technical architecture depends on the platform (IT infrastructure) architecture, information architecture, and applications architecture. The CIO often defines goals and metrics for this level as well. (So far, nobody has wanted to develop goals and measures for all of the 138 factors we’ve identified that drive IT performance, but hey — if you’re willing, we’re willing to help.)
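To make the hierarchy concrete, here’s a minimal sketch of how the dashboard view might be represented as a data structure. The factor and sub-factor names come from the framework described above; everything else, including the class names, the sample goal, the sample metric, and its target, is an illustrative assumption, not part of the framework itself.

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Illustrative sketch only: the factor and sub-factor names come from the
# column's IT Effectiveness Framework; the sample goal, metric, and target
# values are hypothetical.

@dataclass
class Metric:
    name: str        # what gets measured
    target: float    # the level that counts as "healthy"
    current: float   # the most recent measurement

@dataclass
class Goal:
    statement: str
    metrics: list[Metric] = field(default_factory=list)

@dataclass
class Factor:
    name: str
    goal: Goal | None = None
    sub_factors: list[Factor] = field(default_factory=list)

# The dashboard dissects the organization by performance driver,
# not by branch of the organizational chart.
dashboard = [
    Factor("Business Alignment",
           Goal("Active projects trace to stated business goals",
                [Metric("% of active projects tied to a business goal", 90.0, 72.0)])),
    Factor("Process Maturity"),
    Factor("Technical Architecture",
           sub_factors=[Factor("Platform (IT infrastructure) architecture"),
                        Factor("Information architecture"),
                        Factor("Applications architecture")]),
    Factor("Human Performance"),
]

def health_report(factors: list[Factor], indent: int = 0) -> None:
    """Print each factor, its goal's metrics (if defined), and its sub-factors."""
    for f in factors:
        print(" " * indent + f.name)
        if f.goal:
            for m in f.goal.metrics:
                print(" " * (indent + 2) + f"{m.name}: {m.current} (target {m.target})")
        health_report(f.sub_factors, indent + 2)

health_report(dashboard)
```

Notice that nothing in the sketch ties a metric to a department or a manager. That omission is deliberate, and it’s what the next paragraph is about.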

At the dashboard level of analysis, no one manager is logically accountable for how well the measures come out. The director of Application Support can’t, for example, improve business alignment single-handedly, nor will averaging the business alignment scores for Application Support, Operations, and the rest of IT result in a meaningful number. What Application Support can do is define its goals in ways that support the organizational strategy, including its business alignment goals, and then define one or more metrics to assess progress toward those Application Support goals.

It’s a simple principle. If your goal is teamwork, define organizational goals that require interdependent effort. If, on the other hand, your goal is to Hold People Accountable, establish independent goals. And metrics.

Oh, and while you’re at it, you might as well add a siloization meter to your dashboard, because that will be the most visible outcome.