Culture is the new governance, and where it isn’t, it should be.

As my co-author Scott Lee and I pointed out in The Cognitive Enterprise, culture provides metaphorical lane markers. Formal governance mechanisms are more akin to guard rails — if you make contact with either one, something’s gone badly wrong.

Only, sometimes even culture is overkill.

Take posted speed limits. If you obey them because otherwise you might get a speeding ticket, that’s governance. If you drive five miles an hour faster than the posted limit, that’s culture — following an unwritten but near-universally accepted modification to what formal governance requires.

But when it comes to the choices drivers make about their velocity, governance and culture only matter when a far more powerful regulatory force isn’t in play — traffic.

When you’re embedded in traffic, governance, culture, and personal driving preferences don’t matter. If the posted limit is 50 mph but the cars surrounding you are moving at a uniform 30 mph, you’ll drive at 30 mph.

It’s akin to states of matter. Light traffic is parallel to how gases behave — each molecule (car) moves along on its own with only infrequent interactions with other molecules. On public roads we don’t want these interactions to happen — they’re called “collisions” — which is why we have posted speed limits.

Heavier traffic is akin to liquids, where fluid flow supplants independent molecular action. Driving in traffic is liquidity in action.

Add even more traffic and we discover how water molecules must feel when the temperature drops below freezing. Traffic jams and solid matter have a lot in common — nobody, whether drivers or molecules, is going anywhere.

(Physics minded readers might be wondering how the fourth state of matter — plasmas — fits into the picture. At the risk of beating the metaphor to death … race tracks?)

How does this fit into the broader subjects of culture, formal governance, and the decisions and results you, as an enlightened driver … no, wait, as an enlightened business leader … want to accomplish?

Heck, I don’t know. I just like the metaphor.

Not good enough? Okay, let’s poke at this and see where it takes us.

Most of us, most of the time, think about governance in such contexts as boards of directors, business change steering committees, and architecture review boards. At their best they help the organization maintain a fluid state, where everyone’s efforts pretty much line up with everyone else’s efforts, moving forward without a lot of high-impact collisions to disrupt the smooth flow of things.

Except that, for the most part, when the organization is already in a fluid state, traffic and culture make governance superfluous.

Part of effective governance is recognizing when not to say yes. Saying yes too much is like letting too many cars onto a road not designed to handle so much traffic. Effective governance tries to keep things in a fluid state so the organization doesn’t freeze up into solid-state immobility.

What counts as organizational gas? Consider so-called “shadow IT,” where business departments implement applications they need but that IT lacks the capacity to deliver (see “saying yes too much,” above).

Most of the German autobahn legendarily has no speed limits — it’s a gas.

But from Wikipedia: “Any person driving a vehicle may only drive so fast that the car is under control. Speeds must be adapted to the road, traffic, visibility and weather conditions as well as the personal skills and characteristics of the vehicle and load.”

When it comes to shadow IT, this isn’t bad guidance. We might imagine shadow IT governance following this sort of model, where driver’s education courses take the place of speed limits. You don’t want a shadow-IT free-for-all any more than Germany wants insane driver behavior on its roads.

On the other hand, forbidding business departments from using suitable information technology because IT lacks sufficient bandwidth amounts to … well, forget the metaphor. Refusing to allow business departments to operate at maximum effectiveness because that’s how your governance works changes risk management from one enterprise good among many to the only factor taken into consideration.

As for plasma: How about research and development? You want to encourage it, but in a safe environment … a metaphorical race track … where only trained drivers are allowed.

I’ve probably pushed this metaphor beyond its limits.

Still and all, I think it’s fair to say that too often, governance devolves into stifling, choking bureaucracy. With the right culture it’s needed far less often than it’s imposed, and when imposed it focuses on reducing costs and risks much more than on increasing revenue and opportunity. And often, traffic makes it unnecessary.

“I’m just giving you a brain dump.”

Please don’t. Not to me, not to your colleagues, and especially, no matter how dire the circumstances, not to your manager.

Start with the prevalent but inaccurate distinction between data and information. Data are, supposedly, meaningless until processed into meaningful and useful information.

Not to nitpick or nuthin’, but “information” already had a definition before this one came along. It comes, appropriately enough, from information theory, which defines information as the stuff that reduces uncertainty.

As long as we’re being annoyingly pedantic, far from being worthless, data consist of indisputable facts: A datum is a measurement of some attribute of some identifiable thing, taking measurement in its broadest sense — if you observe and record the color of a piece of fruit, “orange” is a measurement.

So a fact can, in fact (sorry), reduce your uncertainty, as in the case where someone has asserted that something is impossible. If you observe and document it happening even once, you’ve reduced everyone’s uncertainty about whether the phenomenon in question is possible or not.
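(For readers who want the quantitative version, here’s a minimal sketch — my illustration, not anything from the column itself — using the standard Shannon “surprisal” formula, -log2(p): the less likely you thought an event was, the more bits of uncertainty one observation removes. The function name is mine.)

```python
from math import log2

def surprisal_bits(probability: float) -> float:
    """Bits of information gained by observing an event you had assigned this probability."""
    return -log2(probability)

# A coin flip you called 50/50 delivers exactly 1 bit when you see the result.
print(surprisal_bits(0.5))    # 1.0

# If you'd put only a 1% chance on the "impossible" thing ever happening,
# watching it happen once delivers about 6.6 bits -- a much bigger reduction
# in uncertainty than the coin flip.
print(surprisal_bits(0.01))   # ~6.64
```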

As long as we’re being metaphysical, let’s add one more layer: Meaning isn’t something information confers. Meaning is a property of knowledge — something a person develops, over time, by interpreting their experience, which is a combination of raw data, information, and logic, and, if we’re being honest with ourselves, no shortage of illogic as well.

(If, astonishingly, you’re interested, Scott Lee and I covered this topic in more depth in The Cognitive Enterprise.)

Back to brain dumps. You might think the problem is that the dumper is providing data, not information. Au contraire, mes amis. In my experience, brain dumps contain precious little data. They are, instead, a disorganized jumble that does include some information, interspersed with anecdotes, opinions of varying degrees of reliability (the brain-dumper would consider these to be knowledge), and ideas, which, as we’re being definitional, we might think of as hypotheses only without the supporting logic that makes good hypotheses worth testing.

And so, now that I’ve thoroughly buried the lede, the reason brain dumping is generally worse than useless is that it’s an exercise in reverse delegation.

Brain dumps happen when one person asks another person to figure something out and then explain it so they’ll both be smarter about the subject at hand.

But instead of making the delegator smarter, the brain-dumper has instead de-delegated the hard work of organizing these bits and pieces into a clear and coherent narrative.

It’s as if I were to assign you responsibility for baking a cake, and to satisfy the assignment, instead of returning with my just desserts, you were to dump a bunch of raw foodstuffs on my desk, some of which might be useful as cake ingredients and others not, along with 23 recipes for pies and cakes, plus commentary about how eating too much sugar causes cavities and adult-onset diabetes.

When on the receiving end of a brain dump, I often conclude the dumper has lost track of the explanation’s purpose. Instead of trying to make me smarter about a subject, the presenter is, instead, trying to show me how smart he or she is.

But it’s more likely I’ll reach the opposite conclusion, due to a dictum commonly attributed to Einstein: “If you can’t explain it simply, you don’t understand it well enough.”

Bad meta-message.

How can someone keep themselves from becoming a brain-dumper? Here’s one approach: Start by carefully choosing an entry point.

Imagine I’m supposed to explain something to you. Presumably I know quite a lot about the subject at hand or you wouldn’t ask. I know so much, in fact (this is, you understand, hypothetical) that I can’t explain anything I know about it until you understand everything I know about it.

And as you won’t be able to understand anything I have to say about it until you’ve heard everything I have to say about it, my only choice is to dump the contents of my brain onto your desk.

But if I choose a good entry point I’ll be starting my explanation with something about the subject you can understand immediately, like, “We have a problem. Here’s what it is, and why you should be concerned about it.”

Then comes the second-hardest part: Leaving out everything you know about the subject, except what helps explain what the problem is and why your listener should be concerned about it.

Leaving any of my precious knowledge out hurts.

But that’s better than the pain I’d inflict by leaving it in.