
Cwazy wabbits


This year’s must-read business book … and by must-read I mean you must read it because every other manager is reading it … is Steven Spear’s Chasing the Rabbit (2008).

Fortunately, it would be worth your time to read even if it weren't a must-read book. Like Jim Collins in Good to Great (2001) and Joyce, Nohria, and Roberson in What Really Works (2004), Spear dug deeply into how several outstanding organizations (high-velocity “rabbits”) operate to extract common operating principles.

And came up with a different formula, proving once more that, as someone once said, management science is to science as plumbing is to hydraulics.

Not that I’m disparaging Spear’s research. The book does a terrific job of explaining how it is that such disparate organizations as Alcoa, Southwest Airlines, the Naval Reactor Program (yes, some government agencies operate exceptionally well), and of course, Toyota function so much more effectively than their brethren.

Spear concludes that the usual characterizations focus too much on the artifacts of technique and not enough on the underlying philosophy — that for any organization to run well it first has to understand how it works. The more complex the system, the more important this is, which is why the usual disciplines aren’t front and center. As Spear puts it, each of these organizations “… gave up depending on designing perfect processes and committed itself to discovering them instead.”

Specifically, these organizations:

1. Recognize problems as they occur. They consider every single unexpected event, close call, minor accident or anomaly to be a symptom of insufficient knowledge. So rather than just muddling through, as (for example) General Motors assembly line workers do by carrying awls to manually align the bolt-holes that attach seats to frames, Toyota recognized that misaligned bolt-holes meant the company didn’t yet fully understand how to assemble a car.

(In this vein, we’ve often recommended to our clients that they consider every call to the Help Desk a preventable situation.)

2. Fix problems rather than muddling through. General Motors, failing to recognize the misaligned bolt-holes as a problem, never bothered to identify and fix the root cause. By extension, it failed to identify and fix the root causes of an unknown number … probably a very large number … of other problems. Toyota, in contrast, long ago fixed the problem: Seats arrive at the bolting station perfectly aligned and ready to be attached.

“Rabbits” (high-velocity organizations) consider every root cause to be an urgent situation requiring immediate attention and resolution.

3. Spread the knowledge. This is the one we’re all going to hate, because spreading the knowledge inevitably means documented procedures, lots of them, and rigid adherence to them.

This doesn’t have to mean ossification. The Rabbits are open to better alternatives. But because they understand deeply how things work, innovations won’t be and can’t be improvisations.

Another consequence: New employees need quite a long time to learn their trade, because an employee who doesn’t know how to do each job according to spec creates a risk of throwing the whole machine out of kilter.

4. Have leaders who train other leaders to lead this way. The usual corporate dysfunctions, replete with knowledge hoarding, internal competition, and organizational silos, would prevent the first three capabilities from ever taking hold. Leadership is essential, and the usual set-goals-and-attain-them-and-never-mind-how mentality that dominates current business thinking would be ruinous.

Keep the Joint Running has promoted the importance of a “culture of honest inquiry” for quite some time (see “Where intellectual relativism comes from,” 10/17/2005; also Chapter 12 of Keep the Joint Running: A Manifesto for 21st Century Information Technology). A bit of self-promotion here: My consulting company, IT Catalysts, developed a comprehensive model of the factors required for IT organizational success back in 2004 and has been fine-tuning it ever since, for exactly this reason.

So of course I like this book — it reinforces my biases. I’m predisposed to endorse it without subjecting it to scrutiny.

So next week I’ll do exactly that. This week I’ll apply it to last week’s subject: Prevention, and the immeasurability of its successes.

To those correspondents who recommended comparison to baseline incident levels as a solution: This works well … for non-catastrophic incidents like server outages, which occur in a statistical universe. Prevention of rare or one-time calamities (we’re using our current economic situation as an exemplar) doesn’t fit that model.
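To make the “statistical universe” point concrete, here’s a minimal sketch of the baseline comparison those correspondents had in mind. The numbers are invented and the Poisson assumption is mine, not Spear’s or the correspondents’; the point is only that the arithmetic works when incidents recur often enough to establish a baseline, and has nothing to grab onto when the calamity you prevented has happened once, or never.

```python
# A minimal sketch of baseline incident comparison. All numbers are invented,
# and the Poisson assumption is an illustration, not anything from the column.
from math import exp, factorial

def poisson_cdf(k: int, lam: float) -> float:
    """P(X <= k) for a Poisson random variable with mean lam."""
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k + 1))

# Baseline: 52 weeks of server-outage counts before the prevention effort.
baseline = [4, 6, 5, 3, 7, 5, 4, 6] * 6 + [5, 4, 6, 5]   # 52 weekly counts
baseline_rate = sum(baseline) / len(baseline)             # ~5 outages/week

# Afterward: 12 weeks of counts with the prevention work in place.
post = [3, 2, 4, 1, 3, 2, 2, 3, 1, 2, 3, 2]
observed = sum(post)                                      # 28 outages
expected = baseline_rate * len(post)                      # ~60 outages

# If prevention accomplished nothing, we'd expect roughly `expected` outages.
# A tiny probability of seeing a count this low by chance suggests the
# improvement is real. No comparable test exists for a one-time calamity.
p_value = poisson_cdf(observed, expected)
print(f"Baseline {baseline_rate:.1f}/week; observed {observed} vs expected {expected:.0f}")
print(f"P(count <= {observed} | no improvement) = {p_value:.2e}")
```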

Our biggest challenge is that economists don’t understand macroeconomics the way Alcoa understands aluminum manufacturing. Worse, too many influential economists benefit personally from the approval of those who profit when one theory, rather than another, sets public policy.

Which means economics isn’t regulated by a culture of honest inquiry.

We’ve seen the results.

Comments (7)

  • I’m having trouble making the misaligned bolt hole analogy work for I.T. There’s way too much randomness in our field. Every device is a concoction of different hardware, software, updates, user actions, etc., so it’s pretty remarkable when things work at all, let alone have everything fit into place perfectly.

    I’m trying to look at the bigger picture, however.

    • Perhaps an example will help.

      The point about the misaligned bolt-holes is that at Toyota, the line workers are encouraged (actually, required) to report something like this as a structural problem that requires resolution, and to participate in figuring out what’s going wrong and what to do about it.

      At GM, line workers have no such opportunity and instead figure out work-arounds.

      In too many IT shops, if (for example) a server crashes, a sysadmin brings it back up … and once it’s back up, that’s considered the end of the story. If it turns out later on that the server failed because (again for example) increasing traffic caused it to exceed its capacity … well, eventually someone will figure it out, once its crashes become too frequent to ignore.

      This is true even if the sysadmin knows why it failed and wants to fix the root cause, because factors such as “we have no budget to add memory” mean the sysadmin knows there’s no point in even trying.

  • Bob, thank you again for another one on the mark. While I may no longer be in IT, I do get something out of your messages, which is why I still subscribe. This time I was pleasantly surprised to see the Naval Reactor Program mentioned. My oldest son is a graduate of that program and is currently aboard USS Nimitz, deployed to the Western Pacific. I’ve printed this week’s message on “Wabbits” off for him and plan to send it this week with the regular Sunday newspaper comics, editorial and Monday football sports section.
    Thanks again for keeping it going in the face of Wall Street and industry in general who just don’t ‘Get It’. I’m glad you do.
    And my offer still stands to buy you the beverage of your choice if you ever get to Philly.
    Regards,
    Frank Walter
    Chief, Cook and Bottle Washer Handy Frank, LLC

  • Hi Bob,

    I think you are spot on the money, except for number 3, which I disagree with quite strongly.

    “Spreading the knowledge” doesn’t have to mean rigid adherence to documented processes. It could mean:

    – a culture of apprenticeship and/or mentoring
    – highly integrated and valued social networks, either f2f or web 2.0
    – strong organisational narratives which tacitly reinforce important lessons
    – a culture that encourages “safe to fail” experimentation as an important tool to solve complex or “wicked” problems

    It’s a problem that Knowledge Managers such as myself spend a lot of time wrestling with. I use a model of knowledge activities that feed on each other to promote “knowledge integration” within an organisation. This is just another way to talk about “spreading the knowledge”.

    You can view my Objectives Chain model here. In this model, documented processes (termed in my diagram as “knowledge codification”) are just one of the elements.

    Most critically, the model specifies that documented processes are no use if they aren’t distributed effectively and/or used by people to actually learn.

  • Bob, can we figure out how to kidnap Congress for a couple days and train them in how to avoid intellectual relativism? How about voters? You eschew any more than a tangential jab at the political sphere, but I think I know you know there’s more important work to be done there than in IT or any industry. In fact, their proclivity for intellectual dishonesty engenders the same in the industries that use government as their agent. Pure Darwinism, but headed for a crash – the fitness landscape rewards bad behavior such as eating your own seed corn.

    Apropos of nothing relevant to IT, but I had to vent.

    • Flattery will get you anywhere, but since I have neither competence nor aptitude for politics, the best I can do is spread the notion of a culture of honest inquiry and similar notions and hope someone who is good at the trade picks up on them.

      Still, until someone figures out a way that adhering to such a thing would garner votes instead of losing them, I don’t think we’d achieve very much by kidnapping and indoctrinating Congress, or any other political body for that matter.
