My all-time favorite editing gaffe garbled a column I wrote about Y2K.

What I wrote: “The money saved dwarfed that spent on remediation.”

What InfoWorld printed: “The money saved the dwarfs that spent on remediation.”

I felt like Thorin Oakenshield with a corrupted database.

Speaking of Y2K, my recent column on COVID-19 and what you should do about it (“When Corona isn’t just a beer,” 3/2/2020) included a reminder of the KJR Risk/Response Dictum: Successful prevention is indistinguishable from absence of risk. I used the global, effective response to the H1N1 virus as an example.

Several correspondents reminisced with me about another, even better example: Global IT’s astonishingly effective response to the Y2K bug, and the ensuing certainty among the ignorati that it was all a hoax.

Y2K’s outcome was, in fact, a case study in what David Brin calls self-preventing prophecy. In the case of Y2K the problem of using two digits to represent the year in date fields, with the 19 prefix assumed, was indisputably real. The potential impact should the world fail to correct the problem was, in the aggregate, unknown and probably unknowable. Concerns ranged from the mundane — employees and customers who, according to HR and CRM systems, would have had negative ages — to the alarming but unlikely possibility of computer-controlled elevators plummeting down their shafts.
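The negative-age example is easy to make concrete. Here’s a minimal sketch of the defect and of one common remediation, “windowing” (the function names and the pivot value are illustrative assumptions, not drawn from any particular remediation effort):

```python
def parse_year_naive(yy: int) -> int:
    # The legacy assumption: every two-digit year belongs to the 1900s.
    return 1900 + yy

def parse_year_windowed(yy: int, pivot: int = 50) -> int:
    # Windowing: two-digit years below a chosen pivot are read as 20xx,
    # the rest as 19xx. The pivot of 50 here is arbitrary.
    return 2000 + yy if yy < pivot else 1900 + yy

# An employee born in 1975, processed on January 1, 2000 ("00"):
age_naive = parse_year_naive(0) - parse_year_naive(75)        # 1900 - 1975 = -75
age_windowed = parse_year_windowed(0) - parse_year_naive(75)  # 2000 - 1975 = 25
```

Windowing was the cheap fix; expanding date fields to four digits was the thorough one. Either way, the point stands: the defect was real, specific, and fixable, which is exactly why fixing it worked.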

For a more in-depth account, read “The Lessons of Y2K, 20 Years Later,” Zachary Loeb, Washington Post, 12/30/2019.

Pre-COVID-19, we knew the overall risk was high that a viral pandemic would arrive soon enough to make investing in advance preparedness worthwhile. Did we know which virus, exactly when, exactly how contagious, and exactly how virulent? Of course not. The Y2K problem was well-defined; COVID-19 wasn’t. That lack of in-advance specifics made, for some decision-makers, the fourth risk response (hope) attractive.

About all we know about the risk of future pandemics is that it’s increasing. That isn’t in any doubt because (1) a pandemic only needs one sick person to get things started; (2) every year, Earth has more persons who could become that one sick person; and (3) every year, more and more people travel to more and more destinations, and “more and more” means a higher likelihood that the one sick person could cross borders to spread their disease more widely.

But never mind all that. Observing the global response to COVID-19, we in IT should be busily patting ourselves on the back again … washing our hands before and after we do, of course.

We deserve the back-patting because of our investments in electronic mail, internal chat, file sharing technology, web conferencing systems, and secure remote access to business applications; along with, I hope, broadly available training in their use; coupled with, at this stage of our evolution, peer pressure to master at least the basics and peer knowledge-sharing to provide informal support. If the world of commerce hadn’t embraced these technologies and the idea of remote workers they support, your company’s Business Continuity Plan, sub-section Pandemic Response Plan, would be pretty much worthless.

And right now, if it weren’t for these business innovations that quietly took hold over the past decade or so, the current pandemic’s impact on the world economy would be quite a lot worse.

It was only ten years ago that I wrote “10 sure-fire ways to kill telecommuting” for InfoWorld (3/30/2009). Some readers got the joke. Even those who thought I was serious recognized that telecommuting was far from universally accepted among business leaders and managers.

Among evolutionary theorists, this sort of thing is called a “preadaptation”: a species develops some heritable trait or behavior because natural selection favored it for one reason, and sometime in the distant future the species puts that trait to an entirely different use, gaining an entirely different advantage.

For example, fish developed swim bladders to control their buoyancy. Long, long afterward the swim bladders they had as fish evolved into the lungs they needed as amphibians.

Likewise what we used to call telecommuting and now call remote work. Organizations didn’t embrace it because it would make them more resilient in the face of a global pandemic. They embraced the practice because it reduced the cost of business infrastructure, gave them access to a broader pool of talent, and let them construct project teams from a broader array of employees.

The moral of this story: You can’t predict all the ways a new technology might create value. So don’t let your governance committees stifle experimentation. You never know when an experiment might turn out to be a preadaptation.

What you do know: If you prevent the experiments, they won’t.

Make it stop!

Several decades ago, some wise pundit wrote that CIOs should be business people, not technology people. The resulting article has been republished, with slight changes in paragraph order and phrasing details, over and over again ever since.

None of these repetitions has fixed the fundamental flaw in the original. As I pointed out a year and a half ago on CIO.com, replace the “I” with any other capitalized executive middle letter and see where the logic takes you: CFOs should, according to this logic, be business people, not financial people; COOs should be business people, not operations people; CMOs should be business people whose knowledge of marketing is optional.

And yet, as if the endless repetitions never happened, here comes McKinsey to make it official: For years, we’re now told, executives have stressed the need for CIOs to move beyond simply managing IT to leveraging technology to create value for the business. This priority is now a requirement. (“The CIO challenge: Modern business needs a new kind of tech leader,” Anusha Dhasarathy, Isha Gill, and Naufal Khan, McKinsey Digital, January 2020).

I suppose I should be gratified. This iteration endorses positions we (“we” being my co-author, Dave Kaiser, and I) took in There’s No Such Thing as an IT Project (Berrett-Koehler Publishers, September 2019), not that McKinsey’s authors acknowledged our precedence.

Oh, well.

In addition to the unneeded repetition, “The CIO Challenge” also makes the Monolithicity Mistake: providing a single “new” job description all CIOs must abide by. Just as no one strategy fits all businesses, no single approach to IT leadership fits all IT organizations.

That being the case, here are a few of the alternatives available to you as an IT leader. Choose one, or create your own hybrid:

Chief IT Officer: While KJR doesn’t generally endorse the old IT-as-a-business-with-internal-customers IT organizational model (see, for example, “Customers vs Consumers,” InfoWorld, October 25, 1999), sometimes it’s the best you can do.

This model does have an advantage: If you’re running IT as a business you can hardly be accused of not being a businessperson. So long as, that is, you really do run IT as a business, complete with its own, independently derived strategy, operating model, and other accoutrements of a standalone corporation.

Chief Integration Officer: Buy when you can. Build when you have to.

As the IT applications marketplace has matured, more and more of the functionality a business manager needs to operate effectively already exists and is ready to license.

That’s in contrast to developing an application in-house, where, at the outset, you haven’t even articulated the user stories that define what it’s supposed to do.

But … license applications from multiple vendors and you’ll find their data models don’t easily mesh.

That’s what makes integration an intrinsically hard problem to solve.
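To see why, consider a trivial sketch: the same customer as two vendors’ applications might represent it. The schemas, field names, and lookup table below are hypothetical, purely for illustration:

```python
# Two vendors' "customer" records, with mismatched field names and formats:
crm_record = {"cust_name": "Pat Smith", "cust_id": "00042", "state": "MN"}
billing_record = {"customer": "Smith, Pat", "acct_no": 42, "region": "Minnesota"}

STATE_NAMES = {"MN": "Minnesota"}  # a fragment of the lookup you'd need

def to_canonical_from_crm(rec: dict) -> dict:
    # "Pat Smith" -> first/last; zero-padded string ID -> integer.
    first, last = rec["cust_name"].split(" ", 1)
    return {"customer_id": int(rec["cust_id"]),
            "first_name": first, "last_name": last,
            "region": STATE_NAMES[rec["state"]]}

def to_canonical_from_billing(rec: dict) -> dict:
    # "Smith, Pat" -> last/first; account number already an integer.
    last, first = [s.strip() for s in rec["customer"].split(",", 1)]
    return {"customer_id": rec["acct_no"],
            "first_name": first, "last_name": last,
            "region": rec["region"]}

# Both converge on one canonical shape for the same customer:
assert to_canonical_from_crm(crm_record) == to_canonical_from_billing(billing_record)
```

Multiply this by every entity the applications share, add conflicting validation rules and update timing, and the “intrinsically hard” part becomes clear.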

Beyond this, from the perspective of each application’s business owner, integration is someone else’s problem.

Therein lies an opportunity. Embrace so-called “shadow IT.” Let business owners choose their own applications. Limit IT’s role to their integration so that, metaphorically, even though the business owns several watches it still knows the time.

Chief Transformation Officer: All so-called IT projects are really business change projects or what’s the point?

Add to this another level of difficulty when it comes to making business change happen: Most business managers know how to keep things the same — to make sure their areas of responsibility run the same way tomorrow as they did yesterday, with incremental improvements, perhaps, but nothing dramatically different.

Making transformational change happen just isn’t what they know how to do.

It can be what IT knows how to do, out of self-defense if nothing else. After all, when so-called IT projects don’t deliver business benefit, it’s IT that’s left holding the bag.

Chief IT Infrastructure Officer: IT runs the data center and all of the IT infrastructure needed for business-unit-based application teams to do their work.

This was a thankless model even before cloud computing became popular. Now? If the CEO asks you to assume the role of CITIO, just say yes … to make sure you’re gainfully employed while launching the job search you start tomorrow.

Chief Strategy Officer: Welcome to the world of Digital-as-a-noun, where businesses shift their emphasis from cost-reduction to revenue enhancement and information technology is assumed, not cost-justified on a case-by-case basis.

Take it a step further: information technology isn’t merely assumed. Each new, emerging technology translates to a potential new business capability. New capabilities potentially translate to new and better products and customer experiences.

In the Digital world, then, IT drives business strategy — it doesn’t merely support it.

One drawback: driving business strategy isn’t something you’d do instead of your current job.

It’s in addition.