Isaac Asimov once told the tale of the world’s greatest surfer, a legend in his own mind, if nowhere else. Tired of hearing him brag, his audience challenged him to demonstrate his skills. So, taking surfboard in hand, he ran to the water’s edge where he stood still, gazing over the waves.

“Why don’t you go in?” taunted the crowd.

His response: “We also surf who only stand and wait.”

Identifying the next big wave is a challenge in our own industry, too, as is knowing when to start swimming. I alluded to this problem in my Jan. 12 column, talking about the need for CIOs to identify new and promising technologies and to actively search for their potential business impact. (See “If you wait for business needs to drive technology buys, you will fall behind.”) This, I think, is at least as important as responding to requests from business leaders.

This is an important idea. It isn’t, however, as original as I’d thought. I found this out by reading Clayton Christensen’s new book The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail, which made the point first and made it better.

I find books like this annoying. Christensen came up with my idea years before I did, and had the nerve to research it extensively and turn it into a well-thought-out program for developing and implementing corporate strategy.

How’s a poor columnist supposed to maintain his reputation for original thinking, anyway?

Christensen divides innovation into two categories, sustaining and disruptive. Sustaining innovation improves service delivery to existing markets. Disruptive innovation, in contrast, is initially irrelevant to existing markets but improves faster than market requirements until it can invade a market from below. For example:

Mainframe computers experienced sustaining innovation for years, steadily improving their price-performance characteristics. Minicomputers, less capable, were a disruptive innovation. Completely incapable of handling mainframe chores at first, they found entirely new markets — in scientific computing, shop floor automation, and departmental applications. Companies like Digital and Data General got their start not by competing with IBM (IBM asked, and its customers had no interest in minicomputers at the time) but by finding new markets for their products, markets too small for IBM to care about.

Minicomputers never did overtake mainframes in capacity. They did, however, overtake the requirements of much of the mainframe marketplace, invading from below and draining away a significant share of the market.

Companies miss the opportunities presented by disruptive technologies because they listen to their customers and deliver what those customers want. Disruptive technologies appeal to entirely different (and much smaller) marketplaces at first, so listening to customers is exactly the wrong thing to do.

Now think about how IS organizations deal with disruptive technologies. That’s right, this isn’t just an academic question. This is your problem we’re talking about.

Remember when PCs started floating into the organization? The average CIO saw business executives as IS’s “customers” and delivered what they asked for. PCs held no appeal for those “customers.” PCs were useful to analysts, clerks, and secretaries — an entirely different market, too clout-free to be visible to the CIO — until it was too late.

Eventually, networks of PCs did start handling more traditional information processing tasks, and by then IS knew less about them than the end-user community did.

Right now you’re faced with quite a few potentially disruptive technologies — personal digital assistants, intranets, and computer-telephone integration, to name just three. How do you plan to deal with them?

Here’s one plan, based on ideas from The Innovator’s Dilemma: Charter one or two small, independent groups of innovators. Detach them from IS so they aren’t sidetracked into mega-projects.

Tell them to start small and find ways to make these new technologies beneficial to the company.

And then, most importantly … leave them alone.

Two recent Microsoft communiques have sent my Absurd-O-Meter off the scale.

First there’s the Microsoft ad imploring you to install Windows NT Workstation because it’s faster, more robust, and yada yada yada. What will you be upgrading from? Windows 95, of course, which I guess must be slower and more fragile.

Who writes this copy? News flash to Microsoft: You’re supposed to badmouth your competitors’ products, not your own.

Then there’s the new Palm PC, for which the company that once claimed ownership of the word “windows” borrowed “Palm” from a popular personal digital assistant. It’s going to run — I’m not making this up — a “stripped down version of Windows CE.”

Shows what I know. I thought Windows CE was the stripped-down version. WinCE indeed.

To punish the perpetrators of this nonsense, this week we’ll discuss defenestration (OK, I’m reaching) strategies.

We can’t throw out Windows on the desktop. That really is too risky. Instead …

Microsoft has built its Windows NT Server strategy on Moore’s Law (which, loosely applied, predicts that the bang-per-buck ratio will double every 18 months) and on the average CIO’s overall nervousness regarding Unix.

In the minds of most CIOs, NT is safer and easier to learn than Unix’s notorious grab-bag of in-joke commands and semi-compatible versions. It’s more versatile than NetWare, since it can act as both file and application server.

And it doesn’t threaten MVS (now OS/390) because, comparatively, it’s still a toy.

That’s fine with Bill Gates. With every iteration of Moore’s Law, NT can take over more of OS/390’s turf, even without Microsoft investing in product improvement.
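To see how fast that compounding works, here’s a back-of-the-envelope sketch in Python. The arithmetic is mine, not Christensen’s or Microsoft’s, and the time horizons are just illustrations:

    # Back-of-the-envelope: bang-per-buck doubles every 18 months (1.5 years).
    DOUBLING_PERIOD_YEARS = 1.5

    for years in (1.5, 3.0, 4.5, 6.0):
        multiplier = 2 ** (years / DOUBLING_PERIOD_YEARS)
        print(f"After {years:.1f} years: {multiplier:.0f}x the price-performance")

Run it and you get 2x, 4x, 8x, 16x: hardware alone hands NT a sixteenfold price-performance gain in six years, and the turf it can credibly occupy grows right along with it.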

But NT is, of course, far inferior to many of its head-to-head competitors, at least from the perspectives of performance and stability. Be honest. Isn’t there a part of you that wishes you could use Linux instead? Too bad you can’t take the risk.

Well, you can run part of your business on Linux with no risk at all. All you have to do is break free of the we-gotta-have-a-standard mentality and replace it with a what-standards-do-we-need mentality.

When it comes to Web servers you don’t need a standard operating system, because Web server software shields everyone from the OS. You can run your whole corporate intranet on Linux (or NetWare, or Solaris, or a different OS on every server) with no compatibility or integration worries. And since most of your NT alternatives can handle at least twice the processing load NT can on a given piece of hardware, and are more stable besides, there’s a direct business benefit. (Truth in packaging department: In saying this I’m relying on reviews and the opinions of knowledgeable friends, not on direct experience.)
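If the shielding claim sounds too convenient, consider what a client actually sees. Here’s a minimal sketch in Python; the hostname is hypothetical, and the example is mine, not anything Microsoft or the Linux camp published:

    # A Web client speaks HTTP, not the server's operating system.
    # intranet.example.com is a made-up host; substitute your own server.
    from urllib.request import urlopen

    # This code runs unchanged whether the server is Linux, NetWare,
    # Solaris, or NT: nothing in the request or response depends on the OS.
    with urlopen("http://intranet.example.com/") as response:
        print(response.status)                    # e.g., 200
        print(response.headers["Content-Type"])   # e.g., text/html

The standard that matters here is HTTP, not the operating system underneath it, which is why you can mix server platforms freely.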

How about the dreaded cost of training? I’ll bet you have a few adventurous employees who’d be absolutely delighted to invest personal time learning Linux on their own, and some NetWare holdouts who’d be thrilled to extend that environment rather than have you phase it out despite its technical superiority.

Cost of administration? C’mon, these are Web servers. You don’t have thousands of logins to administer … and besides, Unix and NetWare both have very strong tools for administration.

OK, I hear you say, but what if Linux (or NetWare, or whatever) vanishes from the landscape?

No problem. Simply install Cairo (it should ship by then) and copy the files. Since you’re dealing with true cross-platform standards, you’re safe.

Each platform decision you make has its own risk/reward dynamics. When you enforce a one-size-fits-all strategy you lose your ability to optimize.

The beauty of this defenestration strategy is that while your company benefits, you help preserve a diverse operating system marketplace.