From the IS Survival Mailbag …

My recent column on the Year 2000 raised both the ire and scholarship of the IS Survivalist community.

Quite a few readers took me to task for proposing that both the “decadist” camp (the millennium ends on the decade boundary … December 31, 1999) and the “centurist” camp (December 31, 2000) have legitimate claims.

A few disagreed with my fundamental premise, insisting 1990 was the last year of the 1980s. To them I respond, “thlppp!” Never let it be said I stray from the high road.

Others explained that decades don’t matter — since there was no Year 0, 2,000 years won’t have passed until midnight, December 31, 2000. Jim Carls wrote to explain why they’re wrong:

“If you look it up in your history book, do the math and assume that historical accuracy is of some importance in defining the start of the Third Millennium Anno Domini, the latest point at which the millennium could start was in 1997 (Herod the Great died in 4 BC). And, according to Stephen Jay Gould (interviewed last week on PBS), the latest possible date was October 23rd. Let’s all bring that up in the next planning meeting!”

I say we start celebrating December 31, 1999 and don’t stop until January 1, 2001. Which means the real Year 2000 crisis will be a severe grape shortage. Vineyards … start planting!

Another group of correspondents took issue with the idea that the two-digit year was a feature, rather than a bug, and that it made good business sense at the time. Quite a few e-mails pointed out that a two-byte integer field, holding a count of days from a fixed starting point, could have stored 179 years’ worth of dates, avoiding the problem for some time to come. Others questioned how much money programmers saved by not using a four-position year.

The first group would be right if storage were the only issue. Backtrack 25 years, though, and figure out how many iterations of Moore’s Law we have to undo. Computers had, I’d guess, about 1% of today’s processing power. The computation time needed to convert dates to and from integer format would have greatly extended batch processing times, which would have been very expensive. Tim Oxler invites everyone to visit a Web page he put up to discuss this in more detail: http://www.i1.net/~troxler/html/space.html.
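For the curious, here’s roughly what that trade-off looks like in code (a sketch in C rather than the COBOL of the era; the 1900 epoch and the names are mine, purely for illustration). Two bytes hold a count of days, 65,536 of them, which comes to about 179 years, and every display or month-and-day comparison pays for the arithmetic that turns the count back into a calendar date:

    #include <stdint.h>
    #include <stdio.h>

    /* Days since 1900-01-01 (an arbitrary illustrative epoch) packed into 16 bits:
       65,536 days is roughly 179 years, so this field is good until mid-2079. */

    /* Calendar date to day count since 1970-01-01 (standard civil-date arithmetic). */
    static long days_from_civil(int y, int m, int d)
    {
        y -= m <= 2;
        long era = (y >= 0 ? y : y - 399) / 400;
        unsigned yoe = (unsigned)(y - era * 400);                        /* [0, 399] */
        unsigned doy = (153u * (m + (m > 2 ? -3 : 9)) + 2) / 5 + d - 1;  /* [0, 365] */
        unsigned doe = yoe * 365 + yoe / 4 - yoe / 100 + doy;            /* [0, 146096] */
        return era * 146097L + (long)doe - 719468L;
    }

    /* Day count back to a calendar date -- the conversion batch jobs would repeat
       millions of times per run. */
    static void civil_from_days(long z, int *y, int *m, int *d)
    {
        z += 719468L;
        long era = (z >= 0 ? z : z - 146096L) / 146097L;
        unsigned doe = (unsigned)(z - era * 146097L);
        unsigned yoe = (doe - doe / 1460 + doe / 36524 - doe / 146096) / 365;
        long yr = (long)yoe + era * 400;
        unsigned doy = doe - (365 * yoe + yoe / 4 - yoe / 100);
        unsigned mp = (5 * doy + 2) / 153;
        *d = (int)(doy - (153 * mp + 2) / 5 + 1);
        *m = (int)(mp + (mp < 10 ? 3 : -9));
        *y = (int)(yr + (*m <= 2));
    }

    #define EPOCH_1900 (-25567L)   /* 1900-01-01 expressed relative to 1970-01-01 */

    int main(void)
    {
        /* Pack 1998-02-15 into the two-byte field. */
        uint16_t stored = (uint16_t)(days_from_civil(1998, 2, 15) - EPOCH_1900);

        /* Unpack it again for display. */
        int y, m, d;
        civil_from_days((long)stored + EPOCH_1900, &y, &m, &d);
        printf("%u days since 1900-01-01 -> %04d-%02d-%02d\n", (unsigned)stored, y, m, d);
        return 0;
    }

Multiply that conversion by every record in every nightly batch run on a 1970s processor and the savings in disk space start to look a lot less free.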

The second group raises an interesting question. Leon Kappelman & Phil Scott answer it at http://comlinks.com/mag/accr.htm. Short version: The savings have been huge, far in excess of even the largest Year 2000 cost estimates.

And then there’s the other point — my contention that the world will muddle through as usual, neither blowing up nor sailing through unscathed. Robert Nee wrote to formulate this more precisely. He points out that the basic laws of supply and demand in a market-based economy predict that for every company that goes bankrupt due to Year 2000 problems there will be others that pick up the slack, both in terms of supplying goods and services, and in terms of employment.

This is a wonderful insight. Yes, lots of companies will fail. Yes, lawyers will file trillions of dollars worth of lawsuits, bayoneting the wounded to make sure as few companies recover as possible. (To the gathering flock of vultures now soliciting Year 2000 whistleblowers I’d like to make a simple comment. I’d like to, but I’m not sure libel laws permit it.)

In the end, though, demand will drive supply, and so long as whole industries don’t fail, suppliers that are Year 2000 compliant will buy the bloody remains of those that aren’t, providing enough supply to satisfy demand and enough employment to keep everyone working as they do so.

Which, in turn, hearkens back to another point made frequently here: Many of the best investments in IT are those focused on your company’s survival, whether they’ll deliver measurable returns or not.

Two recent Microsoft communiques have sent my Absurd-O-Meter off the scale.

First there’s the Microsoft ad imploring you to install Windows NT Workstation because it’s faster, more robust, and yada yada yada. What will you be upgrading from? Windows 95, of course, which I guess must be slower and more fragile.

Who writes this copy? News flash to Microsoft: You’re supposed to badmouth your competitors’ products, not your own.

Then there’s the new Palm PC, for which the company that once claimed ownership of the word “windows” borrowed “Palm” from a popular personal digital assistant. It’s going to run — I’m not making this up — a “stripped down version of Windows CE.”

Shows what I know. I thought Windows CE was the stripped down version. WinCE indeed.

To punish the perpetrators of this nonsense, this week we’ll discuss defenestration (OK, I’m reaching) strategies.

We can’t throw out Windows on the desktop. That really is too risky. Instead …

Microsoft has built its Windows NT Server strategy on Moore’s Law (which predicts that the bang-per-buck ratio will double every 18 months) and on the average CIO’s overall nervousness regarding Unix.

In the minds of most CIOs, NT is safer and easier to learn than Unix’s notorious grab-bag of in-joke commands and semi-compatible versions. It’s more versatile than NetWare, since it can act as both file and application server.

And it doesn’t threaten MVS (now OS/390) because, comparatively, it’s still a toy.

That’s fine with Bill Gates. With every iteration of Moore’s Law, NT can take over more of OS/390’s turf, even without Microsoft investing in product improvement.

But NT is, of course, far inferior to many of its head-to-head competitors, at least from the perspectives of performance and stability. Be honest. Isn’t there a part of you that wishes you could use Linux instead? Too bad you can’t take the risk.

Well, you can run part of your business on Linux with no risk at all. All you have to do is break free of the we-gotta-have-a-standard mentality and replace it with a what-standards-do-we-need mentality.

When it comes to Web servers you don’t need a standard operating system, because Web server software shields everyone from the OS. You can run your whole corporate Intranet on Linux (or NetWare, or Solaris, or a different OS on every server) with no compatibility or integration worries. And since most of your NT alternatives can handle at least twice the processing load that NT can on a given piece of hardware, and are more stable besides, there’s a direct business benefit. (Truth in packaging department: In saying this I’m relying on reviews and the opinions of knowledgeable friends, not on direct experience.)
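Here’s why the OS underneath doesn’t matter: the browser speaks HTTP to the server, and HTTP reads exactly the same whether the box answering runs Linux, NetWare, Solaris, or NT. A minimal sketch in C (the host name and page are placeholders, and error handling is trimmed) shows everything a client ever sees of your server:

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <netdb.h>
    #include <sys/socket.h>

    /* Fetch a page from an intranet server. Nothing in this exchange reveals --
       or depends on -- the operating system serving the request. */
    int main(void)
    {
        const char *host = "intranet.example.com";   /* placeholder host name */

        struct addrinfo hints = {0}, *res;
        hints.ai_socktype = SOCK_STREAM;
        if (getaddrinfo(host, "80", &hints, &res) != 0) return 1;

        int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0) return 1;

        /* The standard that matters is HTTP, not the server's OS. */
        char request[256];
        snprintf(request, sizeof request,
                 "GET /index.html HTTP/1.0\r\nHost: %s\r\n\r\n", host);
        write(fd, request, strlen(request));

        char buf[1024];
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf)) > 0)
            fwrite(buf, 1, (size_t)n, stdout);

        close(fd);
        freeaddrinfo(res);
        return 0;
    }

Swap whatever sits behind that host name for any machine that answers on port 80 and the client neither knows nor cares.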

How about the dreaded cost of training? I’ll bet you have a few adventurous employees who’d be absolutely delighted to invest personal time learning Linux on their own, and some NetWare hold-outs who’d be thrilled to extend that environment rather than have you phase it out despite its technical superiority.

Cost of administration? C’mon, these are Web servers. You don’t have thousands of logins to administer … and besides, Unix and NetWare both have very strong tools for administration.

Okay, I hear you say, but what if Linux (or NetWare, or whatever) vanishes from the landscape?

No problem. Simply install Cairo (it should ship by then) and copy the files. Since you’re dealing with true cross-platform standards, you’re safe.

Each platform decision you make has its own risk/reward dynamics. When you enforce a one-size-fits-all strategy you lose your ability to optimize.

The beauty of this defenestration strategy is that while your company benefits, you help preserve a diverse operating system marketplace.