People get all excited about the darndest things.

I know otherwise normal people who froth at the mouth when they hear me say the millennium starts Jan. 1, 2000. No, they insist angrily, it begins Jan. 1, 2001. Don’t I know anything?

Well yes, I do. I know it’s more a matter of opinion than the hard-liners think. Why? Let’s begin with a startling realization: Decades and centuries don’t line up!

Decades take their name from their first year and run from 0 through 9, so the 1990s begin with the year 1990 and end Dec. 31, 1999. Very few people claim the year 2000 is part of the 1990s.

Centuries number from 1 through 100. That makes sense — this is, after all, the 20th century, so the year 2000 had better be a part of it.
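If you'd rather check the arithmetic than argue about it, here's a minimal sketch in Python (the function names are mine, and the century rule assumes the conventional count starting from year 1):

    def decade(year):
        # Decades are named for their first year: just drop the last digit.
        return (year // 10) * 10

    def century(year):
        # Centuries run from a year ending in 1 through a year ending in 00.
        return (year - 1) // 100 + 1

    for y in (1999, 2000, 2001):
        print(f"{y}: decade of the {decade(y)}s, century number {century(y)}")

    # 1999: decade of the 1990s, century number 20
    # 2000: decade of the 2000s, century number 20   <- new decade, old century
    # 2001: decade of the 2000s, century number 21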

The question of when the millennium begins, then, all boils down to this: Does it begin with a new decade or a new century? I say it starts with the new decade, in 2000. You’re free to wait until the new century begins, but I’m guessing you’ll miss an awesome party on Dec. 31, 1999.

And you won’t get to attend one the following year, because the world will, of course, end in the year 2000, destroyed by ubiquitous computer failures.

Just kidding. As it always does, the world will muddle through, saved by a mixture of planning, hard work, and improvisation.

I call this column the IS Survival Guide because, for the working CIO, survival is quite an accomplishment. Surviving the year 2000 will be a bigger accomplishment still.

Two big myths surround the year-2000 problem. The first is that it’s a bug. The second is that it’s a mess because somehow the end of the millennium snuck up on unwary CIOs all over the world.

Let’s explode these myths right now so you can focus on solving the problem instead of avoiding the blame.

Encoding dates with just two digits for the year, as we used to, was an intelligent design decision back in the 1960s and 1970s, when in-house and commercial programmers wrote most of our legacy systems. Storage — both RAM (we called it “core memory” back then) and disk — cost lots of money, and the best programmers were those who could squeeze the most performance into the smallest computing footprint. Saving 2 bytes per date field made all kinds of business sense, and nobody figured these systems would have to last three decades or more.
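Here's a crude sketch of how that savings turns into a bug; the two-digit fields and the tenure calculation are my own illustration, not any particular legacy system:

    def years_of_service(hired_yy, today_yy):
        # Tenure, computed the way a two-digit-year system would.
        return today_yy - hired_yy

    # In 1999 ("99"), an employee hired in 1970 ("70") shows 29 years:
    print(years_of_service(70, 99))   # 29 -- correct

    # On Jan. 1, 2000, "today" rolls over to "00" and the same math breaks:
    print(years_of_service(70, 0))    # -70 -- the year-2000 problem in one line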

Those systems are still running, either because we failed at our grandiose replacement projects (I’ve seen several of these) or because there simply has been no compelling business reason to replace systems that work just fine.

That is, it really is a feature, not a bug, and it proves once again that no good deed ever goes unpunished.

Here’s who will be punished: You, for not starting to fix the problem several years ago. And it isn’t entirely your fault.

I remember asking in 1994 whether we had any year-2000 problems, when just a few worriers first started to write about the subject. It didn’t matter. We had a tight budget, had just reduced staffing 10 percent to help the company improve its short-term profitability, and had the usual laundry list of urgent projects. The millennium would just have to wait a year or two until it became urgent.

Business has a short-term focus because Wall Street drives business strategy, and Wall Street insists on quarter-by-quarter earnings improvement. Fixing year-2000 software problems adds no new value, so until the problem reached crisis proportions last year, few companies bothered to spare any resources to fix it.

There’s plenty of blame to spread around, but let’s not. Instead, next week, we’ll look at some lessons we can learn from this fiasco.

Management Speak: Your continued input to the document editing process is appreciated.
Translation: I’ll continue to ignore your suggestions.
– Two IS Survivalists, William Allen Simpson and Dan MacNeil, independently arrived at the same conclusion