Party on, dudes! – Bill and Ted.
From the IS Survival Mailbag …
My recent column on the Year 2000 raised both the ire and scholarship of the IS Survivalist community.
Quite a few readers took me to task for proposing that both the “decadist” camp (the millennium ends on the decade boundary … December 31, 1999) and the “centurist” camp (December 31, 2000) have legitimate claims.
A few disagreed with my fundamental premise, insisting 1990 was the last year of the 1980s. To them I respond, “thlppp!” Never let it be said I stray from the high road.
Others explained that decades don’t matter — since there was no Year 0, 2,000 years won’t have passed until midnight, December 31, 2000. Jim Carls wrote to explain why they’re wrong:
“If you look it up in your history book, do the math and assume that historical accuracy is of some importance in defining the start of the Third Millennium Anno Domini, the latest point at which the millennium could start was in 1997 (Herod the Great died in 4 BC). And, according to Stephen Jay Gould (interviewed last week on PBS), the latest possible date was October 23rd. Let’s all bring that up in the next planning meeting!”
I say we start celebrating December 31, 1999 and don’t stop until January 1, 2001. Which means the real Year 2000 crisis will be a severe grape shortage. Vineyards … start planting!
Another group of correspondents took issue with the idea that the two-digit year was a feature, rather than a bug, and that it made good business sense at the time. Quite a few e-mails pointed out that a two-byte integer field, used as a count of days from a fixed starting date, could have stored 179 years' worth of dates, avoiding the problem for some time to come. Others questioned how much money programmers saved by not using a four-position year.
The first group would be right if storage were the only issue. Backtrack 25 years, though, and figure out how many iterations of Moore’s Law we have to undo. Computers had, I’d guess, about 1% of today’s processing power. The computation time needed to convert dates to and from integer format would have greatly extended batch processing times, which would have been very expensive. Tim Oxler invites everyone to visit a Web page he put up to discuss this in more detail: http://www.i1.net/~troxler/html/space.html.
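The readers' arithmetic holds up: an unsigned 16-bit field holds 65,536 distinct values, and 65,536 days is about 179 years. Here is a minimal sketch in Python of such a day-count encoding; the epoch of January 1, 1900 is my assumption, since the mailbag doesn't specify one.

```python
from datetime import date, timedelta

EPOCH = date(1900, 1, 1)  # assumed epoch -- any fixed date would do

def encode(d: date) -> int:
    """Pack a date into an unsigned 16-bit day count since EPOCH."""
    n = (d - EPOCH).days
    if not 0 <= n <= 0xFFFF:
        raise ValueError("date out of range for a 16-bit day count")
    return n

def decode(n: int) -> date:
    """Unpack a 16-bit day count back into a calendar date."""
    return EPOCH + timedelta(days=n)

# 65,536 days is roughly 179 years:
print(0xFFFF / 365.25)   # ~179.4
print(decode(0xFFFF))    # 2079-06-06, the last representable date
```

The catch is exactly the one noted above: every report, sort key, and screen display needs that integer-to-calendar conversion, and on mid-1970s hardware those cycles were anything but free.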
The second group raises an interesting question. Leon Kappelman and Phil Scott answer it at http://comlinks.com/mag/accr.htm. Short version: The savings have been huge, far in excess of even the largest Year 2000 cost estimates.
And then there’s the other point — my contention that the world will muddle through as usual, neither blowing up nor sailing through unscathed. Robert Nee wrote to formulate this more precisely. He points out that the basic laws of supply and demand in a market-based economy predict that for every company that goes bankrupt due to Year 2000 problems there will be others that pick up the slack, both in terms of supplying goods and services, and in terms of employment.
This is a wonderful insight. Yes, lots of companies will fail. Yes, lawyers will file trillions of dollars' worth of lawsuits, bayoneting the wounded to make sure as few companies recover as possible. (To the gathering flock of vultures now soliciting Year 2000 whistleblowers I'd like to make a simple comment. I'd like to, but I'm not sure libel laws permit it.)
In the end, though, demand will drive supply. So long as whole industries don't fail, suppliers that are Year 2000-compliant will buy the bloody remains of those that aren't, providing enough supply to satisfy demand and enough employment to keep everyone working as they do so.
Which, in turn, hearkens back to another point made frequently here: Many of the best investments in IT are those focused on your company’s survival, whether they’ll deliver measurable returns or not.