When presented with a silver lining, some people just aren’t happy until they can point to a nasty dark cloud.

In the case of the year 2000 bug, I wish they’d keep their fat mouths shut.

By every rational measure, the world should look at how it handled the year 2000 problem and congratulate itself for a job well done. If we managed to apply the same practicality, tenacity, and hard work to global warming and ozone depletion, we’d fix the world’s atmosphere in a few years as well and be ready to take on overpopulation.

Instead, dunderheads looking for thunderheads are second-guessing us to death.

Case in point: A recent Hartford Courant article by Denis Horgan “explains” that the year 2000 problem came about because “the computer people didn’t do their job right in the first place.”

Horgan exemplifies the kind of lout who, unencumbered by knowledge, mistakes blame for accountability. Entirely ignorant of the origins of the problem, he eschews actual research. Instead he relies on name-calling, the first resort of the intellectually lazy, blaming “…the computer world and its claque who were making so much money to fix what they had produced so very badly in the first place.”

The year 2000 problem arose from good design, not bad. The programmers who created the legacy systems that absorbed the bulk of year 2000 remediation spending faced permanent storage so costly that, depending on when a system went into production and the exact technology used, storing years as two digits instead of four saved their employers between $1 and $20 for each and every record with a date field. The money saved dwarfs the money spent on remediation.
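
To see why it was good design, consider the convention at the heart of the problem. Here’s a minimal sketch (in Python, purely for illustration; the production code in question was mostly COBOL and assembler) of the saving, two characters per date field across millions of records, and of the time bomb it set:

```python
# Illustrative sketch of the two-digit-year convention behind Y2K.
# (Python for readability; real legacy systems used COBOL/assembler.)

def years_between(start_yy: int, end_yy: int) -> int:
    """Naive arithmetic on two-digit years, the way legacy code did it."""
    return end_yy - start_yy

# For three decades this works perfectly:
print(years_between(69, 99))  # 30 -- a loan opened in '69, checked in '99

# Then the century rolls over, and "00" sorts before everything:
print(years_between(69, 0))   # -69 -- the same loan is now "minus 69" years old
```

Nothing in that arithmetic was wrong in 1969; it only became wrong 30 years later, which is exactly the point.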

Of course, anyone who blames a programmer of 1999 for the work of an entirely different programmer working in 1969 can’t be expected to understand a subject like information technology, which requires rigorous logic. Perhaps that’s why Horgan isn’t a programmer.

Horgan isn’t the only storm-cloud-finder among us, of course. So many of the weather-challenged have ridiculed year 2000 doomsayers in print that singling out one for special scorn just wouldn’t be fair. They remind me of someone diagnosed with severe cancer who recovers through the miracles of modern medicine and a lot of just plain luck, only to sneer at his doctors for overstating the severity of his condition.

The reality is that some of the worst doomsayers did overstate the problem, and we should all be grateful. If they hadn’t, too few business executives would have opened their checkbooks wide enough to fix it, and we would have faced real trouble instead of a few minor glitches. As mentioned earlier in this space, the year 2000 bug was a self-preventing prophecy, and we should thank everyone who sounded the clarion call rather than excoriating them.

Even worse than these first two categories of meteorological ninnies, though, are the inevitable complainers who acknowledge that the work was necessary and successful, but wonder if it couldn’t have been done at a lower cost.

Let’s see: IS’s overall project success rate is about 30 percent. The year 2000 remediation success rate was in the high 90s. Conclusion: We overspent on year 2000. Brilliant.

There were lots of reasons year 2000 projects got done on time, not the least of which was that the deadline was an astronomical event rather than an arbitrarily chosen date. Could it be, though, that maybe … just maybe … another important factor was that year 2000 spending rates were right, and that one problem IS has with the rest of its projects is too-tight budgets?

Year 2000 remediation was a success. A big one. When you succeed at something, what you should do is figure out what you did right and apply it elsewhere. The growing orgy of Y2K second-guessing and nitpicking is just plain idiotic.

A while back I suggested we organize National Boycott Stupidity Day, attended by invitation only, to counter the growing tendency Americans have to celebrate dumbness as a virtue. The next time you hear someone gripe about either the origins or resolution of the year 2000 problem, send them a copy of this article and let them know, from me … they won’t be invited.

I watched a bit of an infomercial for The Bible Code. It was pretty funny.

Supposedly, the authors have uncovered a complex encoding through which the Bible predicts specific events. The infomercial presented large numbers of past successful predictions as proof. Here’s a shocker: It didn’t include a single prediction of the future (at least not in the segment I watched).

Predicting the past is easy. The future is tougher.

This month we’ve been reviewing predictions made in this column. Let’s wrap it up.

Non-success of the Network Computer: Larry Ellison’s original idea, that a system using Java for its OS and downloading Java applications from servers for execution would supplant the PC, continues to go nowhere.

New Definition of Network Computing: More a hope than a prediction, I described a return to the idea of dynamically assigned, completely portable processes. I’m still hoping (Java’s increasing focus on the mid-tier leads to optimism), but so far it’s still way too hard to reallocate processes around the network.

Americanization of American Culture: Two years ago I predicted that the Internet would strengthen Americans’ heritage of semi-anarchic individualism. One year ago I presented Jesse Ventura’s election over two empty suits as evidence of the trend. This year an empty suit seems less undesirable. I still think the prediction will be borne out, but it will be accompanied by a reinforcement of the natural resentment many Americans have toward verifiable information and tight logic. Because the Internet makes fact and fabrication virtually (as it were) indistinguishable, it will let anyone rationalize any nitwit notion at all.

Globalization of American Culture: I also predicted that other cultural influences would diversify Americans’ ideas in some uncomfortable ways, using the greater comfort some other cultures have with erotica as an example. How about it? Abercrombie and Fitch sent out a highly provocative catalog. Critics loudly complained because a scene from Eyes Wide Shut, in which Tom Cruise and Nicole Kidman did the nasty on screen, was digitally altered to hide the details. I could claim victory, but I won’t — this doesn’t appear to be an Internet-based phenomenon. Evidence? Just for giggles I looked for porn on a few search engines. Few of the sites had foreign domain names. I guess we can lay claim to being sex-merchants to the world, and not the other way around.

Internet 1.5: This was my name for enhanced ISP services, such as guaranteed quality of service within an ISP’s network. This seems to be happening, but without much visibility so far. Look for 2000 to be the breakthrough year, and a division of the ISP market into small, low-cost commodity providers and large, value-added networks.

IP Telephony: I still think this will be huge, but I disagree with the industry mavens who say it will be “driven by the applications it enables.” What applications? Everything IP telephony offers has been available for years through computer-telephony integration (CTI). What will drive IP telephony is its reduced cost, easier management, and availability from the data vendors with whom IS is most comfortable. What may kill it is the odd prominence Windows NT has as a platform. Telephony requires “five-nines” reliability. NT isn’t a five-nines platform. Look for migration of IP telephony to more reliable platforms. Also look for Lucent, Nortel, and Cisco to help Microsoft improve NT’s reliability as they realize what a leaky boat they’ve invested in.
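
For anyone who hasn’t run the numbers on “five-nines”: it means 99.999 percent availability, and a quick back-of-the-envelope calculation (sketched in Python below, assuming a 365-day year for round numbers) shows how unforgiving that standard is:

```python
# "Five nines" worked out: 99.999 percent availability over one year.
# A 365-day year is assumed here for simplicity.

MINUTES_PER_YEAR = 365 * 24 * 60   # 525,600 minutes
FIVE_NINES = 0.99999

allowed_downtime = MINUTES_PER_YEAR * (1 - FIVE_NINES)
print(f"Allowed downtime: {allowed_downtime:.2f} minutes per year")  # ~5.26
```

That’s about five and a quarter minutes of downtime per year, total. The public phone network routinely delivers it; NT, circa 1999, did not.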

Linux: So far, a big success as a server and still a fringe player on the desktop, as foretold. 2000 won’t be the year for Linux on the desktop, either. Has it become “just another UNIX,” as I also predicted? Not yet … not yet. But as Linux succeeds, it’s becoming a corporate play, and the hobbyists who made it succeed will increasingly find themselves the objects of corporate America’s traditional expression of gratitude … derision and indifference.

And finally …

Y2K Movies: I predicted a glut. All we saw was one made-for-TV movie.

Sometimes, being wrong is better.