I was sitting with Moe, Larry, and Curly at lunch the other day (not their real names, but I feel an obligation to protect the guilty) when the conversation turned to information technology.

My colleagues (we’ll call them S3 for short) recently left the military, so their perspective on IT is a bit broader than that of most IS professionals. Moe led off with a mention of genetic algorithms. Here’s how these amazing things work: You feed the computer any old airplane wing design (for example) and a definition of what it means for a wing to be optimal. Let the computer churn for a day or two, and just as an automatic bread-maker magically produces bread, it will pop out an aerodynamically perfect wing design.

The algorithm is called “genetic” because it mimics evolution, randomly mutating the design in small increments and accepting those mutations that improve the design. Very cool stuff. If you support an engineering design group, this technology is in your future.
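Here’s a minimal sketch of that mutate-and-select loop in Python. Everything in it is invented for illustration: the “design” is just a list of four numbers, and the fitness function stands in for whatever the aerodynamicists would supply. (A full genetic algorithm also breeds a population of designs and crosses them; like the column’s summary, this sketch keeps only mutation and selection.)

```python
import random

def evolve(design, fitness, generations=10_000, step=0.05):
    """Mutate-and-select loop: perturb the design at random,
    keep only the mutations that score better."""
    best, best_score = list(design), fitness(design)
    for _ in range(generations):
        candidate = [x + random.gauss(0, step) for x in best]
        score = fitness(candidate)
        if score > best_score:  # accept improvements, discard the rest
            best, best_score = candidate, score
    return best, best_score

# Stand-in problem: "optimal" means matching some target parameters.
target = [1.0, 0.5, -0.25, 0.125]
def fitness(design):
    return -sum((d - t) ** 2 for d, t in zip(design, target))

wing = [0.0] * 4  # any old starting design
best, score = evolve(wing, fitness)
print(best, score)
```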

From there, Curly somehow got to artificial intelligence, and in particular the AI golf caddy. Apparently, these little robots actually exist, following you around the golf course and recommending the perfect club for every shot. Larry pointed out the hazards of combining the AI caddy with Y2K: “Carnage on the course,” he called it.

If you haven’t noticed, people are doing amazing things with computers these days. So why is it that most IS departments, in most projects, can’t seem to design a database, create data-entry and transaction-entry screens for it, design and code a bunch of useful reports, and hook it all to the legacy environment without the project going in the ditch?

When I started in this business, a typical big project needed 25 people for three years and was completed about a year after the deadline — if it got completed at all. Compared with the simple compilers we had when I started programming, our integrated development environments should easily make us 100 times more productive. So why is it that as I write this column, a typical big project needs 25 people for three years and is completed about a year after the deadline — if at all?

Do the math, people: 25 people for three years is 75 person-years; divide by 100, and one programmer should complete everything in nine months. What’s the problem?

It isn’t, of course, quite that simple. It also isn’t that complicated. Try this: Start with a small but useful subset of the problem. Then understand the data and design the database. Create edit programs for each table. Work with end-users to jointly figure out what the update transactions are, and design transaction-entry screens for each of them. Design a navigation screen that gets you to the edit and transaction screens. Build a simple batch interface to the legacy environment. Do it as fast as you can. Don’t worry about being sloppy — you’re building Quonset huts, not skyscrapers.
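To make “edit programs for each table” concrete, here’s a minimal sketch in Python. The customer table, its columns, and the pilot.db file are all hypothetical; the point is that a Quonset-hut edit screen can be a few dozen throwaway lines per table.

```python
import sqlite3

# Hypothetical table and file names, purely for illustration.
conn = sqlite3.connect("pilot.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS customer "
    "(id INTEGER PRIMARY KEY, name TEXT, phone TEXT)"
)

def edit_customer():
    """Quonset-hut edit screen: list every row, then add or change one."""
    for row in conn.execute("SELECT id, name, phone FROM customer"):
        print(*row, sep=" | ")
    row_id = input("id to edit (blank for a new row): ").strip()
    name = input("name: ")
    phone = input("phone: ")
    if row_id:
        conn.execute("UPDATE customer SET name = ?, phone = ? WHERE id = ?",
                     (name, phone, row_id))
    else:
        conn.execute("INSERT INTO customer (name, phone) VALUES (?, ?)",
                     (name, phone))
    conn.commit()

if __name__ == "__main__":
    edit_customer()
```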

Put it all into production with a pilot group of end-users for a month. Turn your programming team into end-users for that period so they experience their system in action first-hand. At the end of the month, start over and do it all again, this time building the system around how the pilot group wants to work. After a month with the new system, they’ll have all kinds of ideas about what a system should do for them.

Build Version 2 more carefully, but not too much more carefully because you’re going to loop through the process one more time before you’re done. In parallel with Version 2, though, start building the infrastructure — real-time legacy interfaces, partitioned business logic and so on — that you’ll need for Version 3, the production application that needs a solid n-tier internal architecture and production-grade code.

Does this process work? It has to — it’s just a manual version of a genetic algorithm. I’ve used it on small-scale projects where it’s been very successful, but haven’t yet found anyone willing to risk it on something bigger. Given the risks of traditional methodologies, though (by most estimates, more than 70 percent of all IS projects fail), it almost has to be an improvement.

I’m jealous of political pundits. They get to write about global warming, economic protectionism, and cloning (for example) without any particular expertise in the relevant disciplines, influencing public policy in the process.

Not me. All I get to write about is how to run IS. My writing about Zippergate, for instance, would be about as classy as the Hollywood actors who make political speeches during the Academy Awards (although that may be preferable to the current practice of thanking everyone in the telephone directory).

That’s why I’m so thankful for Department of Justice vs. Microsoft. It lets me comment on a major issue of public policy without straying from my purported area of expertise. Here’s my comment: As of this writing, Microsoft seems to be counting on one of three legal possibilities: 1) it isn’t really a monopoly, because even though it is one right now it won’t be forever, so that makes everything OK; 2) the antitrust laws are a bad idea, so don’t enforce them; or 3) even if it’s found guilty, the penalty won’t hurt very much. The “consumers haven’t been hurt” argument is, of course, irrelevant grandstanding: the question is one of using its monopoly for competitive advantage, not one of price-gouging.

The biggest impact of this trial, of course, is all the free publicity it’s given Linux. Ain’t irony grand?

Last week’s column presented a simple formula for predicting the success or failure of new technologies and technology products, using past products as examples. This week we’ll apply it to the current technology scene, starting with Linux.

The formula, you’ll recall, was customers/affordability/disruption: a successful new product must do something worthwhile for the customers (the people making the buying decision, as opposed to the consumers, who use the product); it must be affordable; and it must not disrupt the current environment. (Disruption, by the way, is a big reason companies like Microsoft, which dictate the architecture in particular environments, have the incumbent’s huge advantage.) Let’s start predicting! (If you want the scoring logic in code, a sketch follows the scorecards.)

Linux (as a server) – Main customers: Webmasters and underfunded system administrators. Benefit: Runs reliably and fast. Affordability: Free, or nearly so. Disruption: As a Web server or file-and-print server, it integrates invisibly into the network (unless Microsoft can make it disruptive through proprietary server-side innovations like Active Server Pages). Score: Perfect – Linux is a winner.

Linux (on the desktop) – Main customers: End-users. Benefit: Fast, less-crash-prone PC. Affordability: Free except for the (not overwhelming but not trivial) time needed to learn it. Disruption: The office suites for Linux don’t reliably read and write the Microsoft Office file formats – that is, there’s a significant delay before the Linux suites catch up to each new Office release, and even then they’re glitchy. Score: Iffy.

Personal Digital Assistants (PDAs) – Main customers: End-users. Benefits: Lightweight, carries around essential information, IS doesn’t get to say what users can and can’t do with it. Affordability: Very. Disruption: Unless an installation goes south, they’re invisible to IS. Score: Perfect – PDAs are a winner.

XML – Main customers: IS developers. Benefits: Universal, awesomely adaptable file format. Affordability: Open standard, learnable without needing tensor calculus. Disruption: A complicated question. As metadata (think data dictionary on steroids) there’s no significant installed base to disrupt. As an office-suite file format, either Microsoft’s XML tags will do the job for everyone or it won’t get off the ground. As a replacement for HTML it’s highly disruptive until it’s built into the browser. Then it’s nondisruptive. Score: Very high – XML has enough different applications that it’s bound to succeed in at least some of them.

Java – Main customers: IS developers. Benefits: Nicely designed object-oriented language, automatic garbage collection, possibly portable (the jury’s out on this one). Affordability: As affordable as any other programming language. Disruption: A complicated issue. For established developers it’s disruptive – having some of a product in Java and the rest in a compiled language is messy, and probably won’t happen. For new developers it’s nondisruptive, except for the performance issues compared with any compiled language. Score: Adequate – Java will become just another programming language.
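If you like your formulas executable, here’s a toy encoding of the test for the first three scorecards. The 0-to-5 numbers are my own rough translation of the verdicts above; the one real point is the combining rule. The factors don’t add, they gate: take the minimum, because a single disqualifying factor sinks the product.

```python
# Toy encoding of the customers/affordability/disruption test for the
# first three scorecards above. The 0-5 numbers are invented; only the
# combining rule matters: every factor must clear the bar, so take the min.
products = {
    "Linux (server)":  {"benefit": 5, "affordability": 5, "nondisruption": 5},
    "Linux (desktop)": {"benefit": 4, "affordability": 4, "nondisruption": 2},
    "PDAs":            {"benefit": 5, "affordability": 5, "nondisruption": 5},
}

for name, scores in products.items():
    weakest = min(scores.values())  # one bad factor sinks a product
    verdict = "winner" if weakest >= 4 else "iffy"
    print(f"{name}: {verdict} (weakest factor: {weakest})")
```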

Well, there you have it. Like a good math text, you now have a formula and examples. Go forth and prognosticate.