According to published reports, Internet Explorer 5.0 will occupy about 50 megabytes of your hard drive.

Now I’m a broad-minded soul, and a firm believer in letting every single adult American make his or her own trip to perdition in his or her own fashion. If Microsoft wants to ship a 50 megabyte browser, bless its heart. If you want to invest a dollar’s worth of hard drive to install it, bless your heart.

And if Microsoft wants to keep on calling a product it packages and ships as a discrete entity for several OS platforms an integral part of the operating system … well, that’s what we have the Department of Justice for, I guess.

Calling it an integral part of the operating system is an interesting enough claim. I enjoy the folks who call it a “thin client” even more, since it isn’t thin and isn’t really a client, either, merely a platform on which the client software can execute.

Blessed with complete ignorance of all facts, I’m confident Microsoft could have provided identical functionality in 25 megabytes had it established compactness and performance as design goals. The code is almost certainly bloated.

But, as Arlo Guthrie once said in a different context, that isn’t what I’m here to talk to you about today.

I’m here to talk about the consumers who will complain about the bloated feature set of the product, proclaiming with great pride that they only use 10% of the features anyway. The inference they’ll want the rest of us to draw is that Microsoft should remove the other 90% of the product.

Bragging that you fail to take advantage of 90% of what a product has to offer ranks right up there with bragging about not knowing how to balance your checkbook — it’s one of the reasons I’ve suggested organizing National Boycott Stupidity Day in previous columns.

So here’s a simple suggestion to all of you who are happy in your 10% mastery of the basic tools with which you perform much of your daily work: LEARN MORE FEATURES!!!

General office software contains lots of features that can make all of us more effective at what we do. When you’re using the computer, any time you find yourself doing the identical thing more than once or getting things out of whack when you make a revision, you’re probably ignoring a useful feature that would do it for you. You may manually number a list, manually retype the name of a section heading when you refer to it elsewhere in the text, manually format bulleted lists, use the space bar to line up right-aligned columns of numbers … do you see a trend emerging?

Extend this insight to the training programs you offer. We go about end-user training all wrong. Most IS trainers teach features. To drive the cliché off a cliff, we give our end-users fish instead of a rod, reel, and bait. Think about it. How much time do your trainers spend on exercises designed to make end-users self-sufficient, able to find solutions for themselves? Probably very little — that’s usually an afterthought at the end of the two-day Basics class. Teach them to poke around in the menus and help system, though, and they’ll learn how to do all sorts of great stuff on their own, including how to not call the Help Desk.

Now here’s where it’s going to hurt. The hardest part of learning these great new techniques is knowing they exist, right? Wouldn’t it be great if the computer watched what you do and, recognizing when you could benefit from one of its features, automatically suggested it to you?

I’ve spent more than a year wanting to punch Mr. Paperclip in the kisser. I still think he’s an obnoxious little cuss. Looking at it objectively, though, (ouch!) I’m forced to conclude (groan!) that he and his compatriots are a valuable addition (eeeeyooow!) to Microsoft’s office suite.

So if you only use 10% of a product’s features, don’t brag about it. Ignorance may be bliss, but it isn’t a virtue.

I was sitting with Moe, Larry, and Curly at lunch the other day (not their real names but I feel an obligation to protect the guilty) when the conversation turned to information technology.

My colleagues (we’ll call them S3 for short) recently left the military, so their perspective on IT is a bit broader than that of most IS professionals. Moe led off with a mention of genetic algorithms. Here’s how these amazing things work: You feed the computer any old airplane wing design (for example) and a definition of what it means for a wing to be optimal. Let the computer churn for a day or two, and just as an automatic bread-maker magically produces bread, it will pop out an aerodynamically perfect wing design.

The algorithm is called “genetic” because it mimics evolution, randomly mutating the design in small increments and accepting those mutations that improve the design. Very cool stuff. If you support an engineering design group, this technology is in your future.
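The mutate-and-keep-the-improvements loop described above can be sketched in a few lines. This is a toy illustration, not anyone’s real wing-design software: the “wing” is just a list of numbers, and the fitness function and mutation step are made up for the example.

```python
import random

def optimize(design, fitness, mutate, iterations=5000, seed=0):
    """Randomly mutate a design in small increments and keep only the
    mutations that improve its fitness -- the loop described in the text."""
    rng = random.Random(seed)
    best, best_fit = design, fitness(design)
    for _ in range(iterations):
        candidate = mutate(best, rng)
        cand_fit = fitness(candidate)
        if cand_fit > best_fit:  # accept only improving mutations
            best, best_fit = candidate, cand_fit
    return best, best_fit

# Toy "wing": five parameters; this made-up fitness peaks when every
# parameter equals 1.0, so the "perfect wing" is [1.0, 1.0, 1.0, 1.0, 1.0].
def fitness(wing):
    return -sum((x - 1.0) ** 2 for x in wing)

def mutate(wing, rng):
    w = wing[:]
    i = rng.randrange(len(w))
    w[i] += rng.gauss(0, 0.1)  # small random increment to one parameter
    return w

best, fit = optimize([0.0] * 5, fitness, mutate)
```

Let it churn long enough and the design drifts toward the optimum, exactly as advertised; real engineering versions differ mainly in keeping a whole population of designs and in how expensive the fitness evaluation is.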

From there, Curly somehow got to artificial intelligence, and in particular the AI golf caddy. Apparently, these little robots actually exist, following you around the golf course and recommending the perfect club for every shot. Larry pointed out the hazards of combining the AI caddy with Y2K: “Carnage on the course,” he called it.

If you haven’t noticed, people are doing amazing things with computers these days. So why is it that most IS departments, in most projects, can’t seem to design a database, create data-entry and transaction-entry screens for it, design and code a bunch of useful reports, and hook it all to the legacy environment without the project going in the ditch?

When I started in this business, a typical big project needed 25 people for three years and was completed about a year after the deadline — if it got completed at all. Compared with the simple compilers we had when I started programming, our integrated development environments should easily make us 100 times more productive. So why is it that as I write this column, a typical big project needs 25 people for three years and is completed about a year after the deadline — if at all?

Do the math, people: 25 people times three years is 75 person-years; divide by 100 and one programmer should complete everything in about nine months. What’s the problem?

It isn’t, of course, quite that simple. It also isn’t that complicated. Try this: Start with a small but useful subset of the problem. Then, understand the data and design the database. Create edit programs for each table. Work with end-users to jointly figure out what the update transactions are, and design transaction entry screens for each of them. Design a navigation screen that gets you to the edit and transaction screens. Build a simple batch interface to the legacy environment. Do it as fast as you can. Don’t worry about being sloppy — you’re building Quonset huts, not skyscrapers.

Put it all into production with a pilot group of end-users for a month. Turn your programming team into end-users for that period so they experience their system in action first-hand. At the end of the month, start over and do it all again, this time building the system around how the pilot group wants to work. After a month with the new system they’ll have all kinds of ideas on what a system should do for them.

Build Version 2 more carefully, but not too much more carefully because you’re going to loop through the process one more time before you’re done. In parallel with Version 2, though, start building the infrastructure — real-time legacy interfaces, partitioned business logic and so on — that you’ll need for Version 3, the production application that needs a solid n-tier internal architecture and production-grade code.

Does this process work? It has to — it’s just a manual version of a genetic algorithm. I’ve used it on small-scale projects, where it’s been very successful, but I haven’t yet found anyone willing to risk it on something bigger. Given the risks of traditional methodologies, though (by most estimates, more than 70 percent of all IS projects fail), it almost has to be an improvement.