Technology … all successful technology … follows a predictable life cycle: Hype, Disillusionment, Application.

Some academic type or other hatches a nifty idea in a university lab, and industry pundits explain why it will never fly (it’s impossible in the first place, it won’t scale up, it’s technology-driven instead of a response to customer demand … you know the predictable litany of nay-saying foolishness).

When it flies anyway, the Wall Street Journal runs an article proclaiming it to be real, and everyone starts hyping the daylights out of it, making hysterical promises about its wonders.

Driven by piles of money, early adopters glom onto the technology and figure out how to make it work outside the lab. For some reason, people express surprise at how complicated it turns out to be, and become disillusioned that it didn’t get us to Mars, cure cancer, and repel sharks without costing more than a dime.

As this disillusionment reaches a crescendo of I-told-you-so-ism, led by headline-grabbing cost-accountants brandishing wildly inflated cost estimates, unimpressed professionals figure out what the technology is really good for, and make solid returns on their investments in it.

Client/server technology has just entered the disillusionment phase. I have proof – a growing collection of recent articles proclaiming the imminent demise of client/server computing. Performance problems and cost overruns are killing it, we’re told, but Intranets will save it.

Perfect: a technology hitting its stride in the Hype phase will rescue its predecessor from Disillusionment.

What a bunch of malarkey.

It’s absolutely true that far too many client/server development projects run way over the originally estimated cost. It’s also true that most client/server implementations experience performance problems.

Big deal. Here’s a fact: most information systems projects, regardless of platform, experience cost overruns, implementation delays, and initial performance problems, if they ever get finished at all. Neither the problem nor the solution has anything to do with technology – look, instead, to ancient and poorly conceived development methodologies, poor project management, and a bad job of managing expectations.

I’m hearing industry “experts” talk about costs three to six times greater than for comparable mainframe systems – and these are people who ought to know better.

I have yet to see a mainframe system that’s remotely comparable to a client/server system. If anyone bothered to create a client/server application that used character-mode screens to provide the user-hostile interface typical of mainframe systems, the cost comparison would look very different. The cost of GUI design and coding is being assigned to the client/server architecture, leading to a lot of unnecessary confusion. But of course, a headline reading, “GUIs Cost More than 3278 Screens!” wouldn’t grab much attention.

And this points us to the key issue: the client/server environment isn’t just a different kind of mainframe. It’s a different kind of environment with different strengths, weaknesses, and characteristics. Client/server projects get into the worst trouble when developers ignore those differences.

Client/server systems do interactive processing very well. Big batch runs tend to create challenges. Mainframes are optimized for batch, with industrial-strength scheduling systems and screamingly fast block I/O processing. They’re not as good, though, at on-line interactive work.

You can interface client/server systems to anything at all with relative ease. You interface with mainframe systems either by emulating a terminal and “screen-scraping,” by buying hyper-expensive middleware gateways (I wonder how much of the typical client/server cost overrun comes from the need for interfaces to legacy systems?), or by wrestling with the arcana of setting up LU 6.2 process-to-process communication.

And of course, the tools available for client/server development make those available for mainframes look sickly. Here’s a question for you to ponder: Delphi, PowerBuilder, and Visual Basic all make a programmer easily 100 times more productive than Cobol ever did. So why aren’t we building the same size systems today with 1/100th the staff?

The answer is left as an exercise for the reader.

An ongoing debate fostered by Stewart Alsop rages over when we’ll unplug the last mainframe. (Does this mean there are debates alsoped by Ed Foster? Inquiring minds want to know.)

Back in the good old days, microcomputers processed eight bits, minicomputers sixteen, and mainframes thirty-two. Then progress happened. The laptop computer I’m using to write this column has more raw processing power than the IBM 370/158 I used in 1980, and even running lowly Windows 95 it crashes less often.

Stewart has concluded we’ll never unplug the last mainframe. I’m forced to agree, because mainframe isn’t a class of technology, it’s a state of mind. The mainframe mentality – central control – has gained renewed popularity.

Sherman, set the Wayback Machine for 1980. Apple Computer dominates the fledgling personal computer market with a 6502 microprocessor, a 40-column screen, and VisiCalc. Accountants flock to this puppy. Why? Because it makes them independent of Data Processing, that’s why.

Well, progress has overtaken us:

  • Various forms of .ini files have made it impossible for end-users to be self-supporting, just as fuel injection spelled the end of home car care.
  • Local Area Networks mean our formerly independent systems now plug into a shared resource, and we may even load software from central file servers.
  • Electronic Mail and shared directories mean we ship files back and forth, which in turn means we have to agree on common file formats.

Progress is just dandy. In this case it means more powerful systems that are easier to use and provide more value than ever before. The price?

The combination of interconnectedness and maintenance complexity has given central IS a logical reason to regain the control it lost when PCs hit their growth curve in the mid-1980s.

Many IS departments now forbid end-users from loading software onto their PCs – only IS-approved standards may be used. That’s fine if IS has a standard – if your employer uses WordPerfect, why should you insist on using WordPro? – but it makes no sense when IS provides no tool at all and forces users to do without.

Another example of the trend: Not all that long ago, I heard several senior IS executives talk about the importance of getting control over all the “hidden code” that had come into being over the past ten years in their enterprises. The code in question? Formulas in spreadsheets.

Yes, these people seriously believed it would be in their companies’ best interests if IS gained control over the formulas embedded in the various and sundry spreadsheet models employees had created to help them do their jobs.

Why? Two reasons. First, some spreadsheets go into production, serving as crude database management systems that keep track of departmental information. Second, IS supposedly understands far better than end-users how to create “business rules” that encourage code re-use and logical consistency, while end-users keep re-inventing the wheel in the various spreadsheets they build.

The idea is clearly absurd (why IS should have any more say over the contents of an electronic spreadsheet than it does over one created with graph paper, pencils, and a calculator is beyond me), but the trend back to central control is gaining force.

Yes, it’s absolutely true that end-users use spreadsheets to manage databases, using the wrong tool for the job and creating maintenance headaches downstream. I use a screwdriver to open paint cans, for that matter. There are no “Paint Can Tool Police” to stop me, and if I bend the screwdriver, that’s my business.

Duplication of effort is a price companies pay for empowered employees who act independently. Inconsistent spreadsheet formulas are simply the electronic consequence of diverse perspectives about the business.

And IS isn’t all that good at consistency itself. It manages multiple databases, and equivalent fields in different databases usually have different formats, inconsistent values, and, often, subtle differences in the semantics of their definitions.

The personal computer was a key enabler of employee empowerment. Resist the trend back to mainframes. Give end-users as much freedom as you can.