Back when the earth was young and dinosaurs … uh, batch mainframe processing … ruled the world, you picked a primary vendor, usually IBM or Digital, and your vendor defined your technical architecture.

IBM’s whole business strategy, in fact, revolved around control of the architecture. Even if you bought an Amdahl mainframe, Memorex controllers, and Telex terminals, IBM still defined the architecture. “Technical architecture management” meant buying and upgrading the components IBM mapped out.

Then IBM missed the boat on LANs, or maybe it caught the boat while the rest of us went to the airport. SAA failed and we replaced our SNA WANs with TCP/IP. Open systems and rampant multivendorism have given you control of your own architecture. Now you have to manage it yourself. Be careful what you ask for …

Yes, we’re back to developing our integrated IS plan. An integrated IS plan, you’ll recall, covers three topics: Company Goals, Technical Architecture, and Human Factors. We covered company goals in June, discovering the company’s strategic, tactical, and operational goals and learning how to translate them into systems concerns.

Now it’s time for technical architecture.

One of the hardest ideas to get right in defining technical architecture is thinking at the right level. Technical architecture describes your systems environment in terms of functional building blocks and how they interact, not specific items and wiring. If “functional building block” isn’t a clear enough term, an example may help. “Systems to help us manage our information” is too vague to be of any use, while “Sybase running on mirrored AIX servers” has too much detail, locking you into a specific technology.

For data management, you want to say something like, “We’ll support a mainframe RDBMS and a distributed object-relational database. We’ll also support whatever other data management systems are built into our legacy environment but will take advantage of opportunities to align them with our preferred architecture. Under some circumstances we may also choose to accept non-conforming technologies built into turnkey or packaged solutions, but will give preference to those conforming to our technical architecture.”
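
If it helps to see what a statement like that looks like once you write it down as a working standard, here’s a minimal sketch in Python, purely illustrative: the tier names and the tier_of helper are hypothetical, and nothing about them is sacred except the idea of naming your preferred, tolerated, and exception categories so new proposals can be checked against them.

    # A minimal sketch, not a prescription: recording a data-management
    # standard as acceptance tiers so proposals can be checked against it.
    # The tier names and the tier_of() helper are hypothetical examples.
    DATA_MANAGEMENT_STANDARD = {
        "preferred": {"mainframe RDBMS", "distributed object-relational DBMS"},
        "legacy":    {"data stores built into the legacy environment"},
        "exception": {"stores bundled with turnkey or packaged solutions"},
    }

    def tier_of(candidate):
        """Return the tier a proposed technology falls into, or
        'non-conforming' if the standard doesn't cover it."""
        for tier, members in DATA_MANAGEMENT_STANDARD.items():
            if candidate in members:
                return tier
        return "non-conforming"

    print(tier_of("mainframe RDBMS"))  # -> preferred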

Technical architecture divides into three major layers: Platforms, Information, and Applications. Taking the last first, applications are what deliver business value. They’re the point of it all, because the rest of the company interacts with applications, not with information or platforms, except for the PC keyboard and monitor and the telephone handset and touchtone pad on each employee’s desk. The company goals you developed in the previous section of your integrated plan drive changes to your portfolio of applications.

Information includes everything your applications chew on to produce results. The subject includes databases, word processing documents and spreadsheets, scanned images, Web pages (whether stored in HTML or dynamically generated) … that kind of stuff.

Object-oriented (OO) technology doesn’t change this, even though OO designs wrap data and programs (methods) together into tight bundles. You still manage code with OO programming, and you still have a persistence layer to store every instance of an object … and it’s the information that’s unique in every instance of the object, not the methods.
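
To make that concrete, here’s a minimal sketch in Python (my illustration; the Customer class and its fields are hypothetical): every instance shares one copy of the methods, so the persistence layer only has to store each instance’s data.

    # Illustrative sketch: instances share methods; only their data differs,
    # and only the data needs to be persisted. Customer is a hypothetical example.
    class Customer:
        def __init__(self, customer_id, name, balance):
            self.customer_id = customer_id   # unique per instance -> persisted
            self.name = name                 # unique per instance -> persisted
            self.balance = balance           # unique per instance -> persisted

        def apply_payment(self, amount):     # shared code -> not persisted
            self.balance -= amount

        def to_record(self):
            """What the persistence layer actually stores: state, not methods."""
            return {"customer_id": self.customer_id,
                    "name": self.name,
                    "balance": self.balance}

    a = Customer(1, "Ada", 100.0)
    b = Customer(2, "Bob", 250.0)
    # Both instances are backed by the same method code ...
    assert a.apply_payment.__func__ is b.apply_payment.__func__
    # ... but each produces its own record for storage.
    print(a.to_record(), b.to_record())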

Don’t count your database management systems in your information layer. A DBMS is a platform along with every other piece of hardware and software you use to build applications and manage information. The platform layer includes host computers, servers, operating systems, DBMSs, all networking equipment, your PBX, and the software you use to manage it all.

The split between the systems management programs that belong in the platform layer and the business processing that belongs in the application layer can seem fuzzy. The rule is: If it delivers direct business value, it’s an application. If it helps provide computing resources, it’s a platform.
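
If you want that rule in a form you can apply to a systems inventory, here’s a tiny sketch (again in Python, with hypothetical entries):

    # Sketch only: the rule of thumb above as a tagging helper for a
    # systems inventory. The inventory entries are hypothetical examples.
    def classify(delivers_direct_business_value):
        return "application" if delivers_direct_business_value else "platform"

    inventory = {
        "order-entry system":       classify(True),   # application
        "relational DBMS":          classify(False),  # platform
        "network monitoring suite": classify(False),  # platform
    }
    print(inventory)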

Over the next month we’ll look, layer by layer, at how to manage your technical architecture. And you thought you were having fun now.

In July of 1973 I returned from a semester abroad in Guatemala to find that gasoline was in short supply, prices had tripled, and if you wanted to fill your tank you had to wait in line.

A long line.

Then some wise guy started rumors of a toilet paper shortage. Predictably, huge crowds of worried consumers descended on supermarkets around the country like hordes of locusts on wheat crops, snarfing up every package of the stuff they could, stockpiling this vital commodity against the predicted dearth.

There was, of course, no shortage. The expectation, though, had the same impact as a real one, although for a shorter time.

Employers perceive the existence of a serious shortage of IT professionals right now. So why do so many give the employees they have so little reason to stay?

We’re all nuts. As evidence, the June 29 issue of Business Week, citing the Bureau of Labor Statistics, said that over the past decade programmers’ pay has lost 1.5 percent to inflation. Here’s a hint to all you capitalist geniuses out there who run our companies: The law of supply and demand says that when something is in short supply and high demand, either prices go up or you get a shortage.

If there really is a shortage, shouldn’t companies be trying to reduce turnover by treating employees better and paying them more? It’s more affordable than spending the full year’s salary plus benefits it generally costs to replace each employee who leaves.

Maybe this means there is no shortage. The statistics cited to demonstrate the shortage show that while 95,000 new IT jobs will be created this year, only 25,000 new computer science majors will graduate.

Inferring a shortage from this data turns out to be wrong. I’m indebted to fellow Perot Systems-ite Robert Fendley for pointing me to the evidence – research by Norman Matloff at the University of California at Davis (check out http://heather.cs.ucdavis.edu/itaa.real.html for more details).

Matloff’s research is revealing. It turns out that about 25 percent of today’s IT workers have computer science degrees. Now let’s see … 25,000 computer science graduates divided by 95,000 new jobs comes to about 26 percent … well, I’ll be hornswoggled! We’re in exactly the same shape we’ve always been in.

What a surprise. Want to quadruple the number of qualified applicants? If you’re screening out applicants who lack computer science degrees, you have an easy solution. (Something to ponder: Since most hiring managers lack computer science degrees themselves, does this mean they wouldn’t give themselves an interview?)

A lot of our shortage is self-inflicted. The absolutely stupid practice of requiring computer science degrees, which causes HR to keep three-quarters of your potential workforce away from you, is just the most obvious example. (Memo to our competitors: Please keep on doing this. Thanks.)

Here’s another example of how most of the problem stems from our own ridiculous expectations: Many of us hire “only top-quality applicants.”

About one in 10 IS professionals I’ve known has been top quality. That isn’t surprising, though, since I define “top quality” as being among the upper 10 percent. The entire workforce could double in ability and we’d still have a shortage of top-quality people.

I’m in favor of hiring great people, but you have to be realistic. Want to hire only the best? Pay top dollar and create great working conditions. The best can afford to be very choosy.

Your alternative: Hire some of the best. Also hire some journeymen programmers, and implement great processes so they can maximize their contribution to your success. And be willing to train promising applicants who have the right aptitude and attitude, understand the business, and want to learn technology.

I’ve heard from an awesome number of IS Survivalists whose backgrounds are in mathematics, physics, chemistry, the military, anthropology, international studies, or clerical work. Despite their lack of computer science training, they are successful IS professionals.

Often they are more successful than their computer science co-workers, in fact, because these supposedly less-qualified people acquired their skills solving real-world problems and stayed in the field because they showed both an aptitude and an affinity for the work.

Why, oh why, do so many companies deliberately ignore people like this?