Neuroscientists use a nifty technique called “Positron Emission Tomography” to map which parts of the human brain process different kinds of thoughts and sensations. I’d bet if we PET scanned some religious fanatics, serious football fans, and the authors of the flames I received in response to my follow-up article on Network Computers a few weeks ago, they’d all be using the same cerebral structures.

Larry Ellison of Oracle coined the term “network computer” and Oracle has an NC reference specification. This is the gadget I argued against in recent columns. The Citrix WinFrame may be fabulous. The HDS @workStation may be just the ticket. Last I looked, neither is built to the Oracle reference spec.

You can call anything you want an NC – it’s a free country (expensive, but free). The companies that took advantage of free publicity by calling their various stuff “NCs” have to take the good with the bad.

One question: since Microsoft’s new license terms only let you run MS applications on MS operating systems, are you sure what you’re doing is legal? It’s debatable whether an NC running an MS application remotely is kosher or not, and Microsoft has better lawyers than God.

Speaking of definitions, I’ll bet lots of readers got excited over my exit line last week: that the opposite of “client/server” is “bad programming”. Got your attention, didn’t I?

Applications are client/server when the developer breaks out different pieces of program logic into independent, portable executables. It isn’t fundamentally different from what we’ve been doing all along with CICS, VTAM and so on, but you may want to draw a distinction. That’s cool: let’s call it client/server only when application partitioning goes beyond operating system and database management utilities to involve at least presentation logic, and maybe business rules and processes as well.

We’ve been breaking applications into independently compiled subroutines for years, so why would it suddenly start costing more when we call it “client/server” and make the pieces portable? Answer: we’re confusing several separate issues:

Building to a Platform: COBOL/CICS/3278 programmers build to an existing, stable environment. They’re just writing applications. Lots of client/server projects sink because the team has to build their ship while they’re trying to sail it. Of course it’s going to leak.

Scaling: The IBM mainframe hardware/software architecture has been optimized and refined over the years to handle high-volume batch processing. Lots of client/server projects include a goal of unplugging the mainframe in favor of cheaper MIPS. This is a great goal, and you should go for it if your system won’t include big batch runs. If it will, you’ll have to build in all sorts of nasty workarounds and kludges, and these will inflate project costs unreasonably.

You won’t win the Indy 500 with a freight train, but you also won’t economically haul grain with a fleet of Porsches.

User Interface: We used to build character-based monochrome interfaces that required users to learn both the business and the technology. Remember training call center agents on hundreds of transaction codes?

Employees learn how good an interface can be at their local PC software retailer. They rightfully hold IS to a higher standard now. Surprise! Building GUIs, with lots of interface objects, windowing, and extensive business intelligence, takes more time than building 3278 screens.

Programmer Training: We hire trained COBOL programmers. They learn in trade school or we just say, “3 years of COBOL/CICS experience” in the ad. We ask client/server development teams to learn their tools as they build applications. C’mon folks, what do you expect – perfection on the first try?

So …

When I was studying fish behavior many years ago, I presented some serious statistics to my research advisor. He said, “This is fine, but what does it mean?”

Ask this question whenever you hear silly average-cost statistics from self-styled industry pundits … except, of course, from yours truly.

If I were starting a business today, I wouldn’t dream of using Microsoft Office.

No, Corel hasn’t bribed me. This is simple economics. Corel has priced its office suite so much lower than Microsoft’s that I can’t imagine enough of an incremental benefit to MS Office to cover the spread.

Very few of InfoWorld’s readers are opening a business today, but many of you have MS Office ’97 staring you in the face. There’s no concurrent use licensing, file formats aren’t backward compatible, and it costs a lot of money compared to its competition.

Besides, many of you have expressed strong interest in the NC, which means you’re open to non-Microsoft applications. If you’re going to make your move, now’s the time.

Last I heard, IS budgets were still pretty tight, so the savings should be pretty interesting to a frugal CIO. “Except,” I can hear some of you thinking (I have mentioned my telepathic abilities, haven’t I?) “software is just a small part of the total cost of PC ownership. So this will just be nibbling around the margins.”

The numbers you hear bandied about on this subject are ridiculous. If you really believe each PC costs you eleven grand a year, put out an RFP. Any number of outsourcers will be delighted to provision your desktops for at least 20% less than that (I’d make the offer myself, but I don’t like lines that long).

So here’s what I’m going to do. Over the next few months, as a bizarre hobby, I’m going to put a better model together. InfoWorld’s Features and Test Labs may join me … we’re still working out the details. The point of this model won’t be the “real” costs. The point will be to help you probe the issues you face managing this resource, so you can make realistic decisions.

Here are the key elements this kind of model needs:

Fixed and Variable Components: Any useful financial analysis separates costs into those that don’t vary with volume and those that do. Analysis often benefits from separating “semi-variable” costs – those that increase in large steps at certain volume increments – from purely variable costs.

Partitioning of Benefit-driven Expenses and Overhead: Some expenses increase as employees take greater advantage of technology. Training is a good example: the higher your training costs, the more different applications employees know how to use, which means they’re gaining more value from their systems. On the other paw, you have to spend some money before the employee ever turns the system on. Include network connections in this category. Understanding the difference is critical.

Separation of Non-Technical Issues: I’m heartily bored with hearing how much time goes into managing hard disk space, doing backups, and all the related drivel. Take away the PC and employees will spend more time than that handling and filing paper, cleaning out filing cabinets when they get full, and wasting all the other time that gets used with non-technical alternatives. If we include these costs at all it will be as negative numbers, crediting the time saved from doing things the old ways.

Recognition of the Real World: PC cost models pretend we work rigid schedules at hourly rates, so that every hour spent in non-value-adding activities like training is lost to the organization.

It’s a great theory that ignores our day-to-day experience. When most of us take a vacation, our coworkers absorb some of our work and the rest piles up until we get back. We deal with it when we return, and our vacation (or training class, or whatever) costs our employers effectively nothing.
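To make these elements concrete, here’s a toy sketch of the kind of model I mean. Every dollar figure in it is a hypothetical placeholder I made up for illustration – the point is the structure (fixed costs, semi-variable steps, per-PC variable costs, and a negative number crediting time saved over paper methods), not the numbers.

```python
# Toy per-PC annual cost model. All dollar figures are hypothetical
# placeholders, not measured data; the structure is what matters.

def annual_cost_per_pc(num_pcs):
    """Estimate the per-PC annual cost for a fleet of num_pcs machines."""
    # Fixed: doesn't vary with volume (e.g., one license server).
    fixed = 20_000

    # Semi-variable: increases in large steps at volume increments,
    # e.g., one support tech per 250 PCs (made-up staffing ratio).
    techs = -(-num_pcs // 250)          # ceiling division
    semi_variable = techs * 45_000

    # Variable: scales with each PC. Split into overhead (spent before
    # the employee ever turns the system on) and benefit-driven expense
    # (training, which rises as employees gain more value).
    overhead_per_pc = 1_200
    benefit_driven_per_pc = 800
    variable = num_pcs * (overhead_per_pc + benefit_driven_per_pc)

    # Non-technical credit: time NOT spent handling and filing paper
    # enters the model as a negative number.
    paper_handling_credit = num_pcs * -300

    total = fixed + semi_variable + variable + paper_handling_credit
    return total / num_pcs

print(annual_cost_per_pc(500))
```

Even with invented inputs, a model shaped this way makes the step costs visible: go from 250 PCs to 251 and the per-PC number jumps, because you just hired another support tech. That’s the kind of question averaged-out industry figures can’t answer.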

Still believe each PC costs you $11K per year? Call me … I’m selling shares in a bridge crossing the East River, and I’m just sure you’ll want a piece of the action.