What do computer viruses and science fiction have in common?

Answer: my recent columns on these two topics both generated lively Forum discussions on InfoWorld Electric. So here’s Round 2 on these subjects, based on the Forum discussions.
My column asking whether the risk of computer viruses was overstated created something of a stir in the Forums. Most participants fell into two camps: those who had dealt with viral infections (I’m wrong) and those who hadn’t (I’m right).

Among the anecdotes, two insights stood out:

Insight #1: Distributed object technologies will create a mess. Traditional computing platforms separate data from executable code, so viruses can only hide in a few easy-to-detect locations. For the most part, you’re safe if you never boot off floppies and avoid downloading executables.

Now we have objects and Objects. (The former are object-like but don’t satisfy the purist’s definition.) The MS Word macro viruses are a pernicious form of object virus, and you ain’t seen nuthin’ yet. From a security perspective, the startup macro is an awful feature to build into a word processor or spreadsheet. Every reader of this column should send a letter to Microsoft asking for an installation option in Microsoft Office that disables startup macros for Word and Excel.
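If you’re curious what a first line of defense looks like, here’s a minimal sketch (in Java, with made-up class and file names) of the sort of crude pre-flight check a cautious user could run. It assumes – and this is only often true, not always – that the names of Word’s auto-running macros, such as AutoOpen and AutoExec, show up as plain text inside the document file; real scanners parse the file’s internal structure instead.

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;

    // Naive first-pass scan: flag documents that mention the names of
    // Word's auto-running macros. A real virus scanner parses the file's
    // internal structure; this string search is only a rough heuristic
    // and will miss macros stored in encoded form.
    public class MacroScan {
        private static final String[] SUSPECTS = { "AutoOpen", "AutoExec", "AutoClose" };

        public static void main(String[] args) throws IOException {
            File doc = new File(args[0]);
            byte[] buf = new byte[(int) doc.length()];
            try (FileInputStream in = new FileInputStream(doc)) {
                int off = 0;
                while (off < buf.length) {              // read the whole file
                    int n = in.read(buf, off, buf.length - off);
                    if (n < 0) break;
                    off += n;
                }
            }
            String text = new String(buf, "ISO-8859-1"); // one byte per char, nothing lost
            for (String name : SUSPECTS) {
                if (text.indexOf(name) >= 0) {
                    System.out.println(doc + ": mentions " + name + ", inspect before opening");
                }
            }
        }
    }

Run it as java MacroScan suspicious.doc before you double-click. Crude, yes, but it illustrates the new reality: once executable code rides inside ordinary documents, every file is a candidate for inspection.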

True Objects – downloadable Java applets, ActiveX controls, CORBA-compliant code – are worse. It’s as if we had invented metabolism but not antibodies. This is a messy subject. It’s going to take a lot of exploration to fully understand. Right now, all I know for sure is that the situation has the potential to become big-time ugly.

Insight #2: Viruses aren’t that serious a threat because we view them as a serious threat. Translation: virus hype is a self-preventing prophecy.

The whole idea of self-preventing prophecies is an important one. How often have we scoffed at a prophet of doom when doom didn’t come? In the case of viruses, one reason they don’t cause all that much data loss may be that most of us, listening to the hype, have implemented prudent precautions.

Good point. I stand corrected (I think).

David Brin, who writes really good science fiction, wrote an essay describing the idea of self-preventing prophecies. Nuclear war may be one of them. In the ’50s and ’60s, plenty of authors depicted nuclear devastation and post-apocalyptic societies. Their stories strongly influenced society’s perception of nuclear war, which in turn curbed the worst tendencies of those in a position to start one.

Which brings us to reader reaction to my column on science fiction as a better, cheaper alternative to hiring an expensive futurist.

The Forum discussion on the subject was simply huge. A disproportionate number of you, like me, read a lot of science fiction, and it had a powerful influence on our later career decisions.

InfoWorld’s readers are the greatest. D. W. Miller (a name I infer from the e-mail address) remembered the story describing wearable computers – “Delay in Transit,” by F. L. Wallace, originally published in Galaxy back in ’54 and reprinted in an anthology called Bodyguard in 1960. The computer was implanted, not worn. Not bad for 42 years ago. Miller points out that the computer’s name, “DiManche”, accurately predicted another future technology trend – the capitalization of internal letters in product names.

Michael Croft was the first of several readers to remind me that the series introducing the “replicator” concept – written by George O. Smith and originally published in the 1940s – was collected in an anthology called The Complete Venus Equilateral. Think about this: 50 years ago, a science fiction writer discussed the creation of a service economy out of the ruins of one based on manufacturing.

Charles Van Doren has pointed out that post-Renaissance society is the first in history to embrace the concept of progress – that the world can improve. Many of us, having grown up on science fiction, embrace that idea (which, of course, is why we always buy the next software release).

Every silver lining has a cloud inside it.

Every so often I think about taking the bus. I’d like to be socially responsible, reducing our country’s dependence on imported oil and lowering my personal contribution to atmospheric pollution.

Regrettably, my irregular schedule discourages the practice. After 5 p.m., the bus runs only once an hour, and if you’ve ever stood at a Minnesota bus stop in January, you understand the limits of social responsibility – and of centrally controlled, shared resources.

Yet Larry Ellison, Scott McNealy, and Lou Gerstner all want me to embrace the network computer (NC), which, bus-like, will impose unnecessary and unwanted constraints on me.

The NC concept rests on false assumptions (FAs, to use the technical term). Among them:

FA #1: The NC and the thin client are related: Nope. “Thin client” is a software architecture. Three-tier client/server architectures partition applications into independent processes that handle presentation logic, “business logic” (a catch-all for a variety of functions), and database services. You can implement thin-client applications on PCs. Heck, you can run everything on the same desktop PC and still have a thin client. It’s software, not hardware.
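To make that concrete, here’s a minimal thin-client sketch in Java. The host name, port, and one-line protocol are all invented for illustration; the point is that the desktop handles presentation only, ships every business decision to a middle tier, and does it happily on an ordinary PC.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.Socket;

    // Thin-client sketch: presentation logic only. The business logic
    // (pricing, in this made-up example) lives on a middle-tier server.
    public class ThinClient {
        public static void main(String[] args) throws Exception {
            try (Socket socket = new Socket("logic-tier.example.com", 9000);
                 PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(socket.getInputStream()))) {
                // Ask the middle tier to make the business decision...
                out.println("PRICE-QUOTE customer=1234 item=widget qty=10");
                // ...and merely render whatever comes back.
                System.out.println("Middle tier says: " + in.readLine());
            }
        }
    }

Swap the PC for an NC and nothing in this picture changes; the “thinness” lives in the software partitioning, not in the box on the desk.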

By the way, do you think “thin client” would be so appealing if they’d called the alternative a “thick client” or “muscular client”? And would it still be a “fat client” if it ran QNX or Linux instead of something flabby like Windows? I doubt it. This isn’t a dialogue. It’s name-calling.

FA #2: The NC and Java are the same thing: Well, of course not. A PC can run Java code just fine. Not only that, but you can upgrade a PC with a better just-in-time Java compiler, language revisions, bug fixes, and whatever else comes along.

FA #3: The PC has huge hidden costs: Debunking this one takes more than a one-paragraph tirade, so here’s a hint. Everything written on the subject is a cost analysis. It is not a cost/benefit analysis. It is not a cost comparison with alternatives. And it is not the only analysis that matters: a cost/benefit comparison with alternatives that provide equivalent functionality.

FA #4: Java is portable: Wait a few years. You’ll have new models of the NC with enhanced functionality. You’ll have proprietary extensions. You’ll have competing, incompatible NC architectures. Suppliers will differentiate themselves from one another, because if they don’t, they’re selling commodities, and commodities have razor-thin margins.

FA #5: The NC has no hidden costs: Once you buy into NCs you’ll need more bandwidth, bigger servers, and more sophisticated network, server, and applications management.

Why do you think IS installs applications on PC hard disks instead of servers now? It’s the bandwidth needed to download DLLs from servers. Download a DLL, download a Java application, what’s the difference? Java doesn’t magically shrink the size and complexity of a word processor or spreadsheet.
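Here’s a back-of-the-envelope calculation – every figure in it is an assumption, not a measurement. Suppose a 5MB application image and 200 clients pulling it down at 9 a.m. over a shared 10Mbps LAN:

    // Napkin math: best-case time to push one application image to a
    // floor of clients over shared Ethernet. All figures are assumed.
    public class BandwidthNapkin {
        public static void main(String[] args) {
            double appMegabytes = 5.0;       // assumed size of one application
            int clients = 200;               // assumed desktops starting work at once
            double linkMegabitsPerSec = 10;  // shared 10Base-T, zero overhead (generous)
            double totalMegabits = appMegabytes * 8 * clients;
            double seconds = totalMegabits / linkMegabitsPerSec;
            System.out.printf("Best case: %.0f seconds (about %.0f minutes) of saturated LAN%n",
                    seconds, seconds / 60.0);
        }
    }

That’s more than 13 minutes of a saturated segment every morning, under generous zero-overhead assumptions. Real networks, with protocol overhead and contention, do considerably worse.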

And since the NC is simply a paperweight when the network, server, or application crashes, you need better management of all three. Clue: who wants to sell you the servers and management software?

Here’s why the NC won’t succeed. People bought PCs specifically to gain independence from control-oriented IS organizations. The PC freed employees to do what they chose, not what IS decided they should do.

The claimed benefit of the NC is its greatest hidden cost: with it, IS morphs back into the god of software, arrogantly dictating what end-users can and can’t use. If the NC succeeds, end-users will simply re-invent the disconnected PC or the independent departmental LAN, so they can do what they want instead of what IS lets them do.

Technology doesn’t only follow business requirements. Sometimes it creates business opportunities, or leads to unexpected business results.

Adoption of personal computer technology in the 1980s led inevitably to employee empowerment. Look at the arguments in favor of the NC and you’ll discover a thinly veiled attempt to disenfranchise employees again.

As is true so often when you peel the onion a bit, it’s enough to make you cry.