In what I laughingly call my spare time I teach a graduate course in computer communications at St. Thomas University. Since I’ve been trying to make sense of Virtual Local Area Networks (VLANs), the class got to write a term paper on the subject. I asked everyone to:

1. Provide a clear and coherent definition of the term VLAN.

2. Describe the situations VLANs have been designed to address, contrasting VLANs with the alternatives.

3. Analyze the claimed benefits of VLANs.

So … A VLAN is an administratively defined rather than physically defined LAN subnet. With VLANs you can put everyone in a workgroup on the same virtual segment regardless of their physical location. This is supposed to reduce administrative overhead.
It’s also supposed to reduce network traffic. How? Workgroup members communicate with each other more than with anyone else; VLANs keep workgroup traffic within one virtual segment. Also, instead of broadcast traffic propagating throughout the LAN, it gets restricted to “broadcast domains” that correspond to the same virtual segments. (If you’re not a network weenie: while most network packets go between specific nodes on a network, some have a destination address of “everyone” – the sender “broadcasts” them.)
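If you like your explanations in code, the broadcast-domain idea can be sketched in a few lines. This is a toy illustration, not how any real switch is programmed; the port-to-VLAN assignments and function name are invented for the example.

```python
# Toy sketch of VLAN broadcast containment (all names invented).
# A VLAN-aware switch assigns each port to a virtual segment; a
# broadcast frame is flooded only to ports in the sender's VLAN.

vlan_of_port = {1: "Engineering", 2: "Engineering",
                3: "Accounting", 4: "Accounting"}

def broadcast_ports(ingress_port):
    """Ports a broadcast frame goes out: every port in the sender's
    VLAN except the one it arrived on."""
    vlan = vlan_of_port[ingress_port]
    return [p for p, v in vlan_of_port.items()
            if v == vlan and p != ingress_port]
```

A broadcast arriving on port 1 reaches only port 2 – the Accounting ports never see it, which is the whole traffic-reduction argument in a nutshell.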

There’s more, but when you’re in IS management you’re not supposed to understand technology in great depth. You’re supposed to understand its nature, purpose, and fit with your organization’s business needs.

You’re also supposed to have a high-quality BS Detector set for maximum scan, since our industry has the highest BS/Customer ratio of any profession. (BS, if you’re not familiar with the term, is short for “BuShwah”.)

I’ve graded 34 graduate papers. I’m concerned VLANs are mostly bushwah, because the problems they’re designed to solve may not be important problems in the first place.

Let’s take assigning workstations to logical workgroups. Most organizations still co-locate workgroups, so a VLAN virtual segment and the network’s physical segmentation would largely coincide. No big VLAN benefit there.

More important, most traffic goes from workstation to server and back, not from workstation to workstation. With everything attached to switching hubs (and VLANs require the use of switching hubs) workstations should see only their own packets even without VLANs, except for those pesky broadcast packets.

How much LAN traffic comes from broadcasts? I can say with complete confidence I have absolutely no idea. I do know this: a lot of broadcast traffic comes from older protocols like Novell’s SAP and RIP (Service Advertising Protocol and Routing Information Protocol). Novell, though, has replaced SAP and RIP with NLSP (NetWare Link Services Protocol) which dramatically reduces broadcast traffic – a good idea, and one that further reduces the value of VLANs.

How about the reduced network administration from moves, adds and changes? I’m completely baffled here. Network bridges and switches automatically learn the location of every station’s address on the network. Move a station and they learn the new location without intervention.
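Here’s that self-learning behavior as a sketch, if it helps. The class and field names are made up for illustration; real bridges implement this in silicon, with table aging and flooding logic this toy omits.

```python
# Sketch of transparent-bridge address learning (names invented).
# The switch watches the source address of every frame and records
# which port it arrived on. Move a station, and its very next
# transmission updates the table -- no administrator involved.

class LearningSwitch:
    def __init__(self):
        self.table = {}  # station address -> last-seen port

    def receive(self, src_addr, ingress_port):
        # Learn (or re-learn) where this station lives.
        self.table[src_addr] = ingress_port

    def forward_port(self, dst_addr):
        # Known destination: one port. Unknown: None (a real
        # bridge would flood the frame to all ports instead).
        return self.table.get(dst_addr)
```

Plug a station into port 1 and the switch learns it; move it to port 5 and the switch re-learns it the moment it talks. Contrast that with the manual VLAN assignment described next.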

With VLANs, you get to assign each workstation to a virtual segment manually. Sounds like more work, not less, especially since you already have to define workgroups in your network directory service.

Lots of very smart people believe in the value of VLANs, though, and that makes me wonder what I’m missing. So I’m going to do what any good manager should do: ask an expert to do the hard work, after which I plan to take the credit.

Nick Petreley (you’ll see his smiling face a few pages from here) knows everything there is to know about technology. He and Charlotte Ziems, InfoWorld’s Test Center Director, want to do solutions-based testing anyway. I think VLANs would be a great solution to test.

I’m feeling pretty crabby over the excitement generated by network computers. P.T. Barnum could probably explain it, except that he’s dead. He knew every passing minute results in the birth of one more sucker.

It all comes down to bad cost accounting and dumb measures. PCs cost more to support than more traditional architectures, we’re told: three out of five dollars goes to support costs. We’re also told client/server systems cost three times more to develop than traditional systems.

It may even be true, although as previous columns have pointed out, the chance of these comparisons having any meaning is pretty small. Even if it is true, it’s irrelevant.

One reason: companies that focus on cost are doomed. Companies that spend their energy reducing costs forget why they spend that money … to retain and attract customers.

Paul Ehrlich, writing about species extinction, came up with the metaphor of the rivet-popper, who sat on the wing of an airplane removing rivets. When a passenger complained, the rivet-popper explained the plane didn’t need that rivet anyway. The proof? The wing is still attached, isn’t it?

Cutting unnecessary costs is a Good Thing (GT to use the acronym). Far too many companies forget the keyword “unnecessary” and figure cutting costs is a GT. It’s not: most costs are investments in customer loyalty and acquisition.

The point: companies need to focus on value, not cost. Does anyone seriously think network computers will provide as much value on the desktop as a full-fledged PC?

And now, a reality check. Network computers will run software downloaded from servers. They’ll be completely compatible across all manufacturers, running the same code with no configuration problems, driver idiosyncrasies, or other technology-generated headaches.

I sure believe this. My 20 years of experience in this industry lead me to believe vendors work together in harmony to produce standards designed to maximize customer value, then resist the temptation to create proprietary extensions or non-standard alternatives. Don’t you?

No matter how much grass we smoke, it’s always greener under someone else’s bed.

Let’s put our collective experience with real-world vendors aside, though, and pretend these gadgets really will work as advertised. Our support headaches will evaporate overnight.

So when someone in Accounting creates a critical spreadsheet and saves it before going home in the evening (on the server, since there’s no local storage) IS can be assured the spreadsheet program will run without trouble.

And when that accountant needs to do a bit of work at home because she’s a single Mom and has to put the kids to bed, but she’s on deadline for the next morning, she just pulls the spreadsheet up on her home computer and …

Oops! It runs from the network only. Okay, let’s imagine this technology allows for remotely accessing the corporate network from home. So she dials in and downloads the spreadsheet, so she can work on it on … hold it. She has Excel on her home computer. She bought it herself, of course, because her employer uses network computers. So she tries to load the spreadsheet into Excel and …

Hold it again. Do you think the network computer runs a spreadsheet that’s 100% compatible with Excel? I believe this just as much as I believe WordPerfect and MS Word exchange files while perfectly maintaining all formatting.

Now how about that mobile sales force? Are you going to outfit every one of those guys with a laptop network computer and 10 Mbps wireless network connection?

You think your support costs are high now? Start thinking about a mixed architecture, with network computers on the desktop and standard PCs everywhere else. You haven’t even begun to buy Excedrin in bulk.

So before we all get too excited about network computers, let’s do some serious thinking about how people use personal computers now, and figure out how they’ll react when we tell them they won’t be able to do that stuff anymore.