Industry Commentary

The re-birth of client/server computing


The next big trend in information technology is client/server computing, only nobody seems to admit it.

History first:

In the early 1990s, client/server was the Next Big Thing in IT. In its earliest form it partitioned applications into database management — the server part — and everything else, which was the client part.

It worked pretty well, too, except for a few glitches, like:

  • Desktop operating systems weren’t ready: These were the days of DOS-based Windows. NT was just emerging as the next generation, with OS/2 alongside it as IT’s we-sure-wish-it-had-a-chance alternative. Client/server computing meant PCs couldn’t just be platforms for enhancing the effectiveness of workgroups and individual employees anymore. They had to be production-grade platforms.
  • Microsoft didn’t respect its DLLs: The phrase was “DLL hell.” What it meant was that Microsoft issued patches that changed the behavior of DLLs in ways that broke applications that relied on them.

Including client/server applications … a headache IT professionals found seriously annoying, and for good reason.

  • Servers proliferated: Client/server partitioned database management from everything else. Soon, IT theoreticians figured out the benefits of further partitioning. The client part of client/server became the presentation layer; the integration logic partition spawned the whole Enterprise Application Integration marketplace; and moving work from one place to another led to workflow systems and then “business process management” (a new name for the same old thing — neither the first nor last time that’s happened in IT).

What was left were the various algorithms and business case handling that constitute core business logic, which either ran on what we ended up calling “app servers” or as stored procedures in the database.

Which in turn meant your average business application needed three or four separate servers plus the desktop. Client/server started out as a simpler alternative to mainframe computing, but it became darned complicated pretty quickly.

As IT’s acceptance of the PC had never been more than grudging, a standard narrative quickly permeated the discussion: The problem with client/server was the need to deploy software to the desktop.

It was the dreaded fat client, and, fat being a bad thing, the UI was moved to the browser, while presentation logic moved to yet another server. The world was safe for IT, if clunky for computer users, who had become accustomed to richly functional, snappily performing “fat” interfaces.

To help them out, browsers became “richer,” the exact same thing except that (1) “rich” is good while “fat” is bad; and (2) nobody had to admit they’d been wrong about anything along the way.

So where are we now? Desktop operating systems are more than robust enough to support production-grade software, Microsoft now respects its DLLs, and we have excellent tools for pushing software to PCs. The original rationale for browser-based computing is pretty much a historical curiosity.

A new rationale arose to take its place, though: Browser-based apps let us develop once and run anywhere. It was a lovely theory, still espoused everywhere except those places that actually deploy such things. Those who have to develop browser-based apps know just how interesting software quality assurance becomes when it requires a lab that’s chock full o’browsers … the bare minimum is three versions each of Internet Explorer, Firefox, Chrome, and Safari, each running on at least three versions of every operating system they’re written for, tested on at least three different screen resolutions.
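The combinatorics behind that QA lab add up quickly. A minimal sketch of the arithmetic, using the article’s illustrative bare-minimum counts (not a real test plan):

```python
# The article's bare-minimum browser-compatibility matrix:
# 3 versions each of 4 browsers, each on 3 OS versions,
# each tested at 3 screen resolutions.
browsers = ["Internet Explorer", "Firefox", "Chrome", "Safari"]
versions_per_browser = 3
os_versions = 3
resolutions = 3

configs = len(browsers) * versions_per_browser * os_versions * resolutions
print(configs)  # 4 * 3 * 3 * 3 = 108 distinct configurations
```

One hundred eight configurations before the first line of application logic gets tested — which is the point about “develop once, run anywhere.”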

And now we have tablets, just in time to save the day, because on tablets, browser-based interfaces are rapidly being supplanted by (drum-roll, please) … that’s right, client/server apps.

Oh, that isn’t what they’re called. But in, for example, Apple’s App Store, you’ll find plenty of companies that generate content consumed over the Internet, engage in eCommerce, or both, and that offer free iPad apps providing a slicker user interface to the same functionality as their websites.

That’s right: The presentation logic is deployed to the iPad or Android tablet as an App; the rest executes on corporate servers. Sounds like n-tier client/server to me.
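In miniature, that partition — presentation logic on the device, everything else behind an HTTP API on corporate servers — looks something like the following sketch. The endpoint and payload here are hypothetical, chosen only to show the shape of the split:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Server tier: core business logic and data live here, not on the client.
class OrderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Hypothetical "order status" endpoint standing in for the
        # app-server / database tiers.
        body = json.dumps({"order": 42, "status": "shipped"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), OrderHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client tier: only presentation logic -- fetch the data and render it.
port = server.server_address[1]
data = json.loads(urlopen(f"http://127.0.0.1:{port}/orders/42").read())
print(f"Order {data['order']}: {data['status']}")
server.shutdown()
```

Swap the `print` for a native tablet UI and you have exactly the n-tier client/server arrangement the app stores are full of.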

If you aren’t already deploying custom tablet apps as rich, tailored front-ends to your existing Web-available functionality, you probably have such things on the drawing board. And once you’re back in this business, you might as well move away from browser-based deployment to custom desktop/laptop front-ends as well.

Is it more work? Yes, it is. So here’s a research project tailor-made for a graduate thesis: Compare how long it takes employees to perform a series of standard tasks on browser-based user interfaces with the time needed using customized clients. My unencumbered-by-any-facts guess is that the custom clients would win, and win by a big enough margin to cover the spread.

Call it what you like, it’s client/server reborn.

Comments (5)

  • I love corporate IT!

    I worked in it for nearly 30 years and one thing that never, ever, changed was the desire to move back to the glass-walled data centre days, or the battle for control. No ‘new’ ideas, just chasing the users’ demands while always fighting for control. I expect security to become the next big thing for tablets and smartphones… have to protect the data/apps… and so the cycle will continue to repeat, as you so clearly explained. CIT never seems to get that it’s not about control but access. It’s about standards, not proprietary ‘innovations’. Give up the search for control of apps and data use and instead protect the resources of data and business logic/processes. Use your influence to force things to become more standard, write only to those standards, and support only companies who also support them. This simplifies your job so that only screen size and resolution are variables for design. Enough with the repeating system architecture design cycle.

    Give CIT their glass house back, and let the renegades run with the data. That is, after all, when the most interesting things happened the first time.

  • Yo Master Lewis,

    I agree with almost all of your points — except your final conclusion: that custom application front-ends are as necessary and justified for PCs as they are for tablets/mobile devices.

    – Your history of the rise, fall, and rebirth of client-server: An insightful and concise synopsis.

    – Your analogy between “fat” client apps and custom tablet apps: Makes perfect sense.

    – Your puncturing the “develop once, run anywhere” myth about web publishing: Well deserved iconoclasm.

    – Your suggestion that custom user interfaces serve end users better than browser-based interfaces: Quite a reasonable conjecture.

    – Your comic tone about the software industry’s euphemisms, “buzzword creativity”, amnesia, and denial of past errors: Completely justified by the industry’s laughable habits.

    But I don’t quite buy the final step in your argument. Several of the reasons for the move to custom apps on tablets and mobile devices don’t apply as strongly to desktops.

    – Most (almost all) browsers were originally designed for desktop use, so their limitations on desktops aren’t as crippling as they are on mobile devices.

    – Most web sites were designed for viewing with desktop browsers, so they don’t suck as badly on desktops as they do on devices with smaller, coarser screens.

    – Desktop computers generally have faster internet connections than most mobile devices do; and one of the motivations for custom mobile apps is to (try to) compensate for the thinner pipes.

    – Thankfully, there are indeed better ways now to publish software updates to PCs; but they still aren’t as easy and cheap for the software buyer as browsing to an updated web site. And while developing and testing for browser compatibility are expensive headaches, they are the app developer’s headaches, not the buyer’s.

    Of course, some desktop user interfaces would benefit enough from the switch from browsers to custom apps to justify the investment. I hope your column provokes enough thought to encourage those investments.

    Overall a good column; thanks.

  • Perhaps reborn, but with the lessons of making the web scale firmly in hand. So I see it as an amalgam of choosing the best solution for the target audience, and the capabilities of the IT and development groups involved. Sometimes a web client is best; sometimes a “rich” web client; sometimes a client application (using http, web services, or other patterns that are known to scale); sometimes a combination.


  • Interesting piece, I hadn’t thought too much about tablet apps as the new client/server.

    But now some comments from a guy who has to make this kind of stuff all work. I’ll start by using three of my clients as a reference – they are typical of what we run into these days.

    There are two main platforms — Apple and Android (and, I suppose, BlackBerry). If you do an app, you need to do both. The cost just went way up.

    Developing an interface for a tablet can be faster, but if you have front-end Web developers who know their craft, the tablet vs. web app development advantage slips away. Plus with tablet apps, I (again) have to worry about deployment, app version proliferation, and backward compatibility.

    Because of Firefox’s and Chrome’s auto-update feature, IE is the only browser that we’ve really had to regression test and we have third-party apps that make that task easy and efficient. Does it take extra time? Absolutely.

    My clients won’t spend the money on tablet apps. Short-sighted? Maybe. They argue (and we often agree) that not enough of their target audiences have tablets. For example, people over 50, external dealer networks, and sales reps (notoriously behind the tech curve). Clients are usually not in any position to dictate tablet use to their independent dealer and client networks.

    It’s not one or the other (tablet vs. web). Right now it HAS to be both and that increases development costs which no client (yet) has chosen to pay.

    I can create web apps that look great on a tablet AND a desktop using Responsive Design (CSS) techniques. Yes, it’s not the same thing but right now paying clients don’t care.

    I see the emerging possibility that tablet apps could replace a lot of web-based apps, but we’re not there yet. If you can control your environment, tablet apps are a good alternative. An example would be creating a tablet app for your own sales reps. If you can get them all on tablets (preferably the same one), you’d have a winning solution (sales reps are suckers for anything with a high gee-whiz-bang factor :-))

  • Right on the mark. I think the proliferation of App Stores on legacy operating systems is only going to accelerate this trend as time goes by.
