The next big trend in information technology is client/server computing, only nobody seems to admit it.

History first:

In the early 1990s, client/server was the Next Big Thing in IT. In its earliest form it partitioned applications into database management — the server part — and everything else, which was the client part.

It worked pretty well, too, except for a few glitches, like:

  • Desktop operating systems weren’t ready: These were the days of DOS-based Windows. NT was just emerging as the next generation, with OS/2 alongside it as IT’s we-sure-wish-it-had-a-chance alternative. Client/server computing meant PCs couldn’t just be platforms for enhancing the effectiveness of workgroups and individual employees anymore. They had to be production-grade platforms.
  • Microsoft didn’t respect its DLLs: The phrase was “DLL hell.” What it meant was that Microsoft issued patches that changed the behavior of DLLs in ways that broke applications that relied on them.

Including client/server applications … a headache IT professionals found seriously annoying, and for good reason.

  • Servers proliferated: Client/server partitioned database management from everything else. Soon, IT theoreticians figured out the benefits of further partitioning. The client part of client/server became the presentation layer; the integration logic partition spawned the whole Enterprise Application Integration marketplace; and moving work from one place to another led to workflow systems and then “business process management” (a new name for the same old thing — neither the first nor last time that’s happened in IT).

What was left were the various algorithms and business case handling that constitute core business logic, which either ran on what we ended up calling “app servers” or as stored procedures in the database.

Which in turn meant your average business application needed three or four separate servers plus the desktop. Client/server started out as a simpler alternative to mainframe computing, but it became darned complicated pretty quickly.

As IT’s acceptance of the PC had never been more than grudging, a standard narrative quickly permeated the discussion: The problem with client/server was the need to deploy software to the desktop.

It was the dreaded fat client, and, fat being a bad thing, the UI was moved to the browser, while presentation logic moved to yet another server. The world was safe for IT, if clunky for computer users, who had become accustomed to richly functional, snappily performing “fat” interfaces.

To help them out, browsers became “richer,” the exact same thing except that (1) “rich” is good while “fat” is bad; and (2) nobody had to admit they’d been wrong about anything along the way.

So where are we now? Desktop operating systems are more than robust enough to support production-grade software, Microsoft now respects its DLLs, and we have excellent tools for pushing software to PCs. The original rationale for browser-based computing is pretty much a historical curiosity.

A new rationale arose to take its place, though: Browser-based apps let us develop once and run anywhere. It was a lovely theory, still espoused everywhere except in those places that actually deploy such things. Those who have to develop browser-based apps know just how interesting software quality assurance becomes when it requires a lab that’s chock full o’browsers … the bare minimum is three versions each of Internet Explorer, Firefox, Chrome, and Safari, each running on at least three versions of every operating system they’re written for, tested on at least three different screen resolutions.

And now we have tablets, just in time to save the day, because on tablets, browser-based interfaces are rapidly being supplanted by (drum-roll, please) … that’s right, client/server apps.

Oh, that isn’t what they’re called. But look in Apple’s App Store, for example, and you’ll find plenty of companies that generate content consumed over the Internet, engage in eCommerce, or both, offering free iPad apps that put a slicker user interface on the same functionality as their websites.

That’s right: The presentation logic is deployed to the iPad or Android tablet as an App; the rest executes on corporate servers. Sounds like n-tier client/server to me.
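For the sake of concreteness, here’s a minimal sketch of what that split can look like on the client side, assuming a hypothetical corporate endpoint (https://api.example.com/orders) and a made-up Order record; the tablet app’s code does nothing but fetch and present data, while every business rule stays behind the server’s API:

    import Foundation

    // Hypothetical record shape returned by the company's existing web API.
    struct Order: Decodable {
        let id: Int
        let status: String
    }

    // Presentation tier only: the app fetches ready-made results over HTTPS.
    // Validation, pricing, workflow, and the rest of the business logic all
    // run on the corporate servers behind this (assumed) JSON endpoint.
    func fetchOrders() async throws -> [Order] {
        let url = URL(string: "https://api.example.com/orders")!
        let (data, _) = try await URLSession.shared.data(from: url)
        return try JSONDecoder().decode([Order].self, from: data)
    }

The same sketch applies, with different plumbing, to an Android or desktop front-end: swap the HTTP client, keep the division of labor.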

If you aren’t already deploying custom tablet apps as rich, tailored front-ends to your existing Web-available functionality, you probably have such things on the drawing board. And once you’re back in this business, you might as well move away from browser-based deployment and toward custom desktop/laptop front-ends as well.

Is it more work? Yes, it is. So here’s a research project that’s tailor-made for someone’s graduate thesis: Compare how long it takes employees to perform a series of standard tasks through browser-based user interfaces with the time needed using customized clients. My unencumbered-by-any-facts guess is that the custom clients would win, and win by a big enough margin to cover the spread.

Call it what you like, it’s client/server reborn.