HomeCloud

Lost and found


Tell me how this makes sense.

Sitting on every corporate desktop is more computing power than existed in the average 1980 data center.

We’re going to use that computing power to run a browser and nothing but a browser. Through a complex combination of networking, virtualization, elaborate security provisions, and even more elaborate management and invoicing systems, the browser will deliver an office suite that runs on a server located in someone else’s gimongous data center.

Through the magic of grid computing, though … the same core technology used to create the Cloud’s highly elastic computing infrastructure … end-users can donate their increasingly unused desktop CPU cycles to the Search for Extraterrestrial Intelligence (SETI).

Why would you want to pay for CPU cycles that lie outside your firewall when you have an enormous supply of them going to waste inside it? Exploiting them wouldn’t even require sophisticated new technologies. DLL Hell is a distant memory. SOA provides location independence and run-time binding. Automated version detection and updating technologies are routine.
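If you want a feel for how little machinery this takes, here's a minimal sketch of a cycle-donating desktop agent, SETI@home style. Everything in it is mine, not anyone's shipping product: the in-process queue stands in for a network coordinator, and the primality test stands in for a real work unit's compute kernel.

```python
import queue
import time

# Stand-ins for the coordinator's work queue and result-upload channel.
# A real agent would fetch units over the network and run only when idle.
work_units = queue.Queue()
results = queue.Queue()

for n in range(100_000, 100_020):  # toy work units: numbers to primality-test
    work_units.put(n)

def is_prime(n: int) -> bool:
    """Stand-in for a real work unit's compute kernel."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def run_agent() -> None:
    """Drain the queue, yielding politely between units."""
    while True:
        try:
            unit = work_units.get_nowait()
        except queue.Empty:
            break  # no work left; a real agent would sleep and re-poll
        results.put((unit, is_prime(unit)))
        time.sleep(0)  # yield the CPU; a real agent would also check machine idleness

run_agent()
while not results.empty():
    unit, prime = results.get()
    if prime:
        print(f"found one: {unit}")
```

That's the whole trick: a queue of work units, a compute kernel, and an agent polite enough to give the CPU back when the end-user wants it.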

So why not put the database server and storage systems in the data center, managing business logic centrally but executing it (except for batch runs, of course) on the desktop?
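As a sketch of what "manage centrally, execute locally" might look like, here's a hypothetical toy. The central registry, the bind() helper, and the invoice_rules module are all my inventions, stubbed in-process; in practice the registry and the code it serves would live in the data center, alongside the database the desktop reads and writes.

```python
import types

# Hypothetical central registry: the data center owns the business logic
# and its version; desktops bind to whatever is current at run time.
CENTRAL_REGISTRY = {
    "invoice_rules": {
        "version": "2.1",
        "source": (
            "def total(lines):\n"
            "    # v2.1 rule: 5% discount on subtotals over 1000\n"
            "    subtotal = sum(qty * price for qty, price in lines)\n"
            "    return subtotal * 0.95 if subtotal > 1000 else subtotal\n"
        ),
    }
}

_loaded: dict[str, tuple[str, types.ModuleType]] = {}  # local cache: name -> (version, module)

def bind(name: str) -> types.ModuleType:
    """Run-time binding with automated version detection:
    reload only when the central version has moved on."""
    entry = CENTRAL_REGISTRY[name]
    cached = _loaded.get(name)
    if cached and cached[0] == entry["version"]:
        return cached[1]
    module = types.ModuleType(name)
    exec(entry["source"], module.__dict__)  # compile the centrally managed logic
    _loaded[name] = (entry["version"], module)
    return module

# Desktop-side execution: rows that would come from the central database.
order_lines = [(3, 199.00), (10, 45.50)]
rules = bind("invoice_rules")
print(f"invoice total: {rules.total(order_lines):.2f}")
```

The point is the version check: the desktop always binds to whatever version is current in the data center, so deployment stays centralized even though execution doesn't.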

What am I missing?

Probably a lot. Large numbers of very smart people are investing a great deal of time, energy and capital into the Cloud, while nobody at all appears to be thinking about the value of desktop CPUs, other than a few organizations that want free computing power. What are the chances I’ve spotted something obvious that all these experts have missed?

The answer, sadly, is “Higher than they should be.” And while the economics and architecture of desktop CPU utilization are probably an academic matter so far as you’re concerned, understanding how this could happen matters to you quite a lot.

My personal policy is that when a very large number of inordinately smart people spend all day every day concentrating on a topic, I shouldn’t figure I’m so brilliant I can outthink them all in what I laughingly call my spare time. That doesn’t mean they’re right. It means the odds aren’t in my favor if I disagree with them. This is especially valid when dealing with professional, independently funded scientists.

Depressingly, many scientists on corporate payrolls have proven themselves willing to fudge their findings to favor their employers’ economic well-being. Money distorts decision-making in much the way gravity distorts space-time: it pulls thinking to where it is most concentrated.

When considering the popularity of the Cloud and the current lack of interest in using desktop computers to, for example, compute, this econo-gravitational effect might be at work.

Consider: In the mid-1980s, most of the industry’s creative energy was focused on the personal computer. A few PC software startups had made huge piles of money; venture capitalists wanted a share; the best and brightest among the developer community found employment in companies trying to join the bandwagon; and client/server computing was born.

A lot of the thinking behind client/server was a bit fuzzy, though, including what words like “client” and “server” were supposed to mean. Added to the fuzzy thinking were tools too primitive to easily and flexibly segregate and deploy business logic, integration logic, and user presentation logic. Layered onto this was Microsoft’s refusal to respect its own DLL definitions, which led to “DLL Hell” when IT organizations deployed new software and new versions to the desktop.

Which is why client/server computing got a bad name and the industry’s creative energy and capital investment shifted to server-side computing.

Then the World Wide Web happened and it all shifted again, to eCommerce.

And so on, ad infinitum, ad nauseam.

It’s entirely possible that sound engineering arguments lie behind the preference for moving computing cycles into the Cloud and paying for them there while ignoring the free ones sitting inside the firewall, directly in front of the end-users who need the results. It’s just as possible the arguments are entirely economic:

  • The big research firms are publishing their traditional hockey-stick-shaped growth projections, this time for the Cloud.
  • These growth projections become self-fulfilling prophecies (at least in the short term) as those responsible for business investment use them to decide where to place their bets.
  • Technical professionals, needing jobs, find them where the investments are being made.
  • And so the money trail, rather than engineering factors, pulls our industry into the Cloud, for no particular reason.

Save one: The secure feeling that comes from following the latest trend.