“Haven’t you read Amazon’s and Microsoft’s recent press releases on this?”

This was in response to a challenge to the “save money” argument for migrating applications to the public cloud.

I understand just as well as the next feller that press releases serve a valid purpose (what’s the feminine of “feller” anyway?). When a company has something important to announce, press releases are the more-than-140-characters explanation of what’s going on.

What press releases don’t do is distinguish facts (“We’re changing our pricing model”) from smoke (“You’ll save big money”). I say smoke because:

First and foremost, Fortune 500-size corporations that can’t negotiate pricing for servers and storage comparable to what Amazon and Microsoft pay for the gear they use to run AWS and Azure just aren’t trying very hard. They have access to the same technology management tools, practices, and talent, too.

Second: Smart companies are building their new applications using cloud-native architectures — SOA and microservices orientation; multitenancy; DevOps-friendly tool chains that automate everything other than actual coding, and so forth (“and so forth” being ManagementSpeak for “I’m pretty sure there’s more to know, but I don’t know it myself”).

But migrating to cloud-native architectures that are easily shifted to public or hybrid clouds is quite different from migrating applications designed for data-center deployment. And it’s the latter that are supposed to save all the money.

Sure, applications coded from non-SOA, non-microservices, non-multi-tenant designs can probably be recompiled in an IaaS environment. But once they’ve been recompiled they’ll probably need significant investment in performance engineering to keep them from being unacceptably sluggish.

Oh, one more thing: Moving an application to the cloud means stretching whatever technologies are used for application and data integration through the firewall and public network that now separates public-cloud-hosted applications from those that have yet to be migrated.

Based on my admittedly high-level-only understanding, not even all enterprise service buses can achieve high levels of performance when, instead of moving transactions around at wire or backplane speeds, they’re now limited to public networking bandwidths and latencies.
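The way per-call latency compounds is easy to see with a back-of-the-envelope calculation. The numbers below are illustrative assumptions (a chatty interface making 200 calls per transaction, a half-millisecond data-center round trip versus a 40-millisecond public-internet round trip), not measurements:

```python
# Back-of-the-envelope sketch: how per-call round-trip latency compounds
# when a chatty integration moves from the data center to a public network.
# All figures below are illustrative assumptions, not measurements.

def integration_overhead_ms(calls_per_transaction: int, round_trip_ms: float) -> float:
    """Total network latency one business transaction accumulates."""
    return calls_per_transaction * round_trip_ms

CALLS = 200          # assumed chatty interface: 200 calls per transaction
LAN_RTT_MS = 0.5     # assumed backplane/data-center round trip
WAN_RTT_MS = 40.0    # assumed public-internet round trip between clouds

lan_total = integration_overhead_ms(CALLS, LAN_RTT_MS)
wan_total = integration_overhead_ms(CALLS, WAN_RTT_MS)

print(f"In the data center: {lan_total:.0f} ms of latency per transaction")
print(f"Across the public network: {wan_total:.0f} ms per transaction")
```

Same transaction, same interface; the only change is where the wire runs, and the latency budget balloons by a factor of eighty.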

Complicating integration performance even more is the need to integrate applications hosted in multiple, geographically dispersed data centers, as would be the case when, for example, a company migrates CRM to Salesforce, internal development to Azure, and financials and other ERP applications to Oracle Cloud.

For many IT organizations, integration is enterprise architecture’s orphan stepchild. Lots of companies have yet to replace their bespoke interface tangle with any engineered interface architecture.

So lifting and shifting isn’t as simple as lifting and then shifting, any more than moving a house is as simple as jacking it up, putting it on a truck, and hauling it to the new address. Although integration might not be as fraught as the house now lying at the bottom of Lake Superior.

Which isn’t to say there’s no legitimate reason to migrate to the cloud. (Non-double-negative version: There are circumstances for which migrating applications to the cloud makes a great deal of sense.) Here are three circumstances I’m personally confident of, and I’d be delighted to hear of more:

> Startups and small entrepreneurships that lack the negotiating power to drive deep technology discounts, and that will benefit from needing a much smaller full-time and permanent IT workforce.

> Applications that have wide swings in workload, whether because of seasonal peaks, event-driven spikes, or other drivers, with the result that capacity must be rapidly added and shed.

> A mobile workforce or user base that needs access to the application in question from a large number of uncontrolled locations.

At least, this was the situation the last time I took a serious look at it.

But this isn’t a column about the cloud. It’s about the same subject as last week’s KJR: How to avoid making decisions based on belief, prejudice, and denial. The opening anecdote shows how easy it is to succumb to confirmation bias: If you want to believe, even vendor press releases count as evidence.

In that vein, here’s a question to ponder: Why is it that, after centuries of success for the scientific method, most people most of the time (including many scientists) operate so often from positions of high certainty and low evidence?

The answer is, I think, that uncertainty causes anxiety. And people don’t like feeling anxious.

But collecting and evaluating evidence is hard and often tedious work — not a particularly popular formula.

Isaac Asimov once started a Q&A session by saying, “I can answer any question, so long as you’ll accept ‘I don’t know’ as an answer.”

If Dr. Asimov was comfortable not knowing stuff, the rest of us should be at least as comfortable.

I think.

In the beginning there was dBase II.

Yes, II. There was no dBase I, and shortly after dBase IV there was 0, as superior products eclipsed this, the original end-user app dev tool.

Fast forward thirty years to the present and it appears the entire EUC (end-user computing) category is failing. This makes no sense.

No, it isn’t extinct yet. There is, for example, the venerable Microsoft Access, although anyone who thinks Microsoft is giving it much attention isn’t paying much attention. If Microsoft had any interest in the product, it long ago would have become a highly publicized Azure development environment.

At least it’s economical: $110 buys you a license.

There’s QuickBase. I know little about the product other than that from a features and functionality perspective it looks promising. And it’s cloud-based. But it costs a user-unfriendly $180 per client per year.

Also, an alarm bell: Intuit recently sold QuickBase off to a private equity firm. For the most part private equity firms buy companies, starve the P&L of investments, and flip the company before revenues crash.

Draw your own conclusions.

Apple’s FileMaker Pro is reportedly a strong product, as it should be for $330 per user license. There’s also a cloud version, priced at, as one reseller, amusingly puts it, “from $1.63/day.” Let’s see … carry the 1 … that’s $595 per year, per user. I thought the cloud was supposed to be cheap.

These are three of the more prominent EUC products. Like I say, this makes no sense, given what we’re hearing from the trend-meisters: (1) Everything is moving to the cloud; and (2) IT is going the way of the dodo: Infrastructure is leaving the data center in favor of the cloud, while app dev is leaving the IT organization to become shadow IT, embedded in the business and out of control.

If shadow IT in the cloud is supposed to be a Next Big Thing, why aren’t the big cloud players — in particular Microsoft, Amazon, and Google — fielding products to cash in on the trend?

What’s particularly strange about this situation is that we are, for the first time, in a position to field application development environments that truly could make business managers independent of IT — that could take care of just about every detail of application design.

It’s now technologically possible to create:

  • Wizards that provide a dialog that results in a normalized data design (I’m old-fashioned) — one that makes use of IT’s APIs to provide meaningful integration and avoid the creation of duplicate data fields.
  • Automated form generation for PCs, tablets, and smartphones that flows naturally from the data design.
  • Visual workflow design tools, so systems can let users know there are forms to be opened and work to be done.
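To make the second bullet concrete, here’s a minimal sketch of form generation driven directly by a normalized data design. The schema, field types, and widget names are hypothetical stand-ins for what such a wizard might produce:

```python
# Hypothetical sketch: deriving form widgets directly from a normalized
# data design, so forms "flow naturally" from the schema. The schema and
# widget names are illustrative assumptions, not any real product's API.

CUSTOMER_SCHEMA = {
    "customer_id": {"type": "int", "label": "Customer ID", "readonly": True},
    "name":        {"type": "str", "label": "Name"},
    "signup_date": {"type": "date", "label": "Signup Date"},
}

WIDGETS = {"int": "number-input", "str": "text-input", "date": "date-picker"}

def generate_form(schema: dict) -> list:
    """Turn each field in the data design into a form-widget description."""
    form = []
    for field, spec in schema.items():
        form.append({
            "field": field,
            "label": spec["label"],
            "widget": WIDGETS[spec["type"]],
            "editable": not spec.get("readonly", False),
        })
    return form

for widget in generate_form(CUSTOMER_SCHEMA):
    print(widget)
```

The point isn’t the twenty lines of Python; it’s that once the data design is machine-readable, the forms are a byproduct rather than a separate development effort.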

IT won’t be irrelevant in this new shadow/cloud universe we’re imagineering. But it probably does need to recognize the need to get out of the app dev business and into the integration business.

So far, this is just me grousing about the sorry state of the world — less an occupational hazard than a chronological one, but a hazard nonetheless.

What’s in it for you as an IT leader?

First and foremost, take integration seriously. It’s mostly a matter of solving a problem once instead of over and over again.

The key: Especially for IT shops that mostly license COTS and SaaS software and integrate it rather than building their own, build an architecture that makes systems of record and sources of truth separate and distinct.

Systems of record are maintained and managed by IT, which keeps track of which system is the central repository of what information and which systems have to be kept synchronized to the central repository.

Sources of truth are SOAP- or REST-based APIs. When shadow IT efforts … and for that matter, formal IT efforts … need to retrieve or update information from the company’s official databases, they consult the sources of truth, not the underlying systems of record.
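The separation can be sketched in a few lines of code. Everything here is hypothetical, and a real source of truth would be a SOAP or REST endpoint rather than a Python class, but the shape of the idea is the same: consumers talk to the facade, never to the repository behind it:

```python
# Hedged sketch of the system-of-record / source-of-truth split.
# All class and field names are hypothetical illustrations.

class CustomerSystemOfRecord:
    """Stand-in for the repository IT designates as authoritative."""
    def __init__(self):
        self._rows = {42: {"name": "Acme Corp", "status": "active"}}

    def fetch(self, customer_id):
        return self._rows[customer_id]

    def update(self, customer_id, changes):
        self._rows[customer_id].update(changes)

class CustomerSourceOfTruth:
    """The API layer shadow-IT and formal-IT efforts consult. In practice
    this would be a SOAP or REST service; here it's a class for brevity."""
    def __init__(self, system_of_record):
        self._sor = system_of_record

    def get_customer(self, customer_id):
        # Callers get the official answer without knowing which system holds it.
        return dict(self._sor.fetch(customer_id))

    def update_customer(self, customer_id, changes):
        # Updates flow through the facade, so synchronization stays IT's problem.
        self._sor.update(customer_id, changes)
        return self.get_customer(customer_id)

api = CustomerSourceOfTruth(CustomerSystemOfRecord())
print(api.get_customer(42))
print(api.update_customer(42, {"status": "lapsed"}))
```

If IT later re-platforms the system of record (to the cloud or anywhere else), only the facade changes; its consumers, shadow and official alike, don’t have to know.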

Next: if you want to do everyone a favor and not force them to make Excel perform unnatural acts, settle on a suitable end-user computing tool in spite of the state of the market, connect it to your APIs, and actively promote its use, both inside and outside IT.

You’ll be amazed at just how much more automation your company achieves, and how much more satisfying the results are as well.