
Carr-ied away


Nicholas Carr has a new theory — that internal IT is reaching the end of the line, because information technology will follow the same commoditization curve that electrical utilities followed a century ago.

Okay, it isn’t really a new theory, although from the attention Carr is getting for his new book, which should have been titled The Joys of Griddish but instead is called The Big Switch: Rewiring the World, from Edison to Google (W. W. Norton, 2008), you’d think he had come up with it all by himself.

Carr, you’ll recall, previously theorized that IT doesn’t matter (“We ain’t there quite yet,” Keep the Joint Running, 6/16/2003). His reasoning: Every business has access to the same information technology, so IT can’t provide a sustainable strategic advantage.

That his old theory was fatally flawed is easily demonstrated: Every business has access to the same everything as every other business — the same technology, ideas, people, processes, capital, real estate, and silly articles published in the Harvard Business Review because their authors were once on its editorial staff.

Were we to accept Carr’s past logic and apply it equally to all subjects, we would despairingly conclude that nothing matters. Now, not content with turning us all into depressed nihilists, Carr has discovered (and we should be pleased for him) the Internet and the possibility of outsourcing all of the computing cycles of every business to it.

What Carr has visionarily discovered, while tossing in terms like grid and utility computing to prove he is Fully Buzzword Compliant, is IT infrastructure outsourcing, a mere three decades after it began. Meanwhile, many very large corporations that outsourced their IT infrastructure have found that economies of scale reach a point of diminishing returns — enterprises reach a size where running their own data center costs less and provides more value than contracting with an outsourcer.

But never mind this little quibble. After all, many businesses aren’t that big, and data center outsourcing does make sense for them. It’s nothing new, though, and it changes nothing essential: it’s business as usual right now, and companies still need an IT organization, because …

Applications and the information they process are where the IT rubber meets the business road. Computer programs are not indistinguishable from one another. The information in the data repositories they control is unique, valuable, and (assuming corporations are careful about information security) private.

Carr hasn’t entirely ignored this reality in “his” theory of utility computing. He merely waves it off as trivial — something easily solved through a combination of Software as a Service (SaaS, which if you’ve been asleep for a while means hosted solutions) and … here’s an exact quote … “the ability to write your own code but use utility suppliers to run it and to store it. Companies will continue to have the ability to write proprietary code and to run it as a service to get the efficiencies and the flexibility it provides.”

With unparalleled perspicuity, Carr has figured out that companies can write their own code and then run it in an outsourced data center. Hokey smokes!

Carr’s New Insight is that responsibility for applications will move “into the business,” which is why IT will eventually go away. He endorses the notion that businesses can easily integrate disparate SaaS-provided applications and databases across the Internet using a few easy-to-use interfaces.

What nonsense. Most internal IT organizations long ago changed their focus. They seldom develop software from scratch anymore. Mostly they configure and integrate purchased applications.

Nothing about this is easy. Integrating multiple applications and their databases takes complex engineering, not facile hand-waving. Moving responsibility “into the business” means nothing more than managing multiple, smaller, poorly cooperating IT departments instead of single, larger centralized ones. Ho hum.

Nor can integrating multiple SaaS systems work in a high-volume production environment. That’s because of a concept network engineers understand but self-appointed “experts” do not: latency.

Imagine a financial services company. Customer management is SaaS in California. Loan operations is SaaS in Massachusetts. You have to update 10 million customer accounts every day with interest computations. The minimum latency imposed by the laws of physics on an ordinary two-table join adds more than 45 hours to this little batch run.
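If you want to check that arithmetic yourself, the sketch below runs the back-of-the-envelope numbers. It is illustrative only: the 4,300-kilometer California-to-Massachusetts distance is an assumption, and it charitably charges the join just one cross-country round trip per record, at the speed of light in a vacuum, with zero time for routers, serialization, or the databases doing any actual work.

```python
# Back-of-the-envelope check of the latency claim above.
# All figures are illustrative assumptions, not measurements.

SPEED_OF_LIGHT_KM_S = 299_792   # physical ceiling; light in fiber is ~30% slower
DISTANCE_KM = 4_300             # rough California-to-Massachusetts distance
ROWS = 10_000_000               # customer accounts updated each night

# Absolute floor for one request/response across the country.
round_trip_s = 2 * DISTANCE_KM / SPEED_OF_LIGHT_KM_S

# Charge the join a single cross-country round trip per row -- a generous
# assumption, since chatty database protocols usually need several.
total_hours = ROWS * round_trip_s / 3600

print(f"round trip: {round_trip_s * 1000:.1f} ms")   # ~28.7 ms
print(f"batch delay: {total_hours:.0f} hours")       # ~80 hours
```

Even under those generous assumptions, wire time alone lands around 80 hours — comfortably more than the 45 hours cited above, before the network or the databases contribute a single additional millisecond.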

Well-integrated computing environments come from serious engineering. Phrases like utility computing and grid might obscure this fact behind a fog of vagueness. They don’t eliminate it.

I have my own vision for the future of IT. In it, only people who have written code, designed databases, administered servers or engineered networks at some time in their careers will get to write about IT’s past, present and future.

The rest can include themselves out.
