A cloud spiral of death. No, not Sandy. The other type of cloud.

Xcel Energy has asked the Minnesota Public Utilities Commission to approve a 10% rate increase. This matters to everyone interested in cloud computing (I think). Here’s why: A major reason for the request is falling demand for electricity.

The connection isn’t clear?

Early in the days of cloud computing, Nicholas Carr’s ridiculous-but-nonetheless-highly-influential The Big Switch: Rewiring the World, from Edison to Google (W. W. Norton, 2008) proposed strong parallels between the evolution of electrical power generation and the coming evolution of information technology provisioning.

While it was mostly ridiculous (see “Carr-ied away,” Keep the Joint Running, 2/4/2008), power generation and computing-over-the-internet do have one common characteristic: when lots of customers are able to share the use of large, centrally owned, commodity resources, economies of scale drive down costs.

It’s a great theory. It rests, however, on a number of assumptions, some of which have already been subjected to real-world testing by electrical utilities. For example:

Assumption #1 — providers can get infrastructure for less: Electrical utilities can build and operate power plants more cheaply than consumers or businesses. It’s true, except when it isn’t: some manufacturers, for example, own their own hydroelectric plants because that’s more economical than buying power, and some consumers are installing rooftop solar panels that provide a significant fraction of their electricity needs.

It’s the same in the cloud, only more so, because the raw cost of computing infrastructure is so low, and margins are so thin, that most companies can buy the same stuff cloud vendors rely on at pretty much the same price. A similar equation applies to managing it all.

Assumption #2 — uncorrelated demand: Start with scalability and flexibility. Cloud providers invest in fixed costs so as to decrease incremental costs. That’s called scalability — hardly a new concept in IT. When scalability alone is what IT needs, the economics of Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) don’t work (see Assumption #1).

But in many circumstances, what businesses need isn’t scalability, it’s flexibility — the ability to add and shed capacity as the processing load varies. The reason highly scalable cloud providers can sell flexibility to their customers is that they rely on different customers needing the same resources at different times, averaging each other out. While each customer’s demand varies individually, aggregate demand is predictable.

This only works, though, when customer demand is uncorrelated — when customers’ individual computing loads vary independently of one another rather than rising and falling together.
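To make that concrete, here’s a minimal sketch (invented uniform loads, not real workload data) comparing the capacity a provider must build when a hundred customers’ loads vary independently versus when they all surge together:

```python
import random

random.seed(42)
CUSTOMERS, HOURS = 100, 1_000

def peak_aggregate(correlated: bool) -> float:
    """Worst-hour total load for CUSTOMERS whose hourly load is 1.0 +/- 0.5."""
    worst = 0.0
    for _ in range(HOURS):
        if correlated:
            surge = random.uniform(-0.5, 0.5)   # one shock shared by everyone
            total = CUSTOMERS * (1.0 + surge)
        else:
            total = sum(1.0 + random.uniform(-0.5, 0.5) for _ in range(CUSTOMERS))
        worst = max(worst, total)
    return worst

# Average demand is ~100 either way; the capacity bill is not.
print(f"peak, uncorrelated: {peak_aggregate(False):.1f}")  # roughly 110: noise averages out
print(f"peak, correlated:   {peak_aggregate(True):.1f}")   # roughly 150: everyone peaks at once
```

Same average load, very different capacity bill.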

But for a lot of companies, variation in demand is very predictable, the result of having seasonal businesses. The holiday season, for example, affects lots of companies exactly the same way. Their computing demand is correlated, very much parallel to what power companies face in the summer, when everyone runs their air conditioners at the same time.

Except that power companies can handle peaks by buying electricity from each other and from independent generation companies. Cloud providers can’t. They need enough infrastructure to handle correlated peak loads, reducing their economies of scale. How much? The industry is too immature for us to know the answer yet, which brings us to …

Assumption #3 — Growth: Cloud computing doesn’t just shift the cost of infrastructure to providers. It shifts risk as well, namely the risk of excess capacity.

Call it the dark side of scalability: when the incremental cost of processing an increase in volume is small, the incremental savings from processing a decrease in volume are just as small.
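In arithmetic terms (a minimal sketch with invented numbers, not anyone’s actual cost structure): unit cost is fixed cost divided by volume, plus incremental cost, so halving the volume roughly doubles the fixed cost carried by every remaining unit.

```python
# Invented numbers: the shape of the curve is the point, not the figures.
FIXED = 1_000_000.0     # data centers and staff, paid no matter what
INCREMENTAL = 0.02      # cost of processing one more unit of work

def unit_cost(volume: int) -> float:
    """Average cost per unit at a given processing volume."""
    return FIXED / volume + INCREMENTAL

for volume in (10_000_000, 5_000_000, 2_500_000):
    print(f"volume {volume:>10,}: unit cost ${unit_cost(volume):.3f}")

# volume 10,000,000: unit cost $0.120
# volume  5,000,000: unit cost $0.220
# volume  2,500,000: unit cost $0.420
```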

Welcome to Xcel Energy’s world.

Imagine a cloud provider whose demand starts to fall. Their fixed costs don’t change, just as Xcel still has to maintain its power plants, even when their capacity isn’t needed.

Unlike Xcel, cloud providers don’t need a PUC’s permission to raise their rates. They need the marketplace’s permission.

It’s a no-win choice. They either lose money by keeping their rates competitive, or enter a death spiral by raising their rates enough to be profitable, leading to customer defections, leading to more excess capacity, leading to a need to raise rates even more.
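Here’s what that feedback loop looks like as a minimal sketch, with an invented price-sensitivity figure standing in for whatever the real market would do:

```python
FIXED = 1_000_000.0        # fixed costs per period, unavoidable
COMPETITIVE_PRICE = 150.0  # what rivals charge per customer
DEFECTION_RATE = 0.002     # invented: fraction of customers lost per dollar over market

customers = 8_000          # demand has already fallen below the break-even base

for period in range(1, 6):
    price = FIXED / customers + 50.0   # reprice to cover fixed costs plus margin
    overage = max(0.0, price - COMPETITIVE_PRICE)
    customers = int(customers * (1 - DEFECTION_RATE * overage))
    print(f"period {period}: price ${price:6.2f}, customers {customers:,}")

# Each rate hike shrinks the customer base, which forces the next rate hike.
```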

Even the biggest providers are vulnerable. Maybe more so, because commodity businesses have razor-thin margins to begin with, and they’ll have the biggest infrastructure investments.

So to the extent you migrate critical applications to IaaS or PaaS providers, make sure they’re fully portable. And add the steps needed to move them to a different provider to your business continuity plan.

Just in case.
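What does “fully portable” look like in practice? One discipline that helps, sketched below with a hypothetical interface rather than any real provider’s API: application code talks only to a thin abstraction you own, and each provider’s specifics live in one swappable adapter.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """The only storage interface application code is allowed to see."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class LocalStore(ObjectStore):
    """Stand-in adapter; a real one would wrap a provider's SDK."""
    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data
    def get(self, key: str) -> bytes:
        return self._blobs[key]

def save_invoice(store: ObjectStore, invoice_id: str, body: bytes) -> None:
    # Application logic never names a provider, so moving providers means
    # writing one new adapter, not rewriting the application.
    store.put(f"invoices/{invoice_id}", body)

save_invoice(LocalStore(), "2012-0042", b"...")
```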

Comments (2)

  • Deja vu, Bob? Didn’t we see this about 20-25 years ago before all the big data centers started to shut down? As they shed applications and users, the fixed costs were allocated out to fewer and fewer corporate departments, raising their costs to sky-high levels.

  • Here in Silicon Valley the same thing keeps happening with the Santa Clara Water District (aka the Golden Spigot). They keep pushing us to conserve water; the customers do a good job of reducing water use; the Golden Spigot raises rates to make up for the loss of revenue; repeat steps 1 thru 3.
