
What if SOA is a mistake?


What if you decide Service Oriented Architecture (SOA) is a mistake?

Perhaps you’re concerned it’s a misguided attempt to build applications around generalities instead of specifics. Maybe you think object-oriented (OO) analysis had it right — that building a model of the things that make up a business is better than building a model of its processes.

Possibly, you’re among those who prefer informal conversations between business managers and programmers, and think user stories and use cases are blinders that limit developers’ ability to design creative solutions.

Whatever your objection, it doesn’t matter. In theory, if you have a need, The Marketplace will satisfy it. In practice, the languages marketplace doesn’t work that way. SOA is in your future whether you’re shouting “hurray!” and leading the parade or grunting “harrumph” and wishing you’d taken up welding instead. To understand why, list what you want from a language.

The big four from OO-land — abstraction, encapsulation, inheritance, and polymorphism — would almost certainly be on your list. SOA’s run-time binding? Probably. Recursion? Maybe, if you’re a sophisticate. Performance? Sure.

If you’re a CIO or CTO running IT in a business that plans to be around for a while, none of these occupy the top spot. What matters most is at least ten years of marketplace credibility. That’s what drives vendor support and a reliable supply of top-notch developers.

Like it or not, technology comes after this entirely logistical consideration. It’s why old-fashioned mainframe COBOL lasted as long as it did in spite of its technical deficiencies, and why companies are finally phasing it out.

Want ten years with confidence? It’s SOA via .Net or Java. That’s it.

SOA is a clear case of vendor push, not market pull. This isn’t necessarily a criticism: Vendors, not customers, should be the source of product innovation. Where customers can help vendors is product refinement.

My concern is that industry’s visionaries bet the farm on an unproven theory.

The chain of logic began with Hammer and Champy’s Reengineering the Corporation — the seminal work that first caused business leaders to think of their organizations as collections of processes rather than as collections of people, functional units, competencies, or what-have-you. (And kudos if you recognized the false dichotomy: Organizations are collections of processes, and of people, and of functional units, competencies, and all the rest. Not or.)

While this was going on outside IT, object-oriented analysis and design was taking over inside IT. The connecting point: IT’s job, as then understood, was to automate one or more roles in various business processes. This made sense: Roles are things. Get their behavior and interfaces right and you should be able to string them together as necessary to automate some or all business processes.

If you really wanted to be cool you’d add business process management (BPM) software to the mix to handle the stringing.

SOA creates a deeper connection between the process view of business and how IT translates business requirements to software. Service is the software mirror of process. To do SOA properly means having a solid understanding of the processes, sub-processes and sub-sub-processes that make up your business and how they fit together.

Once you have that understanding you’ll be in a position to create software that supports the process hierarchy. And, as processes evolve, you’ll be in an excellent position to track that evolution with changes to service definitions.
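To make the idea concrete, here is a minimal sketch of a process-to-service mapping. The names (a claims-handling process decomposed into lookup and validation sub-processes) are hypothetical, chosen only for illustration; a real SOA deployment would expose these interfaces through WSDL/SOAP or the like rather than plain Java.

```java
// Each sub-process becomes a service interface.
interface PolicyLookupService {          // sub-process: find the policy
    String findPolicy(String customerId);
}

interface ClaimValidationService {       // sub-process: validate the claim
    boolean validate(String policyId, double amount);
}

// The parent process is itself a service, composed from the sub-services.
// If the business process changes, the composition changes -- not the parts.
class ClaimsHandlingService {
    private final PolicyLookupService lookup;
    private final ClaimValidationService validation;

    ClaimsHandlingService(PolicyLookupService lookup,
                          ClaimValidationService validation) {
        this.lookup = lookup;
        this.validation = validation;
    }

    boolean handleClaim(String customerId, double amount) {
        String policyId = lookup.findPolicy(customerId);
        return validation.validate(policyId, amount);
    }
}
```

The point of the sketch: the service hierarchy mirrors the process hierarchy, so tracking a process change means rewiring the composition at one level rather than rewriting the leaves.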

It’s a great theory. It’s a really great theory.

It’s also an untested theory. Very few businesses understand themselves well enough to easily create a provable services model of the enterprise; those that don’t, yet dive into SOA anyway, must first create one, at considerable time and expense. And that doesn’t take into account the need to decide whether to normalize responsibilities that can’t, or simply haven’t yet, jelled into formally defined processes.

Then there’s the potential loss of flexibility. SOA can almost certainly make companies faster. Speed, though, is quite different from agility. Speed means doing what you do more quickly; agility means changing what you do quickly.

Services are supposed to be loosely coupled, to facilitate agility. That’s the theory, at least. Maybe it will turn out that way. I hope so.

Lost in the shuffle is something basic: Programmer productivity. Friends who are hands-on with such matters tell me the available SOA development environments are less than half as productive as products like PowerBuilder and Delphi were, back when they were viable.

It’s an unnecessary lapse, and tells me those who are driving SOA need to spend more time with those who are using it.

Comments (13)

  • My experience with SOA is your application links in one way or another with other applications or data feeds to accomplish some business thing. The linchpin is configuration management. How many of these SOA proponents also expound to executives that every SOA application needs a seat on the configuration control board of every application and data feed it touches–and vice versa? If you don’t do this, you have no idea what your “partner” is changing–and he or she has no idea what you are changing–until something goes wrong.

  • Love your take, as usual. Merely have to complain (lightly) at the slam to both Delphi and Powerbuilder. A couple of years ago…sure. Both are now back and growing and current and relevant. Really.

  • Bob, great column as usual. I’ve been banging on about your last point for that past couple of years.

    In my view, what makes a programming language really productive is Notepad. Or vi, or emacs. I mean … the programming language has to be able to be programmed with a simple editor. Yes, an advanced IDE will make things more productive, but the basics must also apply. Now a lot of SOA environments are not programmable without the IDEs. Even worse, the IDEs are often custom jobs that require a developer to be re-trained … losing years and years of productive speed with muscle-memory style automatic ability to navigate the programmer’s usual editing tool. Seriously. This stuff is whack. A programming language or an environment needs to be IDE-neutral. If you got a plumber around, would you insist that he only use the tools you supply from your home handyman kit? Or would you expect the plumber to have mastered a set of his own tools already? And, making matters worse, it’s rarely programmers who choose these tools that are foisted on them. The server/deployment environment and the language used to implement need to be decoupled from the tools used to build it.

    But an even *worse* failing of these SOA tool suites is that they are all strongly and irrevocably coupled to the deployment/runtime environment. Generally they totally lack the ability to keep up with modern programming ideas. Like, for instance, automated testing. Or even unit tests, let alone advanced and productive techniques such as Test-first approaches or Test Driven Design. The tools often lack refactoring support. All of these things are, in my opinion and in the opinion of many leading developers, absolutely essential to quality engineering practice and agile development outcomes like “delivery of working software”. The working part needs to be repeatable. That’s why automated testing is essential in modern development. But the development tooling is a complete barrier to achieving this.

    But you won’t hear any of this from the big vendors. One big vendor recently announced that the new version of their middleware product suite had a ‘focus on testability’, but ask any of their presales guys to demonstrate this in an actual development environment. Ask them about continuous integration support, for example. Witness their blank looks. Their development product is completely orientated to “one button push from the IDE to production” modes of thinking; the idea of continuous integration builds is almost totally antithetical to the very concepts of operation the product is organised around. They think that finally adding support for the Subversion version control system, at least five years too late, is a wondrous achievement.

    They are aiming for ‘programmerless programming’; of course, in the process just creating a new type of programmer. Every new generation of programmers simply has to learn the same hard-fought lessons of software engineering over and over again, because each generation of tooling apparently scraps the paradigm over and over again in a vain attempt to create push-button, wizard-driven programming models. They nearly all suffer from ‘hello world’ programming – the simple examples they use to sell to IT management are trivial to conquer using the wizards, but more complex problems (i.e. real-world ones) are flat-out impossible. Thus these tools are always mirages which look great at a huge distance but are flat lifeless salt pans of bleached skulls and bones on closer examination.

    As you might be able to tell, I am utterly contemptuous of these SOA tool paradigms. I have nothing against SOA per se. But there is nothing more productive than a programmer who understands the importance of simple and repeatable build and deployment automation using command line tools and who knows his program *editor* inside out after ten years of use. Give that programmer a better language by all means, add incremental features to that IDE, allow the programmers to continuously improve their techniques, promote professional craftsmanship, yes, yes and yes. But no amount of drag and drop wizards, push-button deployments, and “object inspector” property editors will ever usurp that.

  • This got me thinking about fractals – the deeper you dig the more details you reveal and the further down you can go. Add in chaos theory – small changes in initial conditions can have profound impact on the final outputs.

    So, it seems to me that the challenge will be to know how deep to go to get the details that make up the ‘initial conditions’ and which are just noise.

    I wonder if SOA will prove to be as broadly applicable as its proponents claim it to be.

  • Bob, you hit it right on the head with your statement:
    “SOA creates a deeper connection between the process view of business and how IT translates business requirements to software. Service is the software mirror of process. To do SOA properly means having a solid understanding of the processes, sub-processes and sub-sub-processes that make up your business and how they fit together.” Translating the sub-sub-processes into software is where the magic needs to happen, or a miracle. The Holy Grail is how to optimize reuse. I have yet to see anything close in the insurance industry. BTW, we still have to have a logical and physical information model that binds the services and the information together. I guess SOA allowed the business to participate more than OO did, so it has been more accepted; OO was more techie.

  • I guess the underlying issue is whether all IT “trends” are simply cycles. A new concept is created, it gains momentum, the consulting firms jump all over it, more vendors get involved, et al. Then the cycle peaks and begins the downward shift as firms don’t achieve the touted benefits and a new IT concept starts getting attention.

    Having worked through the “breakthroughs” of PL/1, 4th GL’s, prototyping, architectural “standards”, PowerBuilder, OO, etc. along with the shifts across hardware/platforms, and web coding languages, etc., it’s debatable whether real increase in value has been achieved for businesses or whether we in IT have simply been seeking the Holy Grail.

    Time will, of course, answer that question, but if past history is an indicator, we are doomed to keep creating new cycles.

    Jus sayin’….

  • Like every other programming paradigm, success or failure of a given project will ultimately depend on implementation. If a company can successfully identify and model their business processes, the programmers can create a solid, logical representation of those processes in SOA, and the result can perform well enough to conduct business with, you should succeed. If you miss on any of those three, as soon as it becomes politically expedient some manager at your company will proclaim SOA technology inadequate, and either swear that the “next big thing” is the ‘real’ answer, or call for the return of COBOL and the mainframe (or whatever paradigm was used during their personal ‘good old days’).

  • Just a friendly nitpick, since this is frequented by IT professionals…shouldn’t the False Dichotomy mentioned above be: “and all the rest. Not XOR”, rather than ‘or’?

    Good points, all. I’ve passed this on to a number of friends who are still coding, rather than going hardware as I have.

  • Bob, SOA is just the latest programming product to arrive without including information security in the original design. Vendors that push technology that is half-baked are looking for some quick revenue generation until the next new thing arrives. Business leaders that fall for those sales tactics fail to observe their fiduciary responsibility to make decisions from the viewpoint of an ongoing entity. Although I have no data, I suggest that these same business leaders, like the vendors, are looking for some quick revenue generation until their next new opportunity arrives. Thus, they can safely ignore their fiduciary responsibility, since it will become someone else’s problem to clean up their mess.

  • Very good points Bob.

    I agree that SOA was a pure vendor push. I have never seen such a sustained campaign by product vendors. It is like when, during the ’70s, the tennis racket companies changed the size of rackets, realizing that everyone would then have to buy a new racket. By changing standards, they create demand.

    But the truth is that the SOA technologies are no better – indeed I think are far worse – than their predecessors that could be used to achieve the same things.

    For example, one can create a SOA environment within an enterprise using any messaging technology – any of the messaging or RPC technologies that have been around for decades. Except that these earlier technologies were massively more efficient and more reliable.

    I agree with other comments here as well. Most of all, I think that things have become artificially complex. The vendors have pushed so many layers on us. Most of these layers add little value, if any. And the most recent standards are often the most complex. For example, WSDL is an order of magnitude more complex than what it needs to be: all programmers want to do is pass a message for crying out loud. What used to take a snippet of IDL now takes 10 pages of WSDL – to do the same thing, and do it ten times slower.
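    [Editor’s illustration of the commenter’s point that a service call is just a message exchange, and any plain channel can carry it. This is a hypothetical sketch: an in-memory queue stands in for MQ, sockets, CORBA, or any of the older messaging technologies mentioned above.]

    ```java
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // A 'service' reachable through a plain message channel: no WSDL,
    // no middleware stack, just a request queue and a reply queue.
    class MessageBus {
        private final BlockingQueue<String> requests = new LinkedBlockingQueue<>();
        private final BlockingQueue<String> replies  = new LinkedBlockingQueue<>();

        // Service side: read one request, compute, send one reply.
        void serveOnce() throws InterruptedException {
            String req = requests.take();
            replies.put("echo:" + req);   // trivial stand-in for business logic
        }

        // Client side: one synchronous call = one request plus one reply.
        String call(String payload) throws InterruptedException {
            requests.put(payload);
            return replies.take();
        }
    }
    ```

    Run `serveOnce()` on a server thread and `call()` behaves like an RPC: the whole contract is the message format, which is the commenter’s snippet-of-IDL point.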
