Did we give up too easily?
We used to automate processes. Back in the early days, before we called ourselves Information Technology, Information Systems, or Management Information Systems, that’s what we did.
We didn’t optimize processes, orchestrate them, or develop metrics to assess their performance. We obliterated them, replacing them with computer programs that called on humans only when an input or decision was required that the computer couldn’t handle on its own.
To be fair, we didn’t always do this all that well. Some of these automations sped up business processes that contributed little business value beyond keeping some people busy who otherwise might have caused all sorts of mischief.
There were also quite a few cases where the program paved a metaphorical cow path, faithfully reproducing in automated form every process step that had previously been executed by a bored human being, even if these steps had nothing at all in common with what an engineer might construct if given the goal and a blank sheet of paper.
But even with these flaws, IT’s predecessors delivered orders-of-magnitude improvements in business efficiency. And then Something happened, and suddenly, overnight, IT became the biggest bottleneck to business improvement.
My theory is that Something = Methodology, but perhaps I’m being unfair. I’m not, after all, a business historian, so while I lived through some of the mayhem, my personal mayhem experience isn’t a statistically significant random sample.
Based on my personal experience, direct and second-hand through colleagues, here’s the way process automation happened back in the good old days we neocodgers see in the warm glow of imperfect memory:
Someone from the business would drop by EDP (electronic data processing, IT’s ancient forebear), sit on the corner of a programmer’s desk, and ask, “Can you get the computer to do x?”
After a short discussion the answer was either yes or no, and if it was no, a longer discussion usually led to a useful alternative the computer was capable of.
The programmer would go off and program for a week or so and call the business person back to review the results and suggest course corrections. In not all that long the computer had automated whatever it was the business person wanted automated.
Usually, in the interim, other notions occurred to the business person, who, while reviewing how the solution to the initial request was progressing, would ask, “Can you also get the computer to do y?”
Over a span of a few years these solutions to business problems accumulated, turning into the big legacy systems many businesses still rely on.
If we’d only had the wit to call what we were doing a methodology and label it Agile.
Had we done so we might have avoided quite a lot of the discreditation that happened to IT during the years of waterfall wasteland that, starting in the early 1980s, transformed us from the department that automated stuff, generating enormous tangible business benefits, to the Department of Failed Projects.
For that matter, had we continued in our quest to automate the bejeezus out of things in our naively Agile sort of way, disciplines such as Lean and Six Sigma might never have achieved their current level of prominence.
Not that Lean and Six Sigma are terrible ideas. In the right contexts they can lead to solid business improvement.
What they’ve turned into for some businesses, though, are Strategic Programs, and religions for some practitioners. For these devoted adherents they’re the answer to all questions, before actually asking the questions.
What they’ve also turned into is a sort of IT-less shadow IT — a way to improve business processes without any need to involve IT, and, more important, without having to ask the Department of Failed Projects to deliver very much.
Let’s imagine the executive team at some enlightened (or misguided — your call) company reads the above and, convinced, calls to ask how best to return IT to its process automation roots. What would a modern version look like?
Mostly, it would treat each process as a black box that turns inputs into outputs. IT’s job would be to understand what the inputs and outputs are, and to develop, through a combination of inference and listening to the experts, an algorithm that reliably turns the defined inputs into the desired outputs.
That’s it — the entire methodology in one paragraph, understanding that “algorithm” can hide a lot of complexity in its four syllables.
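To make that concrete, here’s a minimal sketch of the black-box framing in Python. Everything in it (the ProcessSpec structure, the invoice-approval example, the field names) is invented for illustration; the point is just that a process is described by its inputs, its outputs, and the algorithm connecting them.

```python
from dataclasses import dataclass
from typing import Callable

# The black-box framing: a process is defined only by its inputs, its
# outputs, and an algorithm that reliably turns one into the other.

@dataclass
class ProcessSpec:
    name: str
    inputs: list[str]                  # what the process consumes
    outputs: list[str]                 # what the business needs out of it
    algorithm: Callable[[dict], dict]  # inferred from observation plus the experts

def run(spec: ProcessSpec, record: dict) -> dict:
    """Feed the defined inputs to the algorithm; return only the desired outputs."""
    missing = [f for f in spec.inputs if f not in record]
    if missing:
        raise ValueError(f"{spec.name}: missing inputs {missing}")
    result = spec.algorithm(record)
    return {f: result[f] for f in spec.outputs}

# A made-up invoice-approval process, treated as a black box.
approve_invoices = ProcessSpec(
    name="invoice approval",
    inputs=["amount", "po_number"],
    outputs=["approved"],
    algorithm=lambda r: {"approved": r["amount"] <= 10_000 and bool(r["po_number"])},
)

print(run(approve_invoices, {"amount": 4_200, "po_number": "PO-123"}))
# -> {'approved': True}
```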
Might this work? Beats me. It’s an idea I’m just starting to play with. Next week I might strenuously disagree with this week’s me.
Don’t hesitate to weigh in.
I think you may have lost it between the paragraph “Let’s imagine” and “Mostly.”
I read a web post recently that talked about the “progress” from procedural languages to OOP. For me, that was a jump away from the useful (comprehensible to a person such as myself with a task to do and a computer to use) and a reversion to handing what I was doing back to the priestly caste of computer people behind the curtain. In other words, it was a move away from what real people could understand and do with computers.
You made the same leap from “process automation” to “black box” and “algorithm.” Of course zillions of useful problems have survived this recasting. However, much of what humans need to do does not fit neatly into inputs and outputs. One ancient example: I used to work in the library department where journal issues arrive and are checked in. Mr. Computer Guy set up a system that said: OK, this title is a monthly, we’ll just predict the next issue, all you have to do is click when you get it, and voila, we won’t make human mistakes. I commented: what if the next issue is a supplement, or there is a special issue this year, or the publisher skips volume 13 because he is superstitious…
A process works already. An algorithm can predict most of its world and work at a useful level when welding a car, for example, but without access to the black box, a lot of processes that work in the real world become overwhelmingly complex (due to the contingencies that have to be foreseen and included) or impossible.
A modern version would probably not have to be explained to computer priests who are unfamiliar with the work to be automated. That means at least a prototyping environment that can be employed in the real world for which the automation is intended. I was able to work in that world for a little while, and it was great.
Michael Hugos provided the solution to this a few years back in one of his books. Not that he invented it – it’s a solution that goes back to the days of punch cards. The solution is to include in the black box’s algorithm a branch that kicks out unanticipated exceptions for human handling. It’s a lot like accepting “I don’t know” as a valid answer to a question.
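A rough sketch of that kick-out branch, assuming a journal check-in along the lines of the earlier comment (the predictor and the field names below are hypothetical):

```python
# Sketch of the "kick unanticipated exceptions out to a human" branch.
# The journal check-in details and the predictor are invented for illustration.

human_queue = []  # items the algorithm admits it can't handle

def predicted_next_issue(title: str) -> int:
    # Stand-in for whatever prediction logic a real system would use.
    return 13

def check_in(issue: dict) -> str:
    expected = predicted_next_issue(issue["title"])
    if issue.get("number") == expected and not issue.get("supplement"):
        return "checked in automatically"
    # Supplements, special issues, skipped volumes: don't guess.
    # "I don't know" is a valid answer, so hand the surprise to a person.
    human_queue.append(issue)
    return "routed to a human"

print(check_in({"title": "Journal of Examples", "number": 13}))  # automatic
print(check_in({"title": "Journal of Examples", "number": 14}))  # routed to a human
```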
Hi Bob,
My observation is that turning a non-automated process into an automated process is comparatively easy.
What’s hard is what you do once you have automated something and you need to change how it’s automated.
Any kind of transactional automation means data. Data can have various levels of granularity — too much is expensive to capture, too little may prevent you from doing analytics and business intelligence effectively.
Your black box idea is a good one. It bears some resemblance to Roger Sessions’ snowman model.
But when organisations have spent 10-20 years interconnecting the data models of various systems — treating them as white boxes rather than black boxes — then there’s a huge process of data de-spaghettification which is required before a black box model can even be implemented.
Worse, any project that tries to pull out a single component of a spaghettified system has a huge likelihood of failure.
Shadow IT is what happens when people give up and just implement process automation of their area as if there were no prior attempt at corporate automation — ie doing the easy bit all over again.
I think we need to add “despaghettification” to the official IT lexicon. Thanks!
What a relief! All this time I thought I was getting old. Thank you for informing me that I have been wrong. I am not older, I am a neocodger.
Bob,
Your memory of how we “automated” back in the day is completely in sync with my own. The biggest difference I see between then and now is mostly semantics; the same work is sort of being done (mostly between the endless meetings to discuss the appropriate methodology to use and manage careers). If programmers and business people just talked at one of their desks, or better yet where the work is currently being done so the programmer can see what they are automating (cool concept), things would still be getting automated. Agile? Waterfall? Really? Just do the damn work! I get sick every time I hear someone say the word “Story” now… sad, I used to like reading stories to my kids… another word that’s been hijacked from its original meaning.
thanks for allowing me to vent,
a Grouchy old man.
This seems to miss the ‘what’s already there’ (WAT) problem. If nothing exists, you can take the algorithm and implement it. Some folks are fortunate to have ‘green fields’ to start with. For others, the algorithm may be developed in short order, but implementing it is more like adding new infrastructure under London, Rome, or another old major city – you keep bumping into old stuff, dead bodies, and other surprises that slow things down.
The comment by Judy Meyers shows that one of the primary principles of automation is often ignored – feedback. Judy describes roughly what we would call feedforward control: if you can predict the future from present measurements because you have a perfect model, then you don’t need any feedback. There are no perfect process models! So we automation engineers depend heavily on feedback control, which says to measure your “product/output” and make adjustments to your controlled variables to eliminate those output deviations. We also know that there are often golden opportunities to make some adjustments to the process before any large output deviations occur, by using some feedforward control based on rough process models. The combination works well. It should also work well for business processes, but there are few or no automation engineers involved with IT. That is the shame: a skill is needed, and those in charge don’t know it’s missing. They don’t know, and they don’t know that they don’t know. We call that an error of the worst kind.
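For readers who haven’t run into these terms, here is a toy sketch of the combination being described. The plant model, gains, and disturbance are all made up; the point is that feedforward pre-compensates for what a rough model predicts, and feedback trims whatever deviation actually shows up in the measured output.

```python
# Toy feedback + feedforward loop. All numbers are invented for illustration.

def process(control: float, disturbance: float) -> float:
    # The real plant; the controller only has a rough model of it.
    return 2.0 * control + disturbance

target = 100.0
feedback_term = 0.0
feedback_gain = 0.3          # integral-style feedback gain
rough_plant_gain = 1.8       # our imperfect model of the plant's true gain of 2.0

for step in range(20):
    disturbance = 15.0 if step >= 10 else 0.0
    # Feedforward: cancel the disturbance we can predict, using the rough model.
    feedforward_term = -disturbance / rough_plant_gain
    control = feedback_term + feedforward_term
    output = process(control, disturbance)
    error = target - output
    # Feedback: trim out whatever deviation actually shows up.
    feedback_term += feedback_gain * error
    print(f"step {step:2d}: output {output:6.1f}")
```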
Bob,
To add to the Management speak: “His new role will be as a consultant”: (he kept some files)
“We wish him well in his new endeavors”: (he didn’t keep any files)
Keep up the good work!
I agree with this week’s Bob Lewis. There is room (and need) for planning, strategy, methodology and frameworks. Yet, most of what needs to get done, does not need to do so under the umbrella of the most complex versions of the above.
If we’d stop pretending that we need to rebuild things from scratch every now and then, we’d be more likely to build things with a better balance of modularity and purpose/focus.
Instead, we vacillate between extreme focus and ultimate modularity, leaving us with either too much complexity or too narrow a focus.
-ASB: http://XeeMe.com/AndrewBaker
I think that the “yes” should result in the same kinds of discussions that a “no” gets. Always remember the story of the woman cooking the ham.
It seems that a young woman gets married and her new husband actually likes hanging out with her. He is hanging out in the kitchen while she is making a ham. She slices the ends off and puts them on the sides of the pan. He asks her why she does it that way. She says that it tastes better, and besides, that is how her mother did it.
So when they are visiting her mother he has to know, and asks the mother about this. She also confirms that it tastes better, and besides, that is how her mother did it.
At the big Thanksgiving dinner he really is curious about this and asks his wife’s grandma. “You mean they still do that? When your wife’s mom was little, that was the biggest pan I had.”
Great lesson for IT professionals in this one.
I thought everyone unemployed was now known as a “venture capitalist.” Once again, I’m either behind or ahead of the times. Durn.
>> What they’ve also turned into is a sort of IT-less shadow IT — a way to improve business processes without any need to involve IT, and, more important, without having to ask the Department of Failed Projects to deliver very much.
So, shadow IT implements their local solution, for substantially less money than IT suggested. Shadow IT asks real IT to “integrate” it with real data. After a while, shadow IT calls real IT because they have a minor problem — their “programmer/consultant/expert/…” got another engagement. Would real IT please send someone down to take over “the system.”
Real IT: what language is it in?
Shadow IT: huh?
Real IT: what hardware does it run on?
Shadow IT: Windows 95
Real IT: That’s not hardware; that’s an obsolete operating system.
Shadow IT: ummm, OK, whatever. But, we have a real problem. The system won’t run.
Real IT: what kind of backup are you using?
Shadow IT: what?
Real IT: how’s your system and data backup stored?
Shadow IT: what?
Real IT: who knows how the system actually works?
Shadow IT: Charlie. He’s a friend of the department manager. But he moved to Timbuktu for a new assignment.
Real IT: hello? hello? durn. The stupid phone system failed again. You just can’t trust this high tech stuff…
Shadow IT: Hello? Hello? Hello?
I think a big part of the issue is user-driven. In your EDP example, someone who understands an existing, working process is an inherent part of the team, and trusts the answer when they hear that the computer can’t do that. Feedback loops are tight, and the product is end user focused.
My experience, which I doubt is unique, is that the common expectation these days is that IT “just” needs to convert a vague, second-hand description of a process some executive thinks will make money (but which might not even exist yet in a non-automated form) into usable or salable software, and magically plan for all the use cases that no one had the experience or background to articulate, and by the way, make sure the result complies with Sarbanes-Oxley. We don’t know what Sarbanes-Oxley is, at least not well enough to describe it, but we know it’s important.
The business side of the house, in my experience, is at best impatient with, and at worst oblivious to, the need to convert someone’s pet idea into a list of requirements the product can be tested against. The IT side of the house often gets thrown under the bus as uncooperative just for asking for essential levels of detail.
If business decision makers could embrace the idea that they have to get at least as far as a flowchart before asking IT to get involved, I think everyone would benefit.
If you want a new process or approach, how about starting with defining the problem? And the definition is not: we need a faster way to do this.
Question 1 is: why are you doing this process? What business value do we hope to get out of it? If you get 7 or 8 people in the room and start talking like that, you’ll probably be able to design the system used for the next 20 years (along with the necessary accretions).
Talk about the problem and ideal outputs and insights and don’t automatically think ‘computer’ or ‘app.’