In state fair circles, deep fried butter on a stick is considered a delicacy. In reasons-to-distrust-scientists circles, some of the more recent findings that cast doubt on the fat-to-bad-health linkage are also considered delicious.
Me, I’m skeptical of the skepticism. If you aren’t, go ahead and eat as much deep fried butter on a stick as you like. I’m sure it’s the road to robust good health. And if it turns out not to be the road to robust good health, I’m just as sure the Tooth Fairy and Santa Claus are skilled at anesthesiology and open heart surgery, respectively, so you’ll still be okay.
Something I’m less skeptical about is the so-called “digital revolution” in business — as described last week, the confluence of social media, cloud computing, big data, the next-generation workforce, smart products, the “internet of things,” mobile computing, and probably one or two other Next Big Things.
The KJR short version of the digital revolution: In the world, technology is pervasive. In many businesses it’s still a case-by-case decision.
Yes, there is a certain overhyped trendiness to the “digital revolution.”
But there is a revolution in the works. What it is: If a computer can do something, someone, somewhere, is making a computer do it. They're probably giving away an adware-supported, dumbed-down version for next to nothing, too, to generate a bit of revenue and more than a bit of interest.
We’ve entered, that is, an era in which computerization is assumed, and is no longer considered the least bit remarkable or threatening.
And, we’ve entered another era of experimentation — one that rivals the experimentation triggered by the original personal computer in the late 1970s and early 1980s, by the World Wide Web in the 1990s, and, for that matter, the literary and artistic experimentation that accompanied the 1960s.
I’ll leave it to you as to whether Andy Warhol was more or less important to society than Reddit.
The question behind the hype is whether the digital-revolution technologies — social media, mobile, big data, and the rest — will provide as much "value-add" for business as, say, the database management system did when it first appeared.
You have to be careful about “value-add.” It should include avoidance of “value subtract.” Just because something added its value a long time ago and has faded into the infrastructure, that doesn’t mean it’s any less present and important.
Like the DBMS. When these puppies were new and shiny, IT had to carefully and thoroughly demonstrate their value-add. Being new, expensive, and — from the perspective of those with thumbs-up/thumbs-down authority over capital proposals — IT's latest shiny ball, the DBMS was highly controversial.
Thirty-five years later, the notion of building a useful business application without a DBMS is somewhere between quaint and stupid. The only question is which one to use.
The DBMS is as much an assumed part of the IT infrastructure as Twitter is an assumed part of the political and celebrity infrastructure: Even if you don’t know exactly how it works, you know it’s out there and a lot of people seem to rely on it.
Developers building a new application assume they have a DBMS — it’s just there, just as for political consultants building a new campaign, Twitter is just there.
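How "just there" is a DBMS these days? SQLite ships inside Python's standard library, so a developer gets a working relational database without installing anything, let alone writing a capital proposal for it. A minimal sketch:

```python
import sqlite3

# SQLite comes with Python's standard library: no server, no install,
# no business case required -- the DBMS is simply assumed to be there.
conn = sqlite3.connect(":memory:")  # throwaway in-memory database

conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customers (name) VALUES (?)", ("Acme Corp",))
conn.commit()

rows = conn.execute("SELECT name FROM customers").fetchall()
print(rows)  # [('Acme Corp',)]
conn.close()
```

Thirty-five years ago, each line of this would have justified a meeting. Today it's a shrug.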
In the world, the digital-revolution technologies are just there. Many businesses, though, aren’t baked that way. They need a path out of their business archaism. Here’s a three-step program to get them there:
Step 1 is KJR’s increasingly tiresome stump speech about there being no such thing as an IT project. Or shouldn’t be. It’s always about business change — designed business change — or it’s a project with no point. And while this statement ought to be clear, in my experience it isn’t, so … no, this isn’t the same thing as saying IT projects should always have business benefit. Once more, with feeling: Projects. Are. Always. About. Business. Change. Earth. Person.
Step 2 is for business executives, and for that matter for managers at all levels, to consider knowledge of information technology to be part of their job description.
Not how it works. Not how to implement it. This being the 21st century and all, those who run businesses ought to understand the fundamentals of what they run on.
Step 3? Stop making case-by-case decisions about technology.
No, that isn’t quite right. Executives will still have case-by-case decisions to make about the use of technology.
Only now, the case-by-case decisions will be about when not to use it.