The session was titled End User Computing. I thought it was a grammatical error … I expected it to be about end-user computing, not about how to end … as in prevent … user computing.
And to be fair, the panelists didn’t advocate ending it. One, a credentialed authority on security, pointed out … somewhat grudgingly, but she did point out … that the lockdown era is over. Given the proliferation of end-user devices and the increase in travelers, teleworkers, contract labor and so on, locking down every access point is no longer practical.
In its place is a better approach, one that makes protecting information assets the centerpiece rather than hardening every access point.
The other panelist led a team within IT responsible for developing apps and such for the company’s customers and employees to use on their personal smartphones and tablets. His focus was creating innovative products for end-user computing devices.
And then they were done. Time for Q&A. Thinking I was tossing a soft, high one over the plate, I asked what they were doing to promote innovation by end-users.
Whiff! The second panelist talked instead about how he encouraged his team of developers to innovate. The security expert did note that some end-users were finding innovative ways to use tablets and smartphones, now that they’re allowed to do so. That was about it.
But encourage innovation? I might as well have asked, “Igli og slog, flub glubbly wub?”
It’s time for a trip down memory lane … back to the early days of personal computers. With limited storage and processing power, and networking still in the future, they were pretty much useless for serious, mainframe-style computing, which is why IT (back then it was MIS) considered them a pointless, uninteresting distraction.
And so they leaked in, hidden in office equipment budgets because end-users ate them up. Empowered by:
- Annoyingly limited but cheap and easy-to-learn languages like interpreted BASIC and Turbo Pascal,
- A sort-of-database-management-system (dBase II),
- Thousands of new, inexpensive commercial applications written for these new devices, and
- The sudden ability to ignore MIS entirely, doing whatever they needed to do, when they needed to do it, for themselves …
… they figured out countless ways to incrementally improve how they, their workgroups, and their departments operated. This is what made the PC a disruptive technology: Its first success came from providing the ability to do things nobody had done before, not from doing the same old stuff on a new platform.
That is, until the PC became powerful enough to gain a place in mainstream IT architecture. When that happened, IT in most companies gained control and put a stop to all the innovation, because of all the bad things that could happen if end-users were allowed to do whatever they wanted with their no-longer-personal computers.
Now we have tablets and smartphones. For the most part they’re optimized for consumers, not business use (see “A tablet-driven view of what’s wrong with American business,” KJR, 4/25/2011 and “Tablets won’t be disruptive ’til the future gets here,” KJR, 5/2/2011).
From IT’s perspective they’re more annoyance than opportunity — a proliferation of browsers and form factors we need to support so employees can use them to get at whatever they need to get at.
And even this isn’t good enough. Not because tablets and smartphones have the potential to be truly disruptive … that’s just prognosticating. They will be or they won’t, and we can deal with the disruption to existing marketplaces when it happens, just as we did with the PC.
What matters is what we do with them in the meantime.
It’s true: If you let employees innovate on their own, Terrible Things might happen, especially if your company routinely hires stupid people and subjects them to inept guidance.
And yes, you could provide the tools only to find that nobody does anything useful with them. That could happen, especially if your company typically hires dullards whose primary virtue is showing up on time.
When PCs came on the scene, they drove a burst of innovation. A few of the most enlightened companies actively encouraged it. Most of the rest didn’t even know it was happening until it was too late.
Tablets and smartphones are the heirs to the original personal computer. Give employees half a chance and they’ll find new and interesting things to do with them that can help your business. The problem is that nobody can predict what your employees will come up with.
That’s what this approach requires: investing in and relying on the smarts, good judgment, and creativity of individual employees. What a radical notion.
Bob,
You are advocating a bottom-up approach, which is smart and laudable. Unfortunately, it clashes with the top-down approach of 99% of corporate cultures. Your approach may take off in small and medium-sized businesses first, since those firms need smart employees who can do more with less. Unfortunately, in government and multinational corporations the top-down approach is favored, and only proven innovations from trusted vendors (if such a thing exists) will be adopted. Power and vision are reserved for the top people, not the peons whose wages barely allow them a subsistence living. It doesn’t matter that those peons may be smarter than their bosses, like the engineers who developed the IBM PC, whose bosses still thought the bread and butter of IBM would forever be mainframes. IBM gave away the chip business to Intel and the PC OS business to Microsoft. How brilliant was that?
John
Innovation? WTH? Who is allowing such free thought nowadays? With everyone so engrossed in just keeping the boat afloat, who in the heck is embracing “innovation”? What we are experiencing is “how can I support innovation when I can’t support day-to-day operations?” Forget “think outside the box” and all that crap … but seriously, who’s thinking of better ways to deliver IT services than the old traditional ways? Case in point … wireless in the government? WHAT? Stray away from copper? No way!!! … and VoIP? Who the hell said THAT saved money? We can’t afford to implement it, much less “save money” in the future!!! Agggh. The stench of myopic … well, can’t call it myopic “thought” now can we? Since this is a column on “leadership” … let’s hope there are those who cling to the concept of true innovation, creativity, and the “art of the possible.” God bless the visionaries!
I’m feeling conflicted. Reading through that list, I’m recognising the tools I used to use in the early days of my professional and non-professional IT career to “get stuff done”. VBA in Excel and Word, interaction with COM objects, DOS script, etc. Handy for a proof of concept, building blocks to bigger things.
Then I look at what some of our clients are using “to get stuff done”: VBA (still), and Access. And then I cringe, because they build that stuff up, integrate it with their department’s business processes, then try to push it company-wide. But it just won’t play in a multi-user environment, nor will it scale. Or even worse, the person who did the original innovation leaves, and the next time it breaks there’s no one to maintain or fix it … except their IT support. And they just look at it, and quickly look away. The horror of linked Excel and Word macros, dependencies on third-party websites being scraped, and not a single shred of documentation to describe what the bloody thing does.
Somewhere, I crossed a line, and I don’t remember when it happened.
> the lockdown era is over <
We should be so lucky! I’m in a “progressive” local government agency. Our IT management’s prime directive is “lock down everything tightly, then figure out ways to make it even tighter.” Security trumps everything else, and I actually heard a baffled IT staffer say in a meeting, “Why would we ask users what they want? We already know what they need.”
It seems to me much of this is driven by what I call “management by spreadsheet.” Too much of management is simply focusing on metrics and trying to make them look good.
Network reliability stats too low? Lock down everyone’s PC, don’t allow anything but IT approved software, don’t allow remote users.
Need to reduce equipment costs? Give users PCs and laptops with minimal memory. Require multiple high-level approvals before users can install even IT-approved software that helps them do their job because, well, those $50 annual license fees really add up compared to a $50K annual employee salary.
Need to reduce employee costs? Pull all the IT support staff from the various divisions (where they actually interact with users and know what the company does) and put them in a centralized help desk. Then immediately lay off half due to “efficiencies”. When that’s not enough, roll out a knowledge base for user issues and force all users to consult it before contacting the help desk. Then lay off half again due to “efficiencies”.
Of course, since IT is a cost center – unless the company actually markets its IT services to others – it doesn’t bring in a dime of revenue. It’s an enabler, a necessary expense. IT can justify its existence either by competing to be the lowest cost provider (competing against all the outsourcing vendors), or by knowing the business, helping every single employee be more effective, focusing on helping the business succeed. I’ve seen both and I know which I prefer.