Hidden assumptions have power — more power than evidence, than logic, than the formal axioms and postulates required for mathematical and geometric proofs.
We use them to reach conclusions just as surely as Euclid’s proof of the Pythagorean theorem relies on the parallel postulate, and because they’re unconscious we don’t know we’ve based our decisions on them.
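(A quick illustration, for anyone who wants to see how much work that postulate does. Move to the surface of a sphere of radius R, where the parallel postulate fails, and the Pythagorean theorem fails with it. Take the triangle formed by two meridians running from the north pole down to the equator a quarter-turn apart, plus the equatorial arc between them: all three of its angles are right angles, and all three sides are quarter great circles, so

\[
a = b = c = \frac{\pi R}{2}, \qquad a^2 + b^2 = 2\left(\frac{\pi R}{2}\right)^2 \neq \left(\frac{\pi R}{2}\right)^2 = c^2 .
\]

Change the axioms and the conclusions change with them. The same is true of assumptions you don’t know you’re making.)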
We’ve been talking about the so-called digitalization of business and one of its most important but rarely mentioned driving forces, the ETG — the embedded technology generation. The ETG stands one of IT’s hidden assumptions on its head, namely that the average employee paints Wite-Out® on the computer monitor to correct word processing mistakes.
So listen up: The ETG has never heard of Wite-Out®. And while very few generational generalizations hold up to even the shallowest scrutiny, this one does: Your average member of the ETG doesn’t think twice about learning new technology. That’s assumed, and often fun.
And they certainly don’t worry that different systems have different user interfaces. That’s assumed too — otherwise, having learned how to use Reddit they’d get all flustered and nervous over the prospect of also figuring out World of Warcraft.
But they don’t. There’s a reason the iPad doesn’t come with an instruction manual, and it isn’t that the iPad’s user interface is so intuitively obvious that anyone who picks one up automatically knows what to do with it.
Nope. It’s that the iPad’s early adopters figured they could figure it out by just poking around, and anything they couldn’t figure out that way they could Google, just as they can Google cheat codes for a game if they get stuck.
Apple understood this mentality and took advantage of it. That’s Apple’s hidden assumption.
Your average IT shop, on the other hand, is built on the assumption … the hidden assumption … of widespread employee technophobia.
There’s one place this assumption does partially stand up, though, and that’s in many companies’ executive suites.
Understand, this isn’t because the executives in question are stupid or ignorant. It’s because (and I’m going to hate myself for explaining it this way) they only understand the technologies underpinning the digital revolution (social media, big data analytics, mobile, and all the rest) in their heads. That is, they can and often do understand the evidence and logic supporting their importance.
But they don’t get it in their gut. (See? I told you I’d hate myself.)
Here’s what I mean:
In his groundbreaking Thinking, Fast and Slow, Daniel Kahneman explained where fast, intuitive thinking (as opposed to step-by-step logic) is entirely reliable, and even preferable. The perfect exemplar: recognizing people you know. You don’t need to create a logical narrative to prove the person you’re looking at is your old friend Frank. You recognize Frank’s face and that’s that.
For you, that is. If you want to prove it’s Frank to someone else, that’s when you need to produce his driver’s license.
Relatively few corporate executives are members of the ETG, and as a result they need the driver’s license. They don’t have an internal picture of how it all hangs together and makes easy sense. No matter how receptive they might be to the idea of a digitally transformed organization, they can’t live the reality the same way they recognize a face.
Does this mean they have to start hanging out on Facebook, sharing photos with Snapchat, and arguing with total strangers on Reddit, tossing out f-bombs while they do?
Mebbe, mebbe not. A lot depends on where their customers hang out and how they spend their time.
Either way, IT doesn’t have to be all that involved.
It’s like this. Many television programs now sport interactive websites that are complements to the show, to be experienced concurrently, in real time.
If someone had asked my opinion of the idea, I’d have waved it off as pointless. I want to be immersed in a program I’m watching. Visiting a website? It destroys the experience.
For me, that is, along with my fellow neocodgers. Programming aimed at us shouldn’t have real-time mobile or website accompaniment.
But the ETG multiplexes by long habit.
I imagine they do it at work, too. You aren’t just selling to the ETG; you’re hiring its members, and they multiplex as your employees just as they do everywhere else. Have your workplace policies kept up?
Too often, our industry’s prognosticators talk about digitalization as a portfolio of discrete, possibly complementary game-changing technologies.
But it’s more profound than that: It’s a different mental model of business, one in which pervasive technology is assumed, not decided.
Nice article as usual, Bob.
An anecdote and a comment:
– I was very disappointed that one of my best line managers was always looked down upon by senior executives because “he had time to check Facebook at work”. It didn’t seem to occur to them that this was part of his broader learning network.
– Just because the ETG *can* learn interfaces without help doesn’t mean we like to. The current practice of constantly reworking interfaces for “better usability” is a major bugbear, especially for the ones we use most often. My rule of thumb is that the more something is used, the less its interface should change.
Digital fist bump. Going to miss your muse when you move on. I’m a Gen X guy who sees all of what you’re saying, from the C-suite and the driver’s license to the noobs coming on board. I’m in the middle of that – can’t relate to either beast haha.
I’m not so sure. If the ETG are capable of figuring out how systems work without any training, how come the vast majority of screens I see at my clients show Office with the ribbon bar taking up a large chunk of screen real estate (and lots of bitching about it), when a single click allows it to ‘autohide’?
Rolling out new software (and new software versions) without any training results in this kind of short-sighted idiocy. Training is expensive, but not training is even more so (although in ways that don’t readily show up in KPIs).
Maybe it’s personal preference? I keep my ribbon visible on my 15″ laptop but hide it on my tablet.
There is another issue: as people grow older, they find it more difficult to learn new technologies. That’s not to say they can’t – my father, in his seventies, is still an early adopter of many new techs, having been at IBM all his career.
In my career, I’ve always been the one to bring new software in, figure out how it works, and teach everyone else. But now, in my 40s, this is getting harder and harder; and as Stephen Bounds says above, constantly changing interfaces irritate me to no end – even to the point of ditching things if I’m sick of re-learning them.
So my point is, the ETG are still pretty young, and how you view introducing new tech into your workplace may vary depending on the average age of your workforce. Even the ETG may get tired of learning new tech as they hit their 40s and 50s.
And btw, the multi-tasking thing? Using multiple technologies at the same time? The ETG may embrace it (as mothers have had to for eons 🙂), but that doesn’t necessarily make it a good thing, and even the ETG may come to realize that. Multi-tasking makes for poorer performance at each task. It may be fine for rec activities, like watching sports and commenting to your online community about the sport you’re all watching; but you’d want to make a very good case for it in a working environment before implementing something like that.
Something I’ve noticed is that many people, as they age, increasingly decide they’ve learned what they’re going to learn in this life. I understand your point about new interfaces being irritating. Me too, and I find I have to be in the mood to figure one out, much more so than a couple of decades ago.
On multi-tasking, I deliberately chose “multiplexing” instead. I think the ETG is much better at keeping track of multiple information channels without it becoming distracting or confusing. When the time comes to do actual work, though, it’s a different matter.
I can’t wait for my first opportunity to use the word “neocodger” in conversation!