The leisure society was once a science fiction staple: stories built on a world in which production was so thoroughly automated, and society as a consequence so wealthy, that most people lived comfortably on what amounted to, but wasn’t called, welfare.

Speaking arithmetically but not politically or macroeconomically, here in the U.S. of A., with an average annual per-household income that exceeds $110,000, we could live in that society whenever we chose.

And as a society we might need to start a serious conversation along these lines fairly soon. Why?

We’ve now reached pre-crash employment levels, but that’s the number of jobs.

While corporate profits are way beyond their 2007 levels, the annual compensation for your average job is much lower. The logical conclusion? Companies didn’t need many of the high-paid employees they laid off during the Great Recession in the first place.

You can be sure automation is an important reason, in spite of the suspicion common among business executives that, left to its own devices, IT would spend on “technology for technology’s sake.”

IT never did spend on technology for technology’s sake, of course, although it did operate under the assumption that automating manual work made a company more efficient … an entirely reasonable assumption.

But accuse someone of something often enough and they’ll become timid and defensive, so IT stopped actively looking for automation opportunities and instead actively participated in the establishment of increasingly elaborate governance mechanisms designed to prevent a problem that had never existed in the first place.

Enter Generation Whatever. Call them Millennials. Call them Recent Teenagers. Call them the Embedded Technology Generation (ETG).

Businesses are increasingly virtual. More and more employees have no employer-provided office. Taking my colleagues and me in Dell Global Business Consulting as an example, most of us have met fewer of our colleagues face-to-face than not, and yet we’re able to function in teams more or less on demand.

And none of us belong to the ETG, which means we’re somewhat less likely to use all the tools available to us to collaborate remotely, compared, that is, to employees who consider Facebook, Twitter and texting to be How People Share Ideas.

The ETG changes things. It’s time for aged managers to stop reading nonsensical articles about how, for “them,” it’s all about “me.” Of course it’s about me. This is capitalism — being all about me is a bedrock assumption, one that, in other contexts, business leaders celebrate.

It’s time for those of us in leadership roles who find ourselves geezing from time to time (you’re geezing when you criticize how others live their lives instead of enjoying your own) … where was I?

If you’re geezing too, here’s how to understand the difference between us and the ETG: your last rental car. Did you panic when you got behind the wheel, because the car’s user interface was different from the one you drive at home?

Of course not. You took a minute to orient yourself. You found the wipers, turn signals, headlight switch and so on, and whether you had to insert a key or just push the start button. And off you went.

That’s how the ETG thinks about technology. They don’t just figure it out — they expect to figure it out, and then they use Google to find possibilities they weren’t able to figure out for themselves.

I say “they” because I’m only about halfway there, and I say this with some regret.

Here’s what’s even more regrettable:

In more companies than not, information technology adroitness isn’t even considered important enough to be part of an employee’s performance appraisal. Sending documents to team members as attachments is, for example, considered just as acceptable as sending links to the central SharePoint copy, no matter how much more difficult it makes merging edits and new sections, and no matter how much chaos it creates as everyone tries to figure out which version is current.

The thought of authoring a project deliverable as a wiki? What’s a wiki? Of creating a client presentation with Prezi instead of PowerPoint? Uh uh. Prezi isn’t company-standard software, and besides, what’s Prezi?

Astonishingly, in many companies, it wouldn’t occur to someone who doesn’t know the answer to Google it to find out.

Meanwhile, the ETG would have installed it at home and figured it out, because that’s what you do with interesting technology.

If corporations are people too, maybe they should start to think this way as well.

It’s time for corporate America to join the ETG.

In state fair circles, deep fried butter on a stick is considered a delicacy. In reasons-to-distrust-scientists circles, some of the more recent findings that cast doubt on the fat-to-bad-health linkage are also considered delicious.

Me, I’m skeptical of the skepticism. If you aren’t, go ahead and eat as much deep fried butter on a stick as you like. I’m sure it’s the road to robust good health. And if it turns out not to be the road to robust good health, I’m just as sure the Tooth Fairy and Santa Claus are skilled at anesthesiology and open heart surgery, respectively, so you’ll still be okay.

Something I’m less skeptical about is the so-called “digital revolution” in business — as described last week, the confluence of social media, cloud computing, big data, the next-generation workforce, smart products, the “internet of things,” mobile computing, and probably one or two other Next Big Things.

The KJR short version of the digital revolution: In the world, technology is pervasive. In many businesses it’s still a case-by-case decision.

Yes, there is a certain overhyped trendiness to the “digital revolution.”

But there is a revolution in the works. What it is: if a computer can do something, someone somewhere is making a computer do it. They’re probably giving away an adware-supported, dumbed-down version for next to nothing, too, to generate a bit of revenue and more than a bit of interest.

We’ve entered, that is, an era in which computerization is assumed, and is no longer considered the least bit remarkable or threatening.

And, we’ve entered another era of experimentation — one that rivals the experimentation triggered by the original personal computer in the late 1970s and early 1980s, by the World Wide Web in the 1990s, and, for that matter, the literary and artistic experimentation that accompanied the 1960s.

I’ll leave it to you as to whether Andy Warhol was more or less important to society than Reddit.

The question behind the hype is whether the digital revolution technologies (social media, mobile, big data and the rest) will provide as much “value add” for business as, say, the database management system did when it first appeared.

You have to be careful about “value-add.” It should include avoidance of “value subtract.” Just because something added its value a long time ago and has faded into the infrastructure, that doesn’t mean it’s any less present and important.

Like the DBMS. When these puppies were new and shiny, IT had to carefully and thoroughly demonstrate their value-add. Because they were new, expensive, and, from the perspective of those with thumbs-up/thumbs-down authority over capital proposals, IT’s latest and greatest shiny ball, the need for them was highly controversial.

Thirty-five years later, the notion of building a useful business application without a DBMS is somewhere between quaint and stupid. The only question is which one to use.

The DBMS is as much an assumed part of the IT infrastructure as Twitter is an assumed part of the political and celebrity infrastructure: Even if you don’t know exactly how it works, you know it’s out there and a lot of people seem to rely on it.

Developers building a new application assume they have a DBMS — it’s just there, just as for political consultants building a new campaign, Twitter is just there.
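To make the “it’s just there” point concrete, here’s a minimal sketch, assuming nothing beyond a stock Python install, whose standard library happens to bundle SQLite. The point isn’t this particular engine; it’s that the application doesn’t argue for a DBMS, it just uses one. The database file, table, and column names below are invented for illustration.

```python
import sqlite3

# A DBMS is simply assumed: Python's standard library ships with SQLite,
# so the only real decision is which engine to use.
conn = sqlite3.connect("orders.db")  # hypothetical database file

conn.execute(
    """CREATE TABLE IF NOT EXISTS orders (
           id INTEGER PRIMARY KEY,
           customer TEXT NOT NULL,
           total REAL NOT NULL
       )"""
)

# Insert a row and read it back; table and column names are invented
# for this example.
conn.execute(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    ("Example customer", 199.95),
)
conn.commit()

for customer, total in conn.execute("SELECT customer, total FROM orders"):
    print(customer, total)

conn.close()
```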

In the world, the digital-revolution technologies are just there. Many businesses, though, aren’t baked that way. They need a path out of their business archaism. Here’s a three-step program to get them there:

Step 1 is KJR’s increasingly tiresome stump speech about there being no such thing as an IT project. Or shouldn’t be. It’s always about business change — designed business change — or it’s a project with no point. And while this statement ought to be clear, in my experience it isn’t, so … no, this isn’t the same thing as saying IT projects should always have business benefit. Once more, with feeling: Projects. Are. Always. About. Business. Change. Earth. Person.

Step 2 is for business executives, and for that matter for managers at all levels, to consider knowledge of information technology to be part of their job description.

Not how it works. Not how to implement it. But this being the 21st century and all, those who run businesses ought to understand the fundamentals of the technology their businesses run on.

Step 3? Stop making case-by-case decisions about technology.

No, that isn’t quite right. Executives will still have case-by-case decisions to make about the use of technology.

Only now, the case-by-case decisions will be about when not to use it.