Draw a Venn diagram. Label one of the circles “What I’m good at.” Label the next “What I enjoy doing.” The third reads, “What someone will pay me to do.”

Where the three intersect? That’s your career, if you want one. It’s also the core framework hiring managers have in the backs of their minds when trying to staff their organizations.
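If you prefer code to diagrams, the framework is just a three-way set intersection. A toy sketch in Python (the sets and their contents are invented for illustration):

```python
# Three circles of the career Venn diagram, as sets.
# The members are hypothetical examples, not a recommendation.
good_at = {"writing", "data analysis", "public speaking"}
enjoy = {"writing", "gardening", "public speaking"}
paid_for = {"data analysis", "writing", "project management"}

# The career sweet spot is where all three circles overlap.
career_candidates = good_at & enjoy & paid_for
print(career_candidates)  # {'writing'}
```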

They’re accustomed to hiring employees. They bring in contractors – independent workers, also known as members of the gig economy – for situations that call for individuals with a well-defined “sack o’ skills” for a finite duration.

Contractors are, that is, members of the workforce who have decided they won’t scratch their circle #2 itches through their careers. Their numbers appear to be increasing, very likely as an offset to those who prefer the traditional employment/career approach to earning a living.

Managers generally think of their organization as a social construct. When staffing a role, hiring an employee is their default, and for good reason. They want someone who will do more than just a defined body of work. Beyond that they want people who will pitch in to help the society function smoothly, who will provide knowledge and continuity, who find this dynamic desirable, and whose attitudes and approaches are compatible with the business culture.

Bringing in a contractor is, for most open positions, Plan B.

Which is unfortunate for hiring managers right now. The trend appears to be that if they want enough people to get the organization’s work done, they’re going to have to make more use of contractors … and not only contractors but also employees who have no interest in pursuing a career, just an honest day’s pay in exchange for their honest day’s work – who want jobs, not careers.

A different approach to staffing from the one we’ve all become accustomed to is evolving, one that’s more transactional and less interpersonal. Culture will be less of a force because contractors will spend less time acculturating than employees do; also, the ratio of time spent working independently to time spent in the team situations where culture matters most is steadily increasing.

In some respects it will be more expensive. Contractor turnover will be higher than employee turnover because that’s built into how the relationship is defined. The ratio of onboarding time to productive time will increase.

Managers who don’t want to head down this road do have an alternative: They can compete for those members of the workforce who don’t want to become independent. The law of supply and demand suggests that this approach will cost more. It will also mean thinking through how to make the work environment as desirable as possible.

One more factor, as if one were needed: The security ramifications of a more transient workforce are significant.

Bob’s last word: “Digital” refers to changes in a company’s marketplace that call for changes in a company’s business strategy in response. Digital is all about products and customer relationships.

The current restructuring of traditional staffing practices is the result of digitization, the rise of the remote worker digital technologies have enabled, and COVID-19, which accelerated it all. It’s the next digital marketplace transformation to which businesses must adapt, only this time the marketplace in question is the one that trades in labor.

Adapting to this nascent transformation of the employment marketplace is less familiar territory, but it isn’t different in principle. Strategists have always had to think in terms of where their organizations fit into an overall business ecosystem. Staffing has always been part of this overall ecosystem. It’s just that few business leaders … not to mention those of us who engage in punditry and futurism … anticipated how quickly and dramatically this ecosystem would morph.

Bob’s sales pitch: Ten years ago, when I published Keep the Joint Running: A Manifesto for 21st Century IT, “Digital” was still an adjective, “everybody knew” the rest of the business was IT’s internal customer, and “best practice” was a phrase people tossed around when they had nothing better to say.

Oh, well. You can’t win ‘em all. But even though Digital has been noun-ified, this book’s 13 principles for leading an effective IT organization are as relevant as the day the book was published.

In case you missed the news, Israeli scientists have taught a goldfish how to drive.

Well, not exactly. They placed it in a bowl fitted with various sensors and actuators, and it learned to correlate its initially random movements with the ones that moved it toward food.

The goldfish, that is, figured out how to drive the way DeepMind figured out how to win at Atari games.
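For the technically curious, here’s a minimal sketch of that style of trial-and-error learning: tabular Q-learning in a toy one-dimensional tank. The environment and parameters are my invention, and DeepMind’s Atari work used deep neural networks rather than a lookup table, but the principle (correlate initially random actions with reward) is the same:

```python
import random

# A toy "tank": cells 0..9, food at cell 9. The agent learns which
# moves bring it closer to food -- plain tabular Q-learning.
POSITIONS, FOOD = 10, 9
ACTIONS = (-1, +1)  # swim left or right
q = {(s, a): 0.0 for s in range(POSITIONS) for a in ACTIONS}

alpha, gamma, epsilon = 0.5, 0.9, 0.1  # learning rate, discount, exploration

for episode in range(500):
    state = 0
    while state != FOOD:
        # Mostly exploit what's been learned; sometimes explore at random.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        nxt = min(max(state + action, 0), POSITIONS - 1)
        reward = 1.0 if nxt == FOOD else 0.0
        # Nudge the action's estimated value toward the observed outcome.
        best_next = max(q[(nxt, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = nxt

# The learned policy: at every cell, swim toward the food.
print([max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(POSITIONS)])
```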

This is the technology – machine-learning AI – whose proponents advocate using for business decision-making.

I say we should turn over business decision-making to goldfish, not machine learning AIs. They cost less and ask for nothing except food flakes and an occasional aquarium cleaning. They’ll even reproduce, creating new business decision-makers far more cheaply than any manufactured neural network.

And with what we’re learning about epigenetic heritability, it’s even possible their offspring will be pre-trained when they hatch.

It’s just the future we’ve all dreamed of: If we have jobs at all we’ll find ourselves studying ichthyology to get better at “managing up.” Meanwhile, our various piscine overseers will vie for the best corner koi ponds.

Which brings us to a subject I can’t believe I haven’t written about before: the Human/Machine Relationship Index, or HMRI, which Scott Lee and I introduced in The Cognitive Enterprise (Meghan-Kiffer Press, 2015). It’s a metric useful for planning where and how to incorporate artificial intelligence technologies, including but not limited to machine learning, into the enterprise.

The HMRI ranges from +2 to -2. The more positive the number, the more humans remain in control.

And no, just because somewhere back in the technology’s history a programmer was involved doesn’t mean the HMRI = +2. The HMRI describes the technology in action, not in development. To give you a sense of how it works:

+2: Humans are in charge. Examples: industrial robots, da Vinci surgical robots.

+1: Humans can choose to obey or ignore the technology. Examples: GPS navigation, cruise control.

0: Technology provides information and other capabilities to humans. Examples: Traditional information systems, like ERP and CRM suites.

-1: Humans must obey. Machines tell humans what they must do. Examples: automated call distributors, business process automation.

-2: All humans within the AI’s domain must obey. Machines set their own agenda, decide what’s needed to achieve it, and, if humans are needed, tell them what to do and when to do it. Potential examples: AI-based medical diagnostics and prescribed therapies, AIs added to boards of directors, Skynet.
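If you want to tag a portfolio of systems with HMRI scores, the scale is easy to encode. A minimal Python sketch (the enum names and helper function are my invention; the ratings and examples are as above):

```python
from enum import IntEnum

class HMRI(IntEnum):
    HUMANS_IN_CHARGE = 2    # humans direct the technology
    ADVISORY = 1            # humans may obey or ignore it
    INFORMATIONAL = 0       # technology supplies information
    MACHINE_DIRECTED = -1   # machines tell humans what to do
    MACHINE_GOVERNED = -2   # machines set their own agenda

EXAMPLES = {
    HMRI.HUMANS_IN_CHARGE: ["industrial robots", "da Vinci surgical robots"],
    HMRI.ADVISORY: ["GPS navigation", "cruise control"],
    HMRI.INFORMATIONAL: ["ERP and CRM suites"],
    HMRI.MACHINE_DIRECTED: ["automated call distributors", "business process automation"],
    HMRI.MACHINE_GOVERNED: ["AI-based diagnostics and therapies", "AIs on boards", "Skynet"],
}

def humans_remain_in_control(rating: HMRI) -> bool:
    # The more positive the number, the more humans remain in control.
    return rating > 0

print(humans_remain_in_control(HMRI.ADVISORY))  # True
```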

A lot of what I’ve read over the years regarding AI’s potential in the enterprise talks about freeing up humans to “do what humans do best.”

The theory, if I might use the term “theory” in its “please believe this utterly preposterous propaganda” sense, is that humans are intrinsically better than machines with respect to some sorts of capabilities. Common examples are judgment, innovation, and the ability to deal with exceptions.

But judgment is exactly what machine learning’s proponents are working hard to get machines to do – to find patterns in masses of data that will help business leaders prevent the bad judgment of employees they don’t, if we’re being honest with each other, trust very much.

As for innovation, what fraction of the workforce is encouraged to innovate, is in a position to do so, and can make its innovations real? Almost none, because even if an employee comes up with an innovative idea, there’s no budget to support it, no time in their schedule to work on it, and plenty of political infighting to navigate.

That leaves exceptions. But the most acceptable way of handling exceptions is to massage them into a form the established business processes … now executed by automation … can handle. Oh, well.

Bob’s last word: Back in the 20th century I contrasted mainframe and personal computing systems architectures: Mainframe architectures place technology at the core and human beings at the periphery, feeding and caring for it so it keeps on keeping on. Personal computing, in contrast, puts a human being in the middle and serves as a gateway to a universe of resources.

Machine learning is a replay. We can either put machines at the heart of things, relegating to humans only what machines can’t master, or we can think in terms of computer-enhanced humanity – something we experience every day with GPS and Wikipedia.

Yes, computer-enhanced humanity is messier. But given a choice, I’d like our collective HMRI to be a positive number.

Bob’s sales pitch: CIO.com is running the most recent addition to my IT 101 series. It’s titled “The savvy CIO’s secret weapon: Your IT team.”