As a card-carrying member of the KJR community, you know our guts are optimized for digestion, not for a dominant role in the executive decision-support system.
While watching The Loudest Voice — a biopic chronicling the rise and fall of Fox News’ Roger Ailes — it occurred to me that, when making decisions, deciding whether or not we should rely on our intestines is less consequential than deciding who we trust to provide us with information and insights.
The Loudest Voice, for example, tells a compelling tale. Much of it is or could reasonably have been supported by sources close to Ailes and Fox News. But some of the story depicts private circumstances, especially between Ailes and his wife Elizabeth, for which the scriptwriters could not plausibly have had any reliable sources to draw on.
Scratch The Loudest Voice off my list of places to get insights into conservative media.
But the story as told was compelling (and Russell Crowe’s performance as Ailes was brilliant). Were I a left-wing partisan I’d have been vulnerable to accepting the entire production as history, not just “based on a true story.”
Which brings us to the opinions we form and the decisions we make, not only in our personal lives as citizens and voters, but as managers and professionals as well.
How do you decide which of your potential information sources you can trust? And if you find yourself disagreeing with folks you need to persuade, how do you pry them loose from the information sources they rely on … usually, one or more of the big three analyst firms (Gartner, Forrester, IDC) … and steer them toward more reliable sources such as Keep the Joint Running, or, even better (for you), you and your colleagues who will have to turn CIO decisions into practical action?
Here’s a starting point: Have some. Information sources, that is.
Take time … make time … to read about developments in your areas of specialization and, even more important, in the areas where you don’t specialize.
As you read, pay attention to your own confirmation bias.
Read critically, but not so critically that you ignore ideas and trends you should be knowledgeable about.
On the other hand, we all need to pay special attention to the other side of our confirmation biases: uncritically accepting sources we like, or sources that tell us what we want to hear.
In the political world, that’s how QAnon has gained influence. Political partisans start with the desire for their own opinions to dominate. That easily turns into a need to dislike those they disagree with — for their opponents to be bad people. Once I need my opponents to be bad people it’s just one small step for me to seek out information sources that disparage them.
In the IT world we don’t (yet) have any QAnons to worry about. Nobody reads an IT opinion piece because it vilifies … well, maybe we do.
Imagine you’re on a solution selection project and have developed a preference for one of the candidates. Now imagine the team seems to be leaning to a different candidate, one you’re far more skeptical of.
As we’re dealing in hypotheticals, next imagine you search for industry evaluations that back your position. You run across a Gartner Magic Quadrant that places your preferred solution in the prized “Leader” quadrant while scoring the one you dislike as a hopeless loser (“Niche” in GartnerSpeak).
I don’t know about you, but my inclination would be to immediately share Gartner’s views with the selection team.
But if Gartner’s analysis ran the other way, I’d probably search for a second opinion. Imagine what I found was a hatchet job that cast aspersions, not only on Gartner’s methodology, but on its objectivity and integrity as well. Would I be tempted to share that with the team?
Of course I’d be tempted. Would I actually share it? I hope that if the critique in question was based solely on hypotheticals, with no actual evidence to back it up, I’d resist the temptation.
I hope.
Sharing that sort of thing wouldn’t be QAnon-grade conspiracy-theory mongering. But it would be a step in that general direction, especially because the act of sharing it doesn’t just influence the people around me. It also sets up the vicious cycle of selecting what I read based on my likes and dislikes, reinforcing them.
Which in turn turns my future information sourcing into a search, not for information, but for ammunition.
And that’s the point this week: We need to choose our information sources carefully. Choose none and we’re just ignorant.
But choosing the wrong ones will make us worse than ignorant.
It will make us deluded.
This is a surprisingly negative view from you. I hope you are wrong.
Good column. Information doesn’t inform if it isn’t true.
One of the key points for most any technological decision is: how many other folks will adopt it? Whether you like it or not is irrelevant. JavaScript, for example. I don’t think anyone would choose it on its merits, but it is widely used. (And there are holy wars on UseNet and that new-fangled web dealibobber over these things.)
For myself, I usually start with my own confirmation bias and try to decide if I have any actual facts or if it’s mainly familiarity with a slogan or TLA. Then I go at it like a mathematician: you try to prove your theory by day and try to disprove it by night.
VERY powerful concluding sentences. I copied to my ‘quotes to remember’ document.
I recently had a message exchange with someone of opposing political thought. She pointed to a Congressional action that she said proved her point. When I went to the Congressional .gov page, I found she was referring to the vote on an amendment to the bill, not the actual bill. When voting on passage, the two parties voted exactly opposite the way they voted on the amendment [which was curious, but I couldn’t find out why]. When I pointed to the .gov page, to show that her party had NOT supported the bill she was claiming, she said that the .gov page had been corrupted and was showing fake information. (!!) I tried to point out her misunderstanding. She blocked me. Once deluded, people are impervious to facts.
I agree. It’s sad, but when faced with a wrestling match between what someone believes and the world, the belief will win two falls out of three.
Excellent insight as always.
I use the litmus test of, “Would I stake my reputation on this information?”
It’s like the old accounting adage: if you crunch your numbers hard enough, they’ll say whatever you want them to say.