Think of this as KJR’s pledge week.
No, I’m not asking for donations. I’m asking for your time and attention before you let yourself read this week’s missive.
Specifically, when I decide what to write about each week I’m doing too much guessing based on too little information. I’m asking you to let me know what you’d like me to cover this year in KJR, and, almost as important, what I should write about to make it more compelling to your non-KJR-subscribing colleagues.
One more thing: My ManagementSpeak inventory is running low. After 23 years of one a week it’s entirely possible there just isn’t all that much more bafflegab to translate. So if you don’t hear something that deserves the ManagementSpeak treatment, send me your favorite quotes instead.
I will ask you to apply one filter to these: As with all things KJR I’m looking for what’s off the beaten path, dictums that haven’t yet been widely discovered but deserve to be read by a discerning audience.
Okay, that’s enough Pledge Minute. Back to this week’s KJR.
—————————————-
The private sector has discovered data. Where a decade ago, business leaders were encouraged to trust their guts, they’re now encouraged to trust their data scientists.
It’s role reversal. Back then, governmental policy-making was heavily data-driven. As Michael Lewis (no relation) explains in The Fifth Risk, an extensive, intensive, and essential responsibility of many cabinet-level agencies is collecting highly valuable data and managing the databases that contain them.
Just in time for businesses to invest heavily in data collection, management, analytics, and interpretation, the federal government is shifting much of its policy-making to a more instinctive approach, and in doing so is shifting budget and resources elsewhere.
Here at the Keep the Joint Running Institute, the Joints in question are organizations of all sizes and shapes; our charter is how to keep them running. As a general rule we (and that’s a royal we) stay away from political matters. Politicized matters? In bounds whenever they’re relevant.
And so (you were wondering when a point might emerge, weren’t you?) as your organization, for all the right reasons, embraces data-driven decision-making, here are a few cautionary notes you and your colleagues might find helpful:
Culture before tools: If you’re a longtime subscriber you’re familiar with the idea that when trying to institutionalize data-driven decision-making, a “culture of honest inquiry” is a prerequisite for success. In case you aren’t, the principle (but not its achievement) is simple: Everyone involved wants to discover the right answer to each question, not to prove themselves right.
Solving for the number: A culture of honest inquiry is what enlightened leaders strive for. While still on the journey, though, be on the lookout for someone using these new, powerful analytical tools to manipulate filters, statistical techniques, and thresholds to support their pre-determined preferred result. They’re looking for ammunition, not illumination.
GIGO: “Garbage In, Garbage Out” was widely recognized back when IT was known as the Data Processing department. That Big Data lets organizations collect and manipulate bigger piles of garbage than before changes nothing: Before you release your data lakes into the watershed, make sure your data scientists assess data quality and provide appropriate cautions as to their use.
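To make that concrete, here’s a minimal sketch of the sort of pre-analysis quality check I mean. The dataset, column names, and thresholds are invented for illustration, and the pandas-based approach is an assumption, not anyone’s production tooling:

```python
import pandas as pd

# Hypothetical raw extract; columns and thresholds are made up for illustration.
orders = pd.DataFrame({
    "order_id":   [1001, 1002, 1002, 1004, 1005],
    "amount_usd": [250.0, -40.0, -40.0, None, 1_000_000.0],
    "region":     ["NE", "NE", "NE", None, "??"],
})

def data_quality_report(df: pd.DataFrame) -> dict:
    """A few basic checks to run before anyone analyzes the data."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_by_column": df.isna().sum().to_dict(),
        "negative_amounts": int((df["amount_usd"] < 0).sum()),
        "suspiciously_large_amounts": int((df["amount_usd"] > 100_000).sum()),
    }

print(data_quality_report(orders))
# Whatever gets flagged here belongs in the cautions that accompany the analysis,
# not buried in the data lake.
```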
Ease vs Importance: When it comes to data, some attributes are easier to measure than others. Even professional researchers can fall prey to the fallacy that what’s hard to measure must not matter, without the blind spot ever quite reaching the threshold of consciousness.
Interpolation is safer than extrapolation: Imagine a regression analysis that yields a statistically significant correlation. And imagine that, in your dataset, the lowest value of x is $20 and the highest value is $200. Predicting the outcome of spending $40 is a pretty safe bet.
Predicting the outcome of spending $10 or $300? Not safe at all. Straight lines don’t stay straight forever. They usually bend. You only know where the line doesn’t bend (inside the range of your data); you have no idea where it does, or in what direction.
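To see the hazard in miniature, here’s a hypothetical sketch using the same $20-to-$200 range. The curved “true” relationship is invented purely to show how a fitted straight line behaves inside and outside the data:

```python
import numpy as np

# Hypothetical illustration: spend (x) observed between $20 and $200,
# with a "true" response that bends outside the observed range.
rng = np.random.default_rng(0)
spend = rng.uniform(20, 200, 100)                        # observed x values
true_response = 5 * np.sqrt(spend)                       # the real (curved) relationship
observed = true_response + rng.normal(0, 2, spend.size)  # what the data actually shows

# Fit the straight line the regression would find on this range.
slope, intercept = np.polyfit(spend, observed, 1)
predict = lambda x: slope * x + intercept

for x in (40, 10, 300):
    print(f"spend ${x:>3}: linear prediction {predict(x):6.1f}, "
          f"actual curve {5 * np.sqrt(x):6.1f}")
# At $40 (interpolation) the two agree closely; at $10 and $300
# (extrapolation) the straight line misses, because the curve bends
# where there was no data to reveal it.
```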
Machine guts need skepticism, too: Much of machine learning depends on neural networks. It’s the nature of neural networks that they can’t explain their reasoning; mostly, they’re just very sophisticated correlation finders. They’re useful in that they can plow through a lot more data than their human counterparts. But what they find are still correlations, which means they don’t imply causation.
But of course, to us the unwary, they do.
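A toy example of how that happens: invent a confounder (hot weather) that drives two otherwise unrelated measures, and any correlation finder, neural network or otherwise, will flag the pair.

```python
import numpy as np

# Hypothetical confounder: hot weather drives both ice cream sales and
# drowning incidents; neither causes the other.
rng = np.random.default_rng(1)
temperature = rng.uniform(15, 35, 365)                    # daily highs, Celsius
ice_cream   = 20 * temperature + rng.normal(0, 40, 365)   # sales driven by heat
drownings   = 0.3 * temperature + rng.normal(0, 1.5, 365) # incidents driven by heat

r = np.corrcoef(ice_cream, drownings)[0, 1]
print(f"correlation between ice cream sales and drownings: r = {r:.2f}")
# The correlation is strong, yet banning ice cream would prevent
# exactly zero drownings.
```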
Courage: Take a timid business (more accurately, a business made timid by business leaders who consider avoiding risks to be the pinnacle of business priorities). Now add data and analytics to the mix.
What human data scientists and their AI machine-learning brethren do is spot potentially useful patterns in the data. These patterns will sometimes suggest profitable actions.
When all is said and done, when a pattern like this, along with the potentially profitable actions it suggests, is put in front of a timid business leader, much more will be said than done.
It’s unfortunate but not uncommon: Taking action is inherently unsafe, an insight that’s true as far as it goes.
What it misses: Playing it safe is usually even more of a risk, as competitors constantly search for ways to take your customers away from you.
Play it too safe and not only won’t you take customers away from them.
You’ll also fail to give your own customers a reason to stay.