
Plausible blame


Call it plausible blame.

A frequent correspondent (who wasn’t, by the way, endorsing it) brought an interview with Thomas Sowell in The Federalist to my attention. In it, Sowell says:

… just the other day I came across an article about how employers setting up new factories in the United States have been deliberately locating those factories away from concentrations of black populations because they find it costlier to hire blacks than to hire whites with the same qualifications. The reason is that the way civil rights laws are interpreted, it is so easy to start a discrimination lawsuit which can go on for years and cost millions of dollars regardless of the outcome.

Shall we deconstruct it?

Start with Sowell’s evidence: he “came across an article.” That isn’t evidence. It’s an unsubstantiated assertion once removed. And … uh oh … I came across an article too. Turns out, fewer than half of all EEOC filings are based on race or color; for claims where the plaintiff wins, the average settlement is $160,000. That isn’t a small number, but at best it’s a tenth of Sowell’s claimed “millions of dollars.”

Oh, and presumably some of the plaintiff wins were due to actual harassment or discrimination.

And weak as it is, the “evidence” is stronger than the rest of Sowell’s claim. If you’ve ever been involved even slightly in business decisions like where to locate a factory, you know the process is far too complicated to give discrimination-lawsuit-prevention-by-avoiding-populations-with-too-many-potential-lawsuit-filers a determining role.

Or, for that matter, any role at all.

The underlying message, though, is pretty clear: government programs to correct social ills backfire, so those who propose them are misguided.

Only there’s no evidence that the problem even exists, and its purported root cause doesn’t stand up to even the slightest scrutiny.

That’s why I call it “plausible blame”: The stated problem isn’t real, but plausibly could be. The blame for the problem is plausibly ascribed to a group the blamer wants to disparage, with “plausibly” defined as “sufficient to support confirmation bias.”

Which brings us to Shadow IT, as you knew it would.

I’ve been reading about Shadow IT and its enormous risks. Why, just a few weekends ago, Shadow IT took down Target’s point-of-sale terminals in 1,900 or so stores.

Oh, wait, that wasn’t Shadow IT. At least, it probably wasn’t. We don’t know because all Target has divulged about the outage is that its cause was an “internal technology problem” that didn’t result in a data breach.

That’s unlike Target’s massive 2013 data breach, which was due to Shadow IT.

It wasn’t? Sorry. Bad memory.

In case you’re unfamiliar with the term, “Shadow IT” is Professional IT’s term for unsanctioned do-it-yourself IT projects taken on by business departments without the benefit of the IT organization’s expertise. With all the bad press Shadow IT gets, I figured it must have been the root cause of at least one major outage or data loss event.

But google “data breach” and, while you’ll find a rich vein of newsworthy events, you won’t find one that had anything to do with Shadow IT.

This is plausible blame too. The problem hasn’t been documented as real, and fault for the undocumented problem is assigned based on superficially sound logic that doesn’t stand up to close scrutiny.

Plausible blame is a handy way to make us despise and direct our anger at some group or other. Shadow IT’s undocumented perils, for example, lead IT professionals already predisposed to disrespect end users (see “Wite-Out® on the screen”) to sneer at the clueless business managers who encourage it.

And it is plausible: Information Security professionals know what to look for in assessing the vulnerability of potential IT implementations — a lot more than do-it-yourselfers. Sometimes they know so much that applying that knowledge cripples creativity and initiative.

Make no mistake, Shadow IT does entail real risk. But stamping it out ignores the even greater risks associated with manual methods. Risks? Yes. Few IT organizations have the bandwidth to attend to every automation opportunity in the enterprise. Insisting on nothing but manual methods for everything else means operating far less efficiently and effectively than possible.

Logic says Shadow IT entails some risk. The evidence says professional IT is, in its own ways, just as risky. Plausible blame says Information Security should focus its attention on Shadow IT.

My conclusion: plausible blame is riskier.

Comments (12)

  • Usually, I’m in total agreement with your columns. However, for this one, I mentally started to list my differences of opinion and realized that those differences didn’t matter, since I agreed with your conclusion. I also realized that some HIPAA breaches I could think of that didn’t involve ‘professional IT’ also didn’t involve shadow IT – they involved those oblivious to IT.

  • Well done! Applying logic and additional research to expose false starting assumptions and confirmation bias in contemporary contexts, ending with comparative risk assessment—hits many of the relevant decision factors. Hope some heads of IT take this seriously; home-grown local solutions can be good starting points for “proof of concept” solutions.

  • Up against the hobgoblin of Shadow IT, you raise the specter of “manual methods”. I’m wondering why this purportedly far riskier, and thus more likely, cause that you seem to be suggesting for outages we still have no explanation for isn’t just another example of plausible blame.

    Guessing what you mean by manual methods vs. automation: IT environments, applications, etc. that are established, configured and altered by direct human intervention rather than fabricated automatically by proven, tested, bulletproof solutions that purposely minimize the necessity for human choice and error. If I’m on the mark with that definition, then I’m totally with you on the need for automation.

    But, logically, aren’t one-off Shadow IT solutions that don’t follow the IT playbook also an example of manual methods, under even less control, so therefore including not only the purported risks of Shadow IT but also the elevated risk you suggest for interactive methods?

    I don’t know the causes of many major breaches, but I would guess that many are caused by the what: what patches and firewall rules are applied and what configuration choices are made, rather than the how: whether the installations and configurations are done and monitored automatically or not, or the who: IT Proper or Shadow IT users. It seems more likely on the surface that experienced, trained IT staff whose job it is to know what configurations are best and to implement them and keep them updated will do a better job of that in general by whatever method. Any elevated risk with IT in comparison is probably due to the scale, criticality and exposure of their work rather than to methods.

    I’m not against Shadow IT as a way of encouraging creativity; I just don’t agree with the logical construction of this article. Maybe it’s because your reaction to the article was the actual point and the IT parallel was just a vague swipe at justifying the post in this venue.

    • I confess it never occurred to me that Shadow IT – information technology implemented without the involvement of the IT organization – might be classified as manual methods. To my way of thinking, information technology is what it is regardless of the organizational affiliation of the implementer.

      That is, if Sales Management signs a contract with Salesforce in one company, and IT signs an identical contract with Salesforce in a different company, that can’t mean using Salesforce constitutes manual methods in one and automation in the other.

      My point about manual methods being risky is that they entail the risk of operating with far less efficiency than competitors that automate the same business functions. Less efficiency means, among other things, higher incremental costs, leading to either lower margins or less competitive prices.

      I think it’s fair to call the possibility of an action leading to lower margins or less competitive prices a risk.

  • This column struck a nerve—a nice one somewhere in my pleasure center. I was an early member of the Shadow IT corps, an architect-in-training at a major hotel corporation who was seduced into the conspiracy by a director of marketing. It was a typical day in the mid-’80s when he plopped an Apple II into a vacant cubicle with the invitation “Anyone who wants to try out this computer, let me know.”

    Being cautious, despite my interest in tech (and a roomie who had an Altair 8800 in his bedroom), I was afraid to turn it on without guidance. But I did.

    Two years later, I had used VisiCalc to replace all the project management procedures in the architecture and interiors departments and was on the way to computerizing all the interior design teams with a dBase III program for writing furniture specs.

    We went through TRS-80s and settled on the new IBM PC as the base system, while I also added an Intergraph VAX-based CAD system to the mix. My architecture license came and eventually went. These were the days when corporate mainframe people would visit to see what was happening and with complete sincerity say “Wow, I can’t believe what you can do with a PC!”

    That first dBase program that completely changed how the designers did documentation? It had NO concept of parent-child relationships or surrogate keys or anything I now use. But it worked and was leagues ahead of what they were doing before.

    It is hard to believe that any mistakes made by IT “amateurs” could ever outweigh the value they brought to all the middle-managed departments across the country whose opportunities had been neglected by corporate IT budgets. They were a dynamic confluence of two things: just enough knowledge to be effective and in-depth knowledge of what needed to be done. The drawbacks of Shadow IT are, I think, easy to solve: Enthusiastically support it with practical training.

  • I like your deconstruction and I agree that “Plausible blame is a handy way to make us despise and direct our anger at some group or other.” Despising others and directing anger at them doesn’t help us see problems clearly or solve them quickly. I’m not sure I follow all of your points, though. I wonder if you would classify the JPL breach via Raspberry Pi as a shadow IT problem? It certainly exacerbated the professional IT weaknesses. https://www.engadget.com/2019/06/20/nasa-jpl-cybersecurity-weaknesses/

    • Fair enough – this was a problem caused by completely unconstrained shadow IT, although as the article makes clear, even the most rudimentary set of controls would have caught the offending machine long before any damage was done. I’m not advocating a shadow IT free-for-all. I’m advocating DIY computing supported and encouraged by professional IT, with support and encouragement including an ounce of security prevention along the way.
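
      By “an ounce of security prevention along the way” I mean something even this minimal. The sketch below (illustrative only; the inventory file name and the sample devices are made up) just compares what shows up on the network against an approved asset inventory and flags anything it doesn’t recognize:

      # A minimal sketch, not a production tool: compare devices seen on the
      # network against an approved asset inventory and report anything unknown.
      # The inventory file name and the sample "discovered" devices are hypothetical.
      import csv

      def load_approved_macs(inventory_path):
          """Return the set of approved MAC addresses from a CSV inventory with a 'mac' column."""
          with open(inventory_path, newline="") as f:
              return {row["mac"].strip().lower() for row in csv.DictReader(f)}

      def find_rogue_devices(discovered, approved_macs):
          """Return discovered devices whose MAC address isn't in the approved set."""
          return [d for d in discovered if d["mac"].lower() not in approved_macs]

      if __name__ == "__main__":
          # In real life the discovered list would come from ARP tables, DHCP leases, or a scan.
          discovered = [
              {"ip": "10.0.0.15", "mac": "aa:bb:cc:dd:ee:01"},
              {"ip": "10.0.0.99", "mac": "de:ad:be:ef:00:42"},  # e.g., an unregistered Raspberry Pi
          ]
          rogues = find_rogue_devices(discovered, load_approved_macs("approved_inventory.csv"))
          for device in rogues:
              print(f"Unapproved device: {device['ip']} ({device['mac']})")

      Nothing fancy; the point is that even a crude scheduled check like this makes an unregistered device visible to someone whose job it is to notice.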

      • “Security prevention”? I wouldn’t prevent security too much. But hey, maybe that’s just me ;^)

  • Plausible blame is a tool, and like any tool it can be useful or harmful (and sometimes both at the same time).

    The core of plausible blame is, of course, blame. The most harmful aspect of plausible blame may be that it redirects energy in the organization away from solving problems and towards politics and infighting. At the very least, it misdirects attention away from the failings of the one using it. If the official IT group blames others for their non-professional application of IT, then one is discussing other groups and not examining the performance of the official IT group.

    Effective managers can spot plausible blame and convert it to positive energy, focusing on the desired behaviors and results in the organization.

  • Great article, useful in both social and IT contexts.

    Plausible blame is a great tool for power grabbing by a minority that de-legitimizes and eventually disenfranchises the target group through false assumptions and examples that feel true but are based on emotion rather than empirical evidence.

    This is toxic regardless of the context.

    It is an analysis I had not seen before. Thank you.

  • Just read an article about moving data to the cloud and one of the commenters said expect all your data to end up in the hands of hackers.

    I thought that was a totally bogus callout. IMO, if you have data anywhere close to an internet connection, expect hackers to get it. The cloud vs. a server room doesn’t matter as far as security is concerned. (Plausible blame: say a true thing, but limit the subject area you talk about.)

    In that particular case, it seems to me that there is a logical, or at least discoverable, point at which any company whose IT department is below a certain size should move to the cloud and take advantage of the minimal security provided by the low-level helpdesk workers and powerfully-speaking salesfolk. These are small businesses like local bakeries and stores.

    Moving on up, if you have real security needs, the cloud provider can probably do it better than your group (unless you have at least 1,000 IT employees) BUT IT WON’T HAPPEN AUTOMATICALLY. Whoever runs the contract needs to demand the security, and I suspect that’s where a lot of the true “danger” of the cloud lies: the disconnect between IT and business.

    That disconnect gets worse when you move to contracting. But it’s easier to blame the cloud than your own management, and a hell of a lot better for job security.

    • Seems to me that one argument in favor of keeping my personal data on my laptop instead of with a cloud provider is that the cloud provider is a bigger and more visible target.

      Counterbalancing that argument is that, as you say, the cloud provider knows a lot more about information security than I do and so will have stronger countermeasures in place.

      Beats me where the balance point is.
