Speaking of thinking about thinking, it’s time to talk about systems thinking – a topic spotlighted in Peter Senge’s groundbreaking The Fifth Discipline (1990).

Systems thinking is about breaking any whole into its component parts, sub-parts, and sub-sub-parts ad infinitum – and, just as important, into the interrelationships among those parts. It’s a big subject – so big there’s no point trying to provide links to useful sources.

Most likely, you’re already familiar with many of systems-thinking’s tools and techniques. They’re essential for just about any IT-related endeavor.

But there’s a difference between, to take one example among many, using a swim-lane diagram to design a business process and using one to think through who should be responsible for what – that is, to decide whether the design is really such a good idea.

What follows are a few tools, techniques, and principles I’ve found useful over the years to help me think things through.

Law of 7 ± 2: This really should be the law of 7, + 2 if absolutely necessary. What it means is that whatever technique or tool you’re using to figure out a system, no representation should ever have more than seven elements in it – nine if you just have to. If you need more, make one or more of the elements more inclusive – a category of elements. Think through its complexity in a separate, cross-referenced diagram that also follows the law of 7, + 2 if absolutely necessary.

Why seven? Humans can grasp a collection of seven or fewer items at a glance; with more than that we have to inspect the items one at a time. That’s why we can verify the correctness of a diagram that obeys the law. With more elements we’re left hoping we didn’t miss anything.

Black box: If you’re trying to understand a business function – a process or practice – black-box thinking is essential. Black box descriptions describe what something does, not how it does it. They cover five subjects (thereby obeying the Law of 7, + 2 if absolutely necessary):

  • Outputs – what the function is for. They’re its products and byproducts.
  • Inputs – the raw materials the function converts to outputs. Any input not needed to produce at least one output isn’t an input at all. It’s a distraction, unless it suggests a missing output.
  • Resources – the tools the function uses to turn inputs into outputs.
  • Constraints – restrictions on what the process does or how it does it.
  • Controls – mechanisms for adjusting process operation, such as turning a process on or off, changing its speed, or setting a configurable output characteristic.
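To make the five subjects concrete, here’s a minimal sketch of a black-box description as a data structure. The “Invoicing” example and everything in it are hypothetical, invented purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class BlackBox:
    name: str
    outputs: list[str]      # what the function is for
    inputs: list[str]       # raw materials converted into outputs
    resources: list[str]    # tools used to do the converting
    constraints: list[str]  # restrictions on what it does or how
    controls: list[str]     # knobs for adjusting its operation

    def stray_inputs(self, needed: set[str]) -> list[str]:
        """Inputs not needed by any output are distractions."""
        return [i for i in self.inputs if i not in needed]

invoicing = BlackBox(
    name="Invoicing",
    outputs=["invoice"],
    inputs=["order", "price list", "legacy fax log"],
    resources=["billing system"],
    constraints=["invoice within 5 business days"],
    controls=["on/off", "billing cycle length"],
)

# "legacy fax log" feeds no output, so it's flagged as a distraction.
print(invoicing.stray_inputs(needed={"order", "price list"}))
```

Writing the description down this way makes the input rule mechanical: any input that can’t be traced to an output shows up on the stray list.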

Relationships: Six ways each system element can influence other system elements:

  • Component – an element could be part of another element.
  • Container – conversely, other elements could be part of the element.
  • Output / Input – one element’s outputs could serve as another element’s inputs, and vice versa.
  • Resource – one element could produce resources used by other elements.
  • Constraint – one element could place constraints on other elements.
  • Controls – one element could affect the operation or characteristics of other elements.
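One way to keep track of these six relationship types is to record each influence as a typed edge between elements. A sketch, with hypothetical element names:

```python
from enum import Enum

class Relationship(Enum):
    COMPONENT = "component"        # part of another element
    CONTAINER = "container"        # other elements are part of it
    OUTPUT_INPUT = "output/input"  # its outputs feed another's inputs
    RESOURCE = "resource"          # it produces resources others use
    CONSTRAINT = "constraint"      # it restricts other elements
    CONTROL = "control"            # it adjusts other elements' operation

# Each tuple reads: source element -> relationship -> target element.
edges = [
    ("Order entry", Relationship.OUTPUT_INPUT, "Invoicing"),
    ("IT", Relationship.RESOURCE, "Invoicing"),
    ("Legal", Relationship.CONSTRAINT, "Invoicing"),
]

# Everything that influences Invoicing, and how:
influences = [(src, rel.value) for src, rel, dst in edges if dst == "Invoicing"]
print(influences)
```

A list like this is also an easy place to apply the law of 7 ± 2: if one element accumulates too many edges, it’s a candidate for splitting into a cross-referenced sub-diagram.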

Feedback: A special type of relationship that describes how a process’s output encourages (positive feedback) or discourages (negative feedback) its own operation or characteristics. Diagrammatically, black-box diagrams represent feedback as arrows connecting process outputs to their controls.

As a matter of nomenclature, “positive” and “negative” are unfortunate word choices. They suggest feedback is either complimentary or critical.

From a systems perspective, positive feedback drives a process to produce more of whatever it produces; negative feedback causes it to produce less. Positive feedback accelerates a process; negative feedback keeps it in control.
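The difference is easy to see in a toy simulation – not a model of any real process, just arithmetic showing positive feedback compounding a process’s output while negative feedback settles it toward a set point:

```python
def run(steps: int, output: float, gain: float) -> float:
    """Positive feedback: each step adds output proportional to output."""
    for _ in range(steps):
        output += gain * output
    return output

def regulate(steps: int, output: float, setpoint: float, damping: float) -> float:
    """Negative feedback: each step corrects output toward the set point."""
    for _ in range(steps):
        output -= damping * (output - setpoint)
    return output

print(run(5, output=10.0, gain=0.2))                        # grows, ~24.9
print(regulate(5, output=10.0, setpoint=4.0, damping=0.5))  # settles toward 4
```

Neither behavior is “good” or “bad” in itself – which is exactly why “positive” and “negative” are such unfortunate word choices.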

Output / Input Loops: Not all loops are feedback loops. Some make one or more of a function’s outputs one of its own inputs. The well-known OODA loop (for Observe / Orient / Decide / Act) is an example. Some outputs of its Act sub-function are inputs to its Observe sub-function, and in fact each sub-function’s outputs include inputs to the next sub-function.
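A sketch of the idea, with placeholder stage logic: each sub-function’s output is the next one’s input, and Act’s output loops back around as Observe’s input:

```python
# Hypothetical stage behaviors, standing in for real Observe/Orient/
# Decide/Act logic. The structure, not the content, is the point.
def observe(world): return {"seen": world}
def orient(observation): return {"assessment": observation["seen"]}
def decide(orientation): return {"plan": orientation["assessment"]}
def act(decision): return decision["plan"] + 1  # acting changes the world

world = 0
for _ in range(3):  # three trips around the loop
    world = act(decide(orient(observe(world))))

print(world)  # each pass feeds Act's output back into Observe
```

Note there’s no feedback here in the systems sense – no output is adjusting a control – which is why output/input loops deserve their own category.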

Decoupling: Eliminating or reducing the strength of system interrelationships is, as a general rule, a good idea. It makes the system more resilient by reducing the ripple effects of any changes being considered.
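One common decoupling move is to replace a direct call between two functions with a publish/subscribe seam, so neither side depends on the other’s internals. A sketch, with hypothetical names throughout:

```python
class Bus:
    """A minimal in-process publish/subscribe event bus."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, payload):
        for handler in self.subscribers.get(topic, []):
            handler(payload)

bus = Bus()
shipped = []
bus.subscribe("order.paid", lambda order: shipped.append(order))

# Billing publishes the event; it neither knows nor cares who listens.
bus.publish("order.paid", "order-42")
print(shipped)
```

Either side can now change, or be replaced, without rippling into the other – which is the resilience decoupling buys.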

Bob’s last word: We’ve barely scratched the surface of systems thinking. It’s a highly consequential subject because without it, it’s all too easy to fall into the trap of focusing on a particular element, optimizing it without taking the change’s ripple effects into account.

Bob’s sales pitch: The next “CIO Survival Guide” is ready on CIO.com for your viewing pleasure. It’s titled “The Hard Truth of IT Metrics,” and I think you’ll find it useful. And as always, if you have any feedback for me (see above), please share your thoughts about it.

I knew a guy who based all of his decisions on colorful anecdotes he’d amassed over a lifetime of varied experiences. He succeeded at everything he tried. Let me tell you about him.

Let’s pretend I actually did know a person like this, and that I had enough imagination, creativity, and recursion to turn their life into an anecdote about how relying on anecdotes works really, really well. Would you find my conclusion convincing?

Of course not. Turning the famous quote around, the KJR community recognizes that anecdote isn’t the singular of data.

But unconsciously turning a vivid anecdote into a trend or truth is an easy cognitive trap to fall into, even for the wary.

We’re still thinking about thinking – a big subject. Interestingly enough, my haphazard (as opposed to random) research found an order of magnitude more sources cataloging forms of fallacious thinking than sources providing tools for thinking well.

We’ve been exploring some of these over the past few weeks. This week: what I call “anti-anecdotal thinking” but should probably call “anti-anti-anecdotal thinking.”

Start with what anecdotes aren’t: Evidence that some idea or other is valid.

Bigotry relies on anecdotes-as-evidence. The bigot finds something heinous that happened and identifies as perpetrator a member of a group the bigot doesn’t like. The bigot relates the anecdote as proof all members of the group are horrible sub-human beings and we need to do something about them.

Extrapolate from an anecdote and you’re performing statistics on a sample size of one. It’s worthless.

But that doesn’t mean anecdotes are worthless.

Anecdotes are akin to analogies. Using either one to persuade violates the rules of logic. But they’re excellent tools for illustrating and clarifying your meaning.

Anecdotes serve another useful purpose as well: While generalizing from an anecdote is bad statistics, using an anecdote to demonstrate that the seemingly impossible is, in fact, achievable can make all kinds of sense, as explained in “Look to the Outliers” (Sujata Gupta, Science News, 2/26/2022):

Northern Somalia’s economy relies heavily on livestock. About 80 percent of the region’s annual exports are meat, milk and wool from sheep and other animals. Yet years of drought have depleted the region’s grazing lands. By zeroing in on a few villages that have defied the odds and maintained healthy rangelands, an international team of researchers is asking if those rare successes might hold the secret to restoring rangelands elsewhere.

The article adds: Statistically speaking, success stories like those Somali villages with sustainable grazing are the outliers, says Basma Albanna, a development researcher at the University of Manchester in England. “The business as usual is that when you have outliers in data, you take them out.”

Investigating outliers can offer new and valuable insights.

Anecdotes don’t necessarily describe outliers. But just as “Man bites dog” is news while “Dog bites man” isn’t, there’s rarely much point to relating an anecdote that describes the ordinary.

Combining anti-anecdotal and anti-anti-anecdotal thinking into a single merged thought process is a useful way to explore a subject:

Anecdote: The media would have you believe ransomware is a huge problem. But I talked to a CIO whose company was hit. He told me they just restored everything from backup and were up and running in a day.

Anecdotal thinking: Once again we’re being lied to by the lamestream media! Ransomware is the new Y2K – a bogus non-crisis pushed by IT to inflate its budget.

Anti-anecdote response: Anyone can relate an anecdote. That doesn’t mean it really happened. Even if it did, that doesn’t mean restoring from backups is all any company has to do to avoid being damaged by an attack. We’ll stick with our best-practices program.

Anti-anti-anecdote response: Most likely this is just an anecdote. But it would be worth finding out if an IT shop truly has figured out a simple way to recover from a ransomware attack, and if so, if their situation is typical enough that other companies can benefit from their experience.

Bob’s last word: This week’s punchline is simple. If someone uses an anecdote to try to convince you of something, skepticism should rule the day. But if they use one to try to convince you something is possible, don’t reject it out of hand. It’s as Michael Shermer, publisher of Skeptic magazine, advised: “The rub … is finding that balance between being open-minded enough to accept radical new ideas but not so open-minded that your brains fall out.”

Bob’s sales pitch: My formula for deciding what to write about each week includes, seasonally enough, four questions: (1) Do readers care about the subject? (2) Do I know anything about it? (3) Do I have anything original to say about it? And, (4) have I written about it recently?

I have 2, 3, and 4 covered. But it sure would help if you’d write to suggest subjects you’d like me to cover.