The world’s first website was launched on August 6, 1991. By rights, someone should have programmed a bunch of Twitter ‘bots to sing happy birthday to the World Wide Web. (And thanks to my friend Mike Benz for pointing out this historical marker to me.)

# # #

Speaking of ‘bots, while up-to-date statistics are hard to find, and the sensational nature of the subject matter invites exaggeration, there clearly are a lot of social media ‘bots out there – and in particular a lot that spread misinformation, disinformation, fake news, baloney, and other forms of utterly nonsensical but dangerous propaganda.

Back when Mutual Assured Destruction was the backbone of U.S. nuclear military strategy, it was widely understood that disarmament was desirable but unilateral disarmament would have been destabilizing.

Which leads me to wonder why those who want to spread reliable, curated content don’t deploy counterpropaganda ‘bots.

Most of what we read about countering ‘bot-driven disinformation campaigns is defensive – how to recognize the dangerous little critters. I wonder what a ‘bot arms race might look like.

# # #

Speaking of the Internet and disinformation, no, Al Gore never claimed to have invented the Internet. Al Gore also never claimed to have invented technology for countering disinformation, which is just as well given how utterly inept he was at it. As proof of his ineptitude, most Americans still seem to believe that he did claim to have invented the Internet.

# # #

Continuing to speak of the Internet and disinformation, SpotFakeNews.info has published a handy guide to recognizing disinformation. Its step-by-step guide is as follows (follow the link for details): (1) develop a critical mindset; (2) check the source; (3) check who else is reporting the story; (4) think about the evidence; (5) don’t accept images at face value; (6) listen to your gut.

The full text behind #6 tells you to pause and ask if what you’re reading is designed to play on your hopes and fears. It tells you, that is, to do the exact opposite of listening to your gut. Go figure.

# # #

Meanwhile, as we are, after all, celebrating the birth of the World Wide Web, a quick timeline: In the beginning (of the Web, not the Internet itself) was SGML – the Standard Generalized Markup Language. It was a syntax for defining tags that could be used to identify parts of documents. Everyone who came into contact with it knew it was important. The main barrier to its adoption was that nobody could figure out anything useful for it to do.

Then CERN’s Tim Berners-Lee, wanting to make Ted Nelson’s idea of hypertext real, figured out that a simplified version of SGML could be just the ticket. He called the result the HyperText Markup Language – HTML.

To make HTML useful, Berners-Lee then created WorldWideWeb (later Nexus) – the first web browser.

Shortly thereafter, in 1993, NCSA’s Marc Andreessen and Eric Bina wrote Mosaic, the first web browser anyone ever heard of.

Somewhere in there, Al Gore sponsored legislation privatizing Internet governance and encouraging the transformation of the Internet’s underlying connectivity from a fragile spiderweb of low-speed channels to a robust backbone-based architecture.

Imagine what the world would be like, right now at this moment as you read these words, had none of this history happened.

# # #

Bob’s last word: In the absence of a TIP (Trusted Information Provider) program we do need tools of some kind to help us differentiate honest information sources from those whose purpose is to deceive.

One tool every information source can deploy to help its consumers judge its reliability is to reveal the processes and practices it employs to gather, process, and publish information. The Washington Post provides a laudable example. You’ll find it here: Policies and Standards.

I haven’t yet prepared one for KJR, but will get started on the project shortly.

Bob’s sales pitch: Speaking once again of Internet-driven disinformation, in 1997 I proposed creation of a TIP (Trusted Information Provider) certification program. Later in 1997, and on through the present, this proposal was almost universally ignored.

But on the other hand, in 2010 the Harvard Business Review published its “10 Must Reads.” Amusingly enough, not one of the articles HBR considered must-reads made any mention of information technology or the Internet.

Nice to know they’ve been keeping up with the times, even if they aren’t keeping up with yours truly.

If you’re among those affected by the COVID-19 pandemic, read Michael Lewis’s (no relation) The Premonition: A Pandemic Story. If you aren’t, you’re probably living in New Zealand and might not find it as interesting.

I know I can’t provide a summary that does the book justice. Heck, I’m not sure I can even explain what it’s about. What I know is that after having read the book I know more and understand less about the pandemic – not because Lewis does a poor job of things, but because he does such a good job of it.

And anyway, as one of his reviewers commented (I’m paraphrasing), “If Michael Lewis published an 806-page book on the history of the toaster, I’d read it.”

* * *

Members of the KJR community know I generally hold up scientific inquiry as the gold standard for understanding how something works. But scientific inquiry has its limitations. Lewis provides an example in the CDC’s early response to the emerging pandemic.

The CDC, to its credit, bases its recommendations on science. But that limits its ability to carry out its mission: In the early stages of a pandemic, leaders have to make policy before there’s enough science to provide reliable guidance, just as military leaders sometimes have to plan for combat without good intelligence to guide them.

Lewis quotes Charity Dean, one of the book’s protagonists, who suggested the CDC is, as a result, mis-named: It should be called the Centers for Disease Observation and Reporting. By the time it had enough science to provide useful policy guidance the disease was already spreading according to the mathematics of compound interest.
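The “compound interest” arithmetic of an unchecked epidemic can be sketched in a few lines of Python. The starting case count and doubling time below are illustrative assumptions, not figures from the book:

```python
def cases_after(initial_cases: int, doubling_time_days: float, days: float) -> int:
    """Project case counts assuming steady exponential growth --
    the same arithmetic as compound interest."""
    return round(initial_cases * 2 ** (days / doubling_time_days))

# Hypothetical outbreak: 100 known cases, doubling every 5 days.
for day in (0, 10, 30, 60):
    print(day, cases_after(100, 5, day))
```

By day 60 the count has multiplied 4,096-fold, which is why waiting until there’s enough science to be sure can mean acting far too late.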

* * *

Science is the gold standard for understanding how things work. That doesn’t make scientists the gold standard among human beings for objectivity and insight. The reason we do (or at least should) trust science is that it’s a self-correcting process designed to compensate for the all-too-human scientists who practice it.

Example (and thanks to “Robert B” for bringing it to our attention in the Comments last week): According to peer-reviewed research by John P.A. Ioannidis, the fatality rate among those infected by the virus – the infection fatality rate – is 0.23%. This is quite a lot lower than the reported U.S. fatality rate, which is 611,000 fatalities out of 34,600,000 reported cases – 1.8%.

So far as I can tell, the discrepancy arises from the two rates’ denominators. Ioannidis based his 0.23% statistic on a worldwide “study of studies” methodology. His denominator is the estimated number of people actually infected, inferred from evidence of infection in study subjects’ blood – including infections that were never diagnosed or reported.

That’s in contrast to the 1.8% mortality rate, whose denominator is the number of (presumably) symptomatic cases reported in the U.S.
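The arithmetic behind the two rates can be made explicit. The deaths and reported-case figures below are the ones quoted above; the total-infection estimate is a hypothetical, chosen only to show how a larger denominator yields a lower rate:

```python
deaths = 611_000
reported_cases = 34_600_000        # symptomatic, reported U.S. cases

# Case-style fatality rate: deaths divided by reported cases.
cfr = deaths / reported_cases
print(f"Reported-case rate: {cfr:.1%}")    # -> 1.8%

# Infection fatality rate: deaths divided by estimated total infections,
# including undetected ones. 265,000,000 is a hypothetical figure
# chosen so the result lands near Ioannidis's 0.23%.
estimated_infections = 265_000_000
ifr = deaths / estimated_infections
print(f"Infection rate:     {ifr:.2%}")    # -> 0.23%
```

Same numerator, different denominators: the gap between 1.8% and 0.23% is mostly a question of who gets counted as infected.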

Neither mortality rate is wrong. Both are important pieces of information. Policy makers, and this includes private-sector Chief Risk Officers (CROs), need to understand these subtleties to do their jobs well. They need to factor in the levels of contagion and morbidity alongside rates of fatality.

They also need to recognize when any decision is better than no decision.

* * *

Each of us is our own CRO. We … every one of us … sets “policy” for ourselves in the form of decisions like when to wear masks, when to practice social distancing, and whether to be vaccinated. With less expertise than CROs can build into their organizations, we’re more reliant on whom we choose as our sources.

And that’s a tough call. Even if you ignore the political and media bloviators completely (recommended), the line separating the need for knowledgeable scientists to debunk quacks and propagandists from the temptation to vilify colleagues with whom they disagree is neither sharp nor bright. The case of Professor Ioannidis is, in this respect, instructive (see “The Ioannidis Affair: A Tale of Major Scientific Overreaction,” Shannon Brownlee and Jeanne Lenzer, Scientific American, 11/30/2020).

Whatever else you do, check your own source selection carefully. In the case of COVID-19, confirmation bias can be lethal.

Bob’s last word: Had we as a society treated the creation of the coronavirus vaccines as we did the Salk and Sabin polio vaccines, we would by now have achieved the herd immunity that would let us put this pandemic behind us.

As business leaders, as pointed out in last week’s column, we all have some ability to nudge society in the right direction.

Bob’s sales pitch: I’m on a roll with CIO.com. New this week: “11 dark secrets of application modernization.” Check it out.