What follows is dedicated to all of you who asked me to replace KJR with a political blog. As always, you should be careful what you ask for. – Bob

When I was growing up, antisemitism was a joke, and the experience most of us in my community had with actual antisemites was on a par with our real-world experiences with Big Foot and the Yeti. I did once hear someone use “Jew” as a verb – a stereotype that from time to time I wished was more accurate – and in my college years once overheard an inebriated patron in a local bar complaining about f***ing k*** lawyers.

But everyone in earshot was content to ignore him, and he eventually went away.

I was in high school when, in 1967, in response to Egypt closing the Straits of Tiran to Israeli shipping, Israel launched the so-called “6-Day War.” Most in my high-school community thought of Mideast policy in much the same way that we thought of the Chicago Cubs – a team to root for, even though we couldn’t articulate why. I had no idea what even a single Strait of Tiran was, but it didn’t matter. Israel was my team and I rooted for it.

My views on the subject have, I hope, become a bit more nuanced than that, and, because I’m Jewish, I’ve been asked about them. So here goes:

Where it started: WWII and the concentration camps, in which something like 12 million people were slaughtered, half of them Jews. One reason I stopped thinking in terms of MOTs (Members of the Tribe) was how many of my fellow Jewish MOTs ignored or trivialized the 6 million or so non-Jews also murdered by the Nazis.

Regardless, public awareness of the camps led to a widespread perception that fair-is-fair: Jews deserved a homeland in which they could feel safe.

Which is why, in 1948, the Jewish residents in Palestine declared the founding of Israel as a modern sovereign entity, at which time, with no noticeable delay, the nations surrounding it launched an invasion with a goal of destroying it.

And it’s then that the historical record and assessments of cause and effect become confused. Some historians claim Israel expelled the Palestinians. Others assert that the Palestinians fled because they were urged to do so by the Arab leaders of the time.

What has been lost in the dueling narratives is that no matter the reason Palestinians left Israel, the nations they fled to – especially Lebanon, Syria, Jordan, and Egypt – settled them into refugee camps and radicalized them rather than welcoming them and providing assistance.

Nor did Israel do anything to encourage the refugees to return.

Which is why facile good-guys/bad-guys storytelling is of no value in thinking through what should happen next. Nothing can excuse Hamas’s recent invasion. Read about Hamas and it’s clear it doesn’t represent the Palestinian community. It has more in common with an organized crime syndicate than a political entity.

Read about Israel’s response to the invasion and a cynic might think it’s Netanyahu’s way of maintaining his leadership position, not of creating a just peace.

Read about the war and its contribution to resurgent antisemitism. It has underscored, in no uncertain terms, that just as is true of all other forms of bigotry, all antisemites needed to crawl out of the woodwork was an excuse.

Far from being the jokes I thought antisemites were when I was a youth, they were just as much MOTs as I was, just members of a different tribe.

And most of them understood that, back then, belonging to that tribe was socially unacceptable.

Do I have a solution? Not hardly. I do, however, have a notion, for all the good having a notion ever has. It’s for the entertainment industry to take The Blues Brothers as an exemplar: Create entertaining fare that ridicules bigots of all tribes and stripes.

Not the earnest, preachy fare that’s usually paraded in front of us to “raise our awareness.” Entertainment.

Because raising our consciousness asks us to acknowledge that our consciousness needs raising, and to be willing to expend cognitive effort – work – to raise it.

Entertainment, in contrast, is, by definition, fun.

Maybe fun enough to embarrass the MOTs who are, for some reason, proud of their idiocracy.

Bob’s sales pitch: In a more traditional KJR vein, I’m keynoting OSICON 2035 this coming Wednesday. It’s free. If you happen to be in Toledo you can catch it in person. Otherwise, it, along with the rest of the program, will be streamed.

Check it out!

Seventeen years ago (a prime number whether or not it was one of my prime insights) I defined “old” as spending more time disapproving of how someone else is living their life than enjoying how you’re living your own.

It complements another common definition: You’re old if you think everything about the past was superior to everything about the present.

This isn’t a definition I can cheerfully embrace, because it would mean acknowledging that the older KJR entries in my archives are better than the words you’re reading right this moment. I’ll leave you to make that comparison.

There is something that is demonstrably superior today to what it was when I was a youth: software.

In the 1960s, the “Software Crisis” was a thing. The software industry was wrestling with budget overruns, inefficiency, poor quality, unmanageable projects, and missed deliveries, to name a few of the thornier challenges (if you’re interested, check out “Software Engineering | Software Crisis” for more).

For many, the software crisis reached its apotheosis with NASA’s legendary missing hyphen: its absence from the software controlling the mission forced ground control to abort the 1962 Mariner I launch – a whopping mistake, costing $80 million in 1962 dollars.

In the 1970s and ‘80s, buffer overflow errors were well-known vectors for malware and hacking exploits. Many of us in the industry wondered why operating systems couldn’t be built to stymie them.

Well, truth be told, many operating systems are still vulnerable, but actual buffer overflow exploits are far less common than they used to be.

Supposedly more-secure mainframe systems weren’t immune to design flaws, either. For example, it was common for remote users to inherit CICS access from users who had disconnected from the underlying VTAM communications link but hadn’t logged out from the application. Voila! Instant system access without even having to guess the password.

Many in the KJR community won’t even recall that bad application code could once crash far more than the module it belonged to.

Sure, we still need to regression test, especially for platform glitches that might crash PROD. So far as application changes are concerned, yes, they can cause data problems – no small thing, but smaller than spreading joy randomly across the PROD environment.

IT has undergone a quiet revolution over the past couple of decades, because where “bug” used to mean “code that causes crashes,” it now means code that misunderstands how the business is supposed to run.

Or, just as accurately, it means code that reflects poorly-thought-out business processes and logic – “business bugs” if you will.

It’s an intriguing dichotomy. Where once upon a time the focus of IT software quality efforts was to isolate the technical damage done by bad code, now it’s to isolate the business impact of code that solves the wrong problems.

This, for IT, is what progress looks like.

Who or what deserves the credit for this progress? Mostly, it’s due to the high technical quality of enterprise application suites. Sure, when it comes to ERP, CRM, and Supply Chain Management systems there’s still plenty to gripe about. Having said that, a CIO from twenty years ago would never have believed the functionality available today for lease.

Also on the who-deserves-credit list are internal application developers and the development standards they bring to the party every day. Because when it comes to vulnerabilities, knowing what creates them and how to avoid doing so has stopped being a retrofit and is now standard practice.

Can you think of anyone or anything else that belongs on the list? Why not visit the Comments and share your thoughts about this.

Bob’s last word: This was a hard column to write, largely because of how much easier it is to be snide and sarcastic. I figured I’d try my hand at something complimentary and see what came of it. What do you think?

Bob’s sales pitch: Want to avoid “business bugs”? Dave Kaiser and I wrote There’s No Such Thing as an IT Project to give you a hand with this.

On CIO.com’s CIO Survival Guide: “Why all IT talent should be irreplaceable.”

What it’s about: The conventional wisdom – that you should fire irreplaceable employees – is backward. Because if your employees aren’t irreplaceable, you’re doing something wrong.