Maybe it’s because I think I should be willing to hire me.

Every so often, someone offers a syllogism along these lines:

Major premise: In trades like the law and medicine, you can’t practice without a formal certification. That’s important because lives are at stake, or if not lives, then important stuff of some kind. So you don’t want just anyone hanging out their shingle to cut open the unwary or defend them in court.

Minor premise: Programmers produce software that, depending on who they work for and what it’s supposed to do, also puts lives or other important stuff on the line.

Conclusion: Programmers should be required to earn a professional certification before they can write software for a living.

If certification had been required back in 1980, I couldn’t have hired me. See, back in the day I taught myself to program in order to analyze a stack of data I’d collected from my electric fish research (no other management consultants have that on their resumes!). It wasn’t big data … there wasn’t such a thing back then … but it was big-enough data that I didn’t want to tally it by hand.

Then I discovered the painful truth: I’d never be more than a third-stringer in my chosen field (sociobiology) – a conclusion I reached after realizing how much smarter several second-stringers were than I was. With exactly one marketable skill, I became a professional programmer.

A few years later I was assigned the job of designing a system of some size and significance. (Warning: This retelling makes me look very, very insightful, and downright visionary. It is, I suppose, just barely possible the passing years have added a rosy glow to my memory. Anyway …)

Not knowing how to go about designing a big system, I did what any sociobiologist would have done in similar circumstances: I faked it. I had a lot of conversations with a lot of people about how they went about the work they did, and how the computer might help them do it better.

I mocked up what the screens might look like and fiddled with the designs until everyone agreed they could do their work a lot better with screens like these to support them.

That’s when my ignorance became public knowledge, the result of the head of App Dev asking me two questions I couldn’t answer: (1) What reports was the system supposed to generate? and (2) Was the design I’d just published from before or after I’d negotiated the feature set down from the users’ wish list?

I didn’t know how to answer the first question because while the head of App Dev figured the point of every information system was to print reports that provide useful information, I’d been designing a system to make work happen more effectively.

I couldn’t answer question #2 for the same reason. The various users and I had designed a system to manage and support workflows. Taking out features would have broken work streams. Negotiate features? I couldn’t even figure out what that conversation might be like.

The system ended up being the biggest successful development effort the IT department in question had deployed in more than a decade, and not because it was the only attempt.

The point of this little tale isn’t to show you how smart I am, although if you draw that conclusion I certainly won’t object.

The point is that back when I started programming professionally, certification would have meant learning the accepted wisdom regarding how to design information systems … accepted wisdom back then that we’ve since learned just doesn’t work.

Certifications in the 1980s would have prevented anyone from co-designing software and business processes. Certifications in the 1990s would have prevented the entire World Wide Web, and around 2000 they’d have prevented Agile.

Mandatory certifications are supposed to prevent bad things from happening. In IT, I’m pretty sure that while they probably would prevent some bad things from happening, they’d prevent a lot of positive developments at the same time.

See, while the law and medicine certainly have new developments all the time, developments that require not just a static certification but lifetime learning as well, both fields operate from a stable core set of practices that really do represent the minimum standard of basic professionalism.

We don’t have that in IT, at least not yet, at least not in a way that’s more than just a matter of personal preference. Even something as seemingly stable as normalized data design is proving, with the advent of NoSQL technologies, to be less broadly applicable than we thought even five short years ago.
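If you want the flavor of that shift, here’s a minimal sketch, in Python with invented order data (the field names and records are mine, not from any real system). The first shape is what a normalized design calls for: customers and orders kept separate and joined by a key, so nothing is stored twice. The second is the document-style shape many NoSQL stores encourage: everything embedded in one self-contained record, trading duplication for single-lookup reads. Neither is wrong in the abstract; which one is “correct” depends on the workload, which is exactly why it makes a poor certification question.

```python
# Illustrative only: the data and field names are invented for this sketch.

# Normalized shape: customers and orders live in separate "tables",
# related by customer_id, so each fact is stored exactly once.
customers = {
    101: {"name": "Ada Lovelace", "city": "London"},
}
orders = [
    {"order_id": 1, "customer_id": 101, "item": "keyboard", "qty": 2},
    {"order_id": 2, "customer_id": 101, "item": "monitor", "qty": 1},
]

def orders_for(customer_id):
    """Join at query time: look up the customer, then filter their orders."""
    customer = customers[customer_id]
    return {
        "customer": customer["name"],
        "orders": [o for o in orders if o["customer_id"] == customer_id],
    }

# Denormalized, document-style shape: everything about the customer,
# orders included, embedded in one document. Reads are a single lookup;
# the cost is duplicated data and more awkward updates.
customer_doc = {
    "customer_id": 101,
    "name": "Ada Lovelace",
    "city": "London",
    "orders": [
        {"order_id": 1, "item": "keyboard", "qty": 2},
        {"order_id": 2, "item": "monitor", "qty": 1},
    ],
}

print(orders_for(101))           # assembled by joining two structures
print(customer_doc["orders"])    # already assembled, no join needed
```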

So required certifications? In IT, at least, they’d be more likely to prevent progress than anything else.

Capitalism, Winston Churchill might have said but didn’t, is the worst system of economics … except for all the others.

Its virtues are celebrated whenever politicians speak, and it has many of them, not least its ability to balance supply and demand, including demand nobody knew existed until someone invented a gadget to satisfy it.

On the other side of the ledger is its susceptibility to self-reinforcing feedback loops, the kind that inflate bubbles on the way up and deepen economic depressions on the way down.

I just thought I’d share that cheery thought. Bubbles and depressions have nothing to do with this week’s topic.

What does have a lot to do with this week’s topic … certifications and what to do about them … is the corrupting influence capitalism has on so much of what it touches, the desire for wealth being the root and all that.

Take any truly meaningful certification and you can bet those responsible for maintaining its integrity are either insulated from the economic impact of the certification or benefit from keeping the certification restrictive.

Start with something simple: grades. Once upon a time I taught an IT-related topic in a local university’s graduate program. I awarded A’s to those students who excelled, B’s to those who did well, C’s to those who achieved a basic level of understanding, and D’s to those who fell short of even that.

The Dean asked me to change my grading. Why? Most of the program’s students were eligible for tuition reimbursement from their employers, but only if they maintained a B average. The school’s revenue depended on lax standards.

Now, a related development threatens to make earning a college diploma a deeply meaningless achievement. Increasingly, graduation rates are considered an important measure of a school’s worthiness.

KJR hasn’t delved into the seriously dull subject of metrics for some time, so as a reminder, there are four metrics fallacies: measuring the right things wrong; measuring the wrong things (whether you measure them right or wrong); failing to measure something important; and entrusting a measure to those with a personal stake in how it turns out.

This one’s easy. Once colleges and universities are assessed based on graduation rates, they’ll have three obvious courses of action: Make admissions more selective; educate students better; or A’s for everyone!

Which looks easiest and most certain to you? Yup, and easy-and-certain is a good predictor of future behavior.

The target graduation rate for colleges and universities seems to be around 90%. For contrast, the U.S. Air Force Academy graduates only 75% or so, which makes sense once you figure a graduate might be the one you’re relying on to shoot down the enemy plane that’s shooting at you.

Then there’s the Bar examination. Rates vary widely by state, from 41% in Louisiana and California to New Mexico’s 85%. The lawyers who make up state Bar Associations benefit from restriction, as everyone who passes increases competition in an already overpopulated field, and everyone who passes without being fully qualified further discredits a field with a poor reputation.

Compare that to the MCSE. Oh, wait, you can’t, because that little statistic doesn’t seem to be available. My guess: Those who teach Microsoft technology benefit from high pass rates. Microsoft benefits by having a large population of IT professionals certified in its technology.

But both rely on the perception that earning the certification is a difficult achievement (for all I know it is; I’ve never tried to earn one). Publishing pass rates would let everyone know where the test’s administrators draw the line.

Not to pick on Microsoft or the MCSE — it’s mentioned here because it’s well-known, not because it’s better or worse than any other vendor-managed certification.

Quite a few respondents challenged last week’s contention that certifying bodies should ban the use of certifications as a qualification for job applicants. And I have to admit, that’s probably too extreme: There are plenty of fields where the certifications are downright reassuring, from medicine to the law to construction to beautician (okay, I admit, I don’t actually get that one).

So I’ll back off a bit. Instead, before you use a certification to screen applicants for a position, look carefully at the incentives associated with its administration. A shortcut: Whatever else, certifications administered by independent non-profit associations are more likely to be focused on protecting the integrity of a profession than those administered by for-profit vendors.

And if your plan is to earn a certification, including a college diploma, some related advice: Given a choice between studying to pass the exam and studying to gain actual knowledge and ability, go for the knowledge and ability.

Both will help you get certified. But only one will help you in the job once you get in the door.