Maybe it’s because I think I should be willing to hire me.
Every so often, someone offers a syllogism along these lines:
Major premise: In professions like the law and medicine, you can’t practice without a formal certification. That’s important because lives are at stake, or if not lives, then important stuff of some kind. So you don’t want just anyone hanging out their shingle to cut open the unwary or defend them in court.
Minor premise: Programmers produce software that, depending on who they work for and what it’s supposed to do, also puts lives or other important stuff on the line.
Conclusion: Programmers should be required to earn a professional certification before they can write software for a living.
If that had been the rule in 1980, I couldn’t have hired me. See, back in the day I taught myself to program in order to analyze a stack of data I’d collected from my electric fish research (no other management consultants have that on their resumes!). It wasn’t big data … there was no such thing back then … but it was big-enough data that I didn’t want to tally it by hand.
Then I discovered the painful truth: I’d never be more than a third-stringer in my chosen field (sociobiology) – a conclusion I reached after realizing how much smarter several second-stringers were than I was. With exactly one marketable skill, I became a professional programmer.
A few years later I was assigned the job of designing a system of some size and significance. (Warning: This retelling makes me look very, very insightful, and downright visionary. It is, I suppose, just barely possible the passing years have added a rosy glow to my memory. Anyway …)
Not knowing how to go about designing a big system, I did what any sociobiologist would have done in similar circumstances: I faked it. I had a lot of conversations with a lot of people about how they went about the work they did, and how the computer might help them do it better.
I mocked up what the screens might look like and fiddled with the designs until everyone agreed they could do their work a lot better with screens like these to support them.
That’s when my ignorance became public knowledge, the result of the head of App Dev asking me two questions I couldn’t answer: (1) what reports was the system supposed to generate, and (2) was the design I’d just published from before or after I’d negotiated the feature set down from the users’ wish list?
I didn’t know how to answer the first question because while the head of App Dev figured the point of every information system was to print reports that provide useful information, I’d been designing a system to make work happen more effectively.
I couldn’t answer question #2 for the same reason. The various users and I had designed a system to manage and support workflows. Taking out features would have broken work streams. Negotiate features? I couldn’t even figure out what that conversation might be like.
The system ended up being the biggest successful development effort the IT department in question deployed in more than a decade, and not because it was the only attempt.
The point of this little tale isn’t to show you how smart I am, although if you draw that conclusion I certainly won’t object.
The point is that back when I started programming professionally, certification would have meant learning the accepted wisdom regarding how to design information systems … accepted wisdom back then that we’ve since learned just doesn’t work.
Certifications in the 1980s would have prevented anyone from co-designing software and business processes. Certifications in the 1990s would have prevented the entire World Wide Web, and around 2000 they’d have prevented Agile.
Mandatory certifications are supposed to prevent bad things from happening. In IT, I’m pretty sure that while they probably would prevent some bad things from happening, they’d prevent a lot of positive developments at the same time.
See, while the law and medicine certainly see new developments all the time that require not just a static certification but lifelong learning as well, both fields operate from a stable core set of practices that really does represent the minimum standard of basic professionalism.
We don’t have that in IT, at least not yet, at least not in a way that’s more than just a matter of personal preference. Even something as seemingly stable as normalized data design is proving, with the advent of NoSQL technologies, to be less broadly applicable than we thought even five short years ago.
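To make the point concrete, here’s a minimal illustrative sketch (not from the original column; the customer and order names are made up) of the same data modeled the normalized way and the document-style way many NoSQL stores favor:

```python
# Normalized, relational-style design: each fact lives in one place,
# and records reference each other by key.
customers = {
    1: {"name": "Acme Corp"},
}
orders = {
    101: {"customer_id": 1, "total": 250.00},
    102: {"customer_id": 1, "total": 75.50},
}

# Denormalized, document-style design: the orders ride along inside the
# customer document, trading join-free reads for redundancy and the risk
# of update anomalies.
customer_doc = {
    "_id": 1,
    "name": "Acme Corp",
    "orders": [
        {"order_id": 101, "total": 250.00},
        {"order_id": 102, "total": 75.50},
    ],
}

# "All orders for customer 1" looks different in each model.
normalized_view = [o for o in orders.values() if o["customer_id"] == 1]
document_view = customer_doc["orders"]
print(len(normalized_view), len(document_view))  # 2 2
```

Neither model is wrong; which one is “basic professionalism” depends entirely on the workload, which is exactly the kind of judgment a static certification can’t capture.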
So required certifications? In IT, at least, they’d be more likely to prevent progress than anything else.