Maybe it’s because I think I should be willing to hire me.

Every so often, someone offers a syllogism along these lines:

Major premise: In trades like the law and medicine, you can’t practice without a formal certification. That’s important because lives are at stake, or if not lives, then important stuff of some kind. So you don’t want just anyone hanging out their shingle to cut open the unwary or defend them in court.

Minor premise: Programmers produce software that, depending on who they work for and what it’s supposed to do, also puts lives or other important stuff on the line.

Conclusion: Programmers should be required to earn a professional certification before they can write software for a living.

Had certification been required back in 1980, I couldn’t have hired me. See, back in the day I taught myself to program in order to analyze a stack of data I’d collected from my electric fish research (no other management consultants have that on their resumes!). It wasn’t big data … there wasn’t such a thing back then … but it was big-enough data that I didn’t want to tally it by hand.

Then I discovered the painful truth: I’d never be more than a third-stringer in my chosen field (sociobiology) – a conclusion I reached after realizing how much smarter several second-stringers were than I was. With exactly one marketable skill, I became a professional programmer.

A few years later I was assigned the job of designing a system of some size and significance. (Warning: This retelling makes me look very, very insightful, and downright visionary. It is, I suppose, just barely possible the passing years have added a rosy glow to my memory. Anyway …)

Not knowing how to go about designing a big system, I did what any sociobiologist would have done in similar circumstances: I faked it. I had a lot of conversations with a lot of people about how they went about the work they did, and how the computer might help them do it better.

I mocked up what the screens might look like and fiddled with the designs until everyone agreed they could do their work a lot better with screens like these to support them.

That’s when my ignorance became public knowledge, the result of the head of App Dev asking me two questions I couldn’t answer: (1) what reports was the system supposed to generate, and (2) was the design I’d just published from before or after I’d negotiated the feature set down from the users’ wish list?

I didn’t know how to answer the first question because while the head of App Dev figured the point of every information system was to print reports that provide useful information, I’d been designing a system to make work happen more effectively.

I couldn’t answer question #2 for the same reason. The various users and I had designed a system to manage and support workflows. Taking out features would have broken work streams. Negotiate features? I couldn’t even figure out what that conversation might be like.

The system ended up being the biggest successful development effort the IT department in question had deployed in more than a decade, and not because it was the only attempt.

The point of this little tale isn’t to show you how smart I am, although if you draw that conclusion I certainly won’t object.

The point is that back when I started programming professionally, certification would have meant learning the accepted wisdom regarding how to design information systems … accepted wisdom back then that we’ve since learned just doesn’t work.

Certifications in the 1980s would have prevented anyone from co-designing software and business processes. Certifications in the 1990s would have prevented the entire World Wide Web, and around 2000 they’d have prevented Agile.

Mandatory certifications are supposed to prevent bad things from happening. In IT, I’m pretty sure that while they probably would prevent some bad things from happening, they’d prevent a lot of positive developments at the same time.

See, while the law and medicine certainly have new developments all the time, ones that require not just a static certification but lifetime learning as well, both fields operate from a stable core set of practices that really does represent the minimum standard of basic professionalism.

We don’t have that in IT, at least not yet, at least not in a way that’s more than just a matter of personal preference. Even something as seemingly stable as normalized data design is proving, with the advent of NoSQL technologies, to be less broadly applicable than we thought even five short years ago.
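To make that concrete, here’s a minimal sketch built on an invented customer/order example: the same data modeled the normalized way the textbooks taught, and again in the denormalized, document style many NoSQL stores encourage. None of the names come from any particular product; it’s just an illustration of the trade-off.

```python
# Invented customer/order example; no particular database product implied.

# Normalized, relational-style design: each fact lives in exactly one place,
# and records refer to each other by key.
customers = {
    101: {"name": "Ada Lovelace", "city": "London"},
}
orders = [
    {"order_id": 1, "customer_id": 101, "item": "widget", "qty": 3},
    {"order_id": 2, "customer_id": 101, "item": "sprocket", "qty": 1},
]

def order_with_customer(order):
    # Reassembling an order for display requires a join-style lookup.
    return {**order, "customer": customers[order["customer_id"]]}

# Denormalized, document-style design: the customer's details travel with
# each order, so one read returns everything, at the cost of duplication.
order_documents = [
    {"order_id": 1, "item": "widget", "qty": 3,
     "customer": {"name": "Ada Lovelace", "city": "London"}},
    {"order_id": 2, "item": "sprocket", "qty": 1,
     "customer": {"name": "Ada Lovelace", "city": "London"}},
]

if __name__ == "__main__":
    # Normalized: change the city once and every order sees the update.
    customers[101]["city"] = "Cambridge"
    print(order_with_customer(orders[0]))
    # Denormalized: the same change has to be applied to every document.
    print(order_documents[0])
```

Neither model is wrong; which one works depends on the workload, which is exactly why a “stable core set of practices” is such a hard claim to make in IT.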

So required certifications? In IT, at least, they’d be more likely to prevent progress than anything else.

Your taxes are due.

It’s that time of year again — time to reflect that many business executives have the same attitude about IT that the Tea Party has about the federal government: They “know” they spend too much for it and aren’t at all clear what they’re getting for their money.

Which has little to do with this week’s pair of IT critical success factors, but a lot to do with IT CSFs #1 and #2: the business/IT relationship matters more than anything else.

This week’s factors can certainly influence the quality of the business/IT relationship, though, because they’re all about IT’s credibility, which lives and dies on how well it delivers the goods.

“The goods,” of course, are applications that (1) do what the business needs, (2) don’t cost too much, and (3) show up on time.

Above and beyond everything else required to deliver the goods are a company-wide project management culture and a well-practiced systems development lifecycle.

Project management culture

Project management is a pain in the corporate keister. In a construction project the project manager doesn’t draw the blueprints, pour the concrete, or weld the girders. In a software project the project manager doesn’t design the software, write the code, or test the results. Project management is overhead.

It’s annoying, too. Most people, most of the time, put planning right next to dental work, at the top of their lists of things they seriously don’t want to do. But project managers insist on it anyway, and even worse have the bad taste to do it in public: They insist other people review and approve their plans. Is this how their mothers raised them?

We aren’t done. Not only is project management annoying, but project managers are annoying. Most people, most of the time, follow the MT methodology, which stands for “muddling through.” Project managers, though, can’t accept muddling through because they need to coordinate the actions of multiple muddlers.

So on top of everything else, they nag.

Net net: There’s nothing about project management that’s naturally likable. And yet, even the best project managers can’t succeed unless everyone whose activities they have to coordinate lets them. This means that, just as was the case last week when the subject was process, project management can’t succeed unless the whole company, inside and outside IT, has a culture of project management.

Unless, that is, all those annoying things that have to happen for projects to finish on time, within their original budget, with all deliverables intact … and, by the way, resulting in the intended business change happening like it was supposed to … where was I? Oh, yeah, that’s right. Unless everything about project management is How We Do Things Around Here.

Is this important? Yes. Were we to construct a list of general-purpose corporate critical success factors, there’s little doubt that excellence in project management would make the list, and would probably have a high position on it. That’s because projects are how change happens in an organization, and if there’s ever been a tired cliché more true than “the only constant is change,” I don’t know what it is.

Well-practiced SDLC

There is, as you’re thoroughly tired of reading by now, no such thing as an IT project. There are, though, lots and lots of projects that include a need for information technology. IT is rarely if ever sufficient, but it’s usually quite necessary.

Which means IT needs to be very good at designing, developing, testing, and rolling out new applications. Or (or and) it needs to be very good at selecting, configuring, and integrating commercial, off-the-shelf (COTS) solutions.

SDLC stands for “systems development lifecycle.” The phrase is ingrained, so I’m gritting my teeth and using it here, even though most IT organizations implement COTS solutions more often than they develop from scratch.

IT needs to be good at this, and the only way to get good at anything is to practice it.

Yes, there’s lots of controversy over whether SDLCs should be waterfall or Agile. And within the Agile community there’s lots of controversy over whether it should be Kanban, Scrum, eXtreme, or conference room pilot. (Actually, CRP rarely generates controversy, because even though it ought to be the most widely used Agile variant, very few people have even heard of it.)

But in the end, which SDLC you choose matters less than sticking with it and getting good at it. If the point isn’t clear, turn it around: How good can anyone be at anything if they only do each thing once, for the first time?