Target Corporation recently renounced casual attire as an acceptable form of dress for its employees. While the policy change received wide attention, the company’s logic for doing so has not (the company did not comment on the subject to the press). Neither has the irony: The only clothing you can buy at Target is casual. Target recently sold Marshall Field’s, where its employees could buy formal clothing, to the May Department Stores Company.

Maybe it was to avoid a conflict of interest.

My personal opinion, supported by not much more than my own judgment and experience, is that when business executives have enough time on their hands to worry about whether some programmer is wearing a tie, or an accounting clerk’s shoes are open-toed, the company is badly overstaffed … in the executive ranks.

I’m not entirely alone, either. Another Minnesota-based company that’s enjoyed at least a modicum of success has taken the opposite tack: General Mills spokeswoman Mary Beth Thorsgaard was quoted as saying, “We tell people to dress for their day. Any questions about what is accepted should be directed to a manager, but people are getting the concept.”

It’s also worth noting that Microsoft, which some folks think has performed fairly well, does not require its employees to wear suits.

The Business Research Lab cites several benefits of moving to business casual, all of them messages from management to staff:

  • Flexibility on the part of management.
  • A willingness to do things the “new way.”
  • Management does not seek to “control” employees.
  • There is a system of promotion in place that does not favor those who have had the good fortune to be born in the more affluent classes.

It appears The Business Research Lab advocates a casual approach to parallel construction as well as business attire, but the points it makes are nonetheless well-taken.

The last is particularly noteworthy. Anyone can dress neatly, but formal business attire reinforces a caste system with two sources. First, those raised in affluence had good clothing in their childhood wardrobes and learned early to dress well. And second, the well-to-do, among them a company’s executives, can afford higher-quality suits.

It’s commonplace for business executives to complain that employees “naturally” resist change, ignoring the extent to which they resist change themselves. Resistance to casual attire in the workplace is a notable example. It’s hard to avoid wondering if one reason so many executives find this change disconcerting is that it symbolically reduces their elevated status.

The executive preference for necktie-wearing is surprising, too. More business executives are politically conservative than liberal. The French opposed the war in Iraq. Ever since, political conservatives have derided all things French, and the necktie is a French invention, so you’d think necktie-burning would have become a conservative cause célèbre (if you’ll forgive my French).

Instead, many and perhaps most business executives have been grudging in their acceptance of this trend. Many, for example, discuss casual, sloppy, and revealing attire as if they were the same — possibly a symptom of simple confusion, but just as likely a rhetorical ploy intended to discredit the first by linking it with the second and third.

Certainly, revealing attire is distracting and unprofessional and has no place in most work environments (those where it is appropriate are outside the scope of this column). And while sloppy attire is generally unacceptable, there are exceptions. If you don’t think so, you haven’t come in on a weekend to help pull cable and move equipment.

Business casual is neither sloppy nor revealing. It’s just casual. Does it confer an advantage, a disadvantage, or some of each? The only hard evidence I’ve been able to find is that the longest economic expansion in history occurred as U.S. businesses relaxed their dress codes. Does that mean the move to business casual should get the credit? Of course not, but if casual dress reduces productivity, business productivity shouldn’t have increased, should it?

There have been studies. The evidence on either side is, however, shaky, as both sides of the debate rely, so far as I can tell, on nothing more than survey data. The result is neither more nor less reliable than the average bias of those surveyed.

In the absence of reliable evidence, my best advice is this: Be a leader, not a fashion consultant. Your job is to focus employees on what they’re supposed to accomplish, not on how they’re supposed to dress.

eXtreme programming is a shame.

Understand, there’s a lot to be said for it. I have nothing against it. Smart people have extolled its benefits in print and in person. It undoubtedly works very well.

But it’s still a shame, because it was carefully packaged to scare the living daylights out of a typical CIO.

When you think of eXtreme programming, what comes to mind first? Pair programming, almost certainly. And a CIO’s first thought is, “Two programmers at one keyboard? There’s no way on earth I can afford to literally cut programmer productivity in half. What’s next on the agenda?”

Or, the CIO will hear the word “extreme” and immediately tune out everything else, because extreme means risk and risk means waiting until other companies make it mainstream.

But doubling up programmers is, while interesting, a nit. Here’s why eXtreme programming, or some other “adaptive methodology,” should be an easy sell:

If you ask business executives what IT does worst, the most common answer is probably project completion. Ask them what IT does best, and you hear about application maintenance and small enhancements — responsibilities most IT organizations address with great competence.

What adaptive methodologies have done is to turn big-bang application development into development by continuous enhancement. They start by building something small that works and adding to it until there’s something big that works. They play, that is, to IT’s greatest strength. That should make sense to even the most curmudgeonly of CIOs.

As with everything else on this planet, the great strength of adaptive methodologies is the cause of their biggest weaknesses, ones they also share with old-fashioned application enhancement.

The first is the risk of accidental architecture. To address this issue, adaptive methodologies rely heavily on “refactoring,” which sounds an awful lot like changing the plumbing after you’ve finished the building.
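If you want to see what refactoring amounts to, here’s a toy sketch (the scenario and the names are mine, not taken from any XP text): two iterations each grew their own copy of the same discount rule, and refactoring means consolidating them after the fact, without changing anything the business can see.

    # Before refactoring: two iterations each grew their own copy of the
    # same volume-discount rule.
    def invoice_total(items):
        subtotal = sum(item["price"] * item["qty"] for item in items)
        return subtotal * 0.9 if subtotal > 1000 else subtotal

    def quote_total(items):
        subtotal = sum(item["price"] * item["qty"] for item in items)
        return subtotal * 0.9 if subtotal > 1000 else subtotal

    # After refactoring: the duplicated rule lives in one place, and the
    # two callers change without changing any visible behavior.
    def apply_volume_discount(subtotal, threshold=1000, rate=0.10):
        return subtotal * (1 - rate) if subtotal > threshold else subtotal

    def invoice_total(items):
        return apply_volume_discount(sum(item["price"] * item["qty"] for item in items))

    def quote_total(items):
        return apply_volume_discount(sum(item["price"] * item["qty"] for item in items))

Harmless at this scale. Multiply it by a few dozen iterations, though, and you have the accidental architecture the plumbing metaphor is getting at.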

By beginning with a “functional design” effort that publishes an architectural view of the business as well as the overall technology plan, you can reduce the need for refactoring. It’s also important to make sure the development effort starts with the components that constitute a logical architectural hub, as opposed to (for example) taping a list of the functional modules to a wall and throwing a dart at it.

The second risk is colliding requirements. With ongoing enhancements to more stable applications, there’s a risk that this month’s enhancement is logically inconsistent with a different enhancement put into production three years ago. With adaptive methodologies, the time frame is closer to three weeks ago, but the same potential exists: To a certain extent they replace up-front requirements and specifications with features-as-they-occur-to-someone. It’s efficient, but not a sure route to consistency.
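Here’s a hypothetical illustration of a collision, with an invented business rule: one iteration was told that discounts apply before tax; a later iteration, built against a different request, quietly assumed the opposite. Each change is correct against its own requirement; together they can’t both be right.

    # Iteration 12: the request said discounts apply before tax.
    def total_due_v12(subtotal, discount, tax_rate):
        return (subtotal - discount) * (1 + tax_rate)

    # Iteration 19: a different request assumed tax is charged on the full
    # subtotal, with the discount taken afterward. Each version satisfies
    # its own requirement; the requirements themselves collide.
    def total_due_v19(subtotal, discount, tax_rate):
        return subtotal * (1 + tax_rate) - discount

    # The two rules disagree whenever both the discount and the tax rate
    # are nonzero:
    assert total_due_v12(100, 10, 0.07) != total_due_v19(100, 10, 0.07)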

How can you deal with colliding requirements? Once again, take a page from how you handle (or should be handling) system enhancements. In most situations, you’re better off bundling enhancements into scheduled releases than putting them into production one at a time. This gives you a fighting chance of spotting colliding requirements. As a fringe benefit, it amortizes the cost of your change control process across a collection of enhancements. (Here’s an off-the-topic tip: If your developers like your change control process, you need to improve your change control process. But I digress.)

The same principle applies to adaptive methodologies. As a very smart application development manager explained it to me, “My goal isn’t to have frequent releases. The business couldn’t handle that anyway. What I want is to have frequent releasable builds.”

Yeah, but who cares? As last week’s column argued so persuasively (how’s that for being humble?), most IT shops purchase and integrate, rarely developing internal applications, and integration methodologies aren’t the same as development methodologies. Are there adaptive integration methodologies?

It’s a good question for which the answer is still emerging. Right now, it’s “kinda.” The starting point is so obvious it’s barely worth printing: Implement big packages one module at a time. If the package isn’t organized into modules, buy a competing package that is.

Which leads to the question of which module to implement first. The wrong answer is to implement the module with the biggest business benefit. The right answer is to start with the application’s architectural hub. That will minimize the need to build ad hoc interfaces.
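As a rough sketch of what “start with the hub” means (the module list below is invented, and counting inbound dependencies is a simplification, not a formal method): the module the most other modules need to talk to goes first, so everything implemented later plugs into something real instead of a throwaway stub.

    from collections import Counter

    # Invented module dependency pairs: (module, module_it_depends_on).
    dependencies = [
        ("orders", "customers"),
        ("billing", "customers"),
        ("billing", "orders"),
        ("shipping", "orders"),
        ("reporting", "orders"),
        ("reporting", "billing"),
    ]

    # Count inbound dependencies; the module the most other modules rely on
    # is the architectural hub. Implement it first and later modules plug
    # into something real instead of an ad hoc stub interface.
    inbound = Counter(needed for _, needed in dependencies)
    hub, count = inbound.most_common(1)[0]
    print(f"Implement first: {hub} ({count} modules depend on it)")
    # -> Implement first: orders (3 modules depend on it)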

Taking these steps doesn’t make your integration methodology adaptive. The chunks are still too big for that.

But it’s a start.