Thirty years of the U.S. economy at a glance: 1973 — it blows up. 1983 — it finally starts to improve. 1989 — downturn. 1992 — it picks up again, until 2000 when it dot-bombs.

Here’s another view. In the early 1980s, personal computers invaded American business in defiance of centralized IT. In the early 1990s, central IT gained control of the PC again. Sometime around 1993, business discovered the Internet, and for a while it seemed that everyone in the company was coding HTML except central IT. It took a while for IT to catch up, but finally, in the late 1990s, IT slowly gained a measure of control over eCommerce.

Look at the dates. Is it mere coincidence, or does the economy flourish in direct proportion to IT losing control over information technology?

I introduced the Value Prevention Society (VPS) this year to spotlight a pernicious attitude prevalent in IT circles these days: “We won’t do it for you and we won’t let you do it for yourself.” It’s the pure version of centralized IT control. Most of the correspondence I’ve received on the subject has been critical of my position. Yet not one proponent of total PC lockdown has offered an alternative to my Productive Flexibility policy that escapes the inevitable conclusion: VPS members prefer pencil and paper to any information technology not sanctioned by the IT organization.

Is that really a position you want to take in public?

Not that utter chaos is especially desirable. Working from a well-designed and managed architecture provides too many advantages to the enterprise. So how should IT tame the chaos without reverting to total lockdown?

Answer: View limited chaos as your friend. In particular, look for patterns. If lots of sales reps have installed contact management software on their laptops, it means the company is ready, and perhaps overdue, for a CRM-centered sales force automation system. If lots of users have installed software for faxing direct from PCs, install a fax server and make it even more convenient and useful. Do you see a lot of “shadow systems” that use data re-keyed from standard reports into Access or Excel? Run, don’t walk, to the Business Intelligence store and buy a BI tool for them to use instead.
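The pattern-spotting idea can be sketched as a simple tally over a software inventory. Everything here is hypothetical: the inventory data, the tool names, the tool-to-capability mapping, and the threshold are all made up to illustrate letting grassroots adoption signal enterprise demand.

```python
from collections import Counter

# Hypothetical inventory: software each user installed on their own.
# In practice this might come from an asset-management scan.
inventory = {
    "rep01": ["contact manager", "desktop fax"],
    "rep02": ["contact manager"],
    "rep03": ["desktop fax"],
    "analyst1": ["Access shadow database"],
    "analyst2": ["Access shadow database", "Excel report macros"],
}

# Assumed mapping from grassroots tools to the enterprise
# capability their popularity hints at.
signals = {
    "contact manager": "CRM-centered sales force automation",
    "desktop fax": "fax server",
    "Access shadow database": "business intelligence tool",
    "Excel report macros": "business intelligence tool",
}

THRESHOLD = 2  # arbitrary cutoff for "lots of users"

demand = Counter()
for tools in inventory.values():
    for tool in set(tools):
        if tool in signals:
            demand[signals[tool]] += 1

for capability, users in demand.most_common():
    if users >= THRESHOLD:
        print(f"{users} users signal demand for: {capability}")
```

The point isn’t the code, which any scripter could improve on; it’s that the data to spot these opportunities is already sitting on the desktops.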

Encourage and support end-user innovation. By participating, you’ll gain an inexpensive, highly reliable way to discover many of the lowest risk, highest impact opportunities in the business. It’s your best course of action.

Especially since the alternative is to wreck the entire U.S. economy.

Thomas Kuhn published The Structure of Scientific Revolutions in 1962, redefining the dialog about how science progresses: no longer a purely philosophical prescription, but an account incorporating the sociology of how real scientists actually behave.

In it, to his everlasting damnation, he introduced the phrase “paradigm shift,” which has since been tortured to insanity by a generation of management consultants who never bothered to actually read Kuhn’s seminal work.

It’s a shame, because the underlying idea — that major advances entail a complete change of perspective and worldview — is quite valuable. A paradigm shift doesn’t disprove old ways of looking at things. It makes them irrelevant.

For example: IT is in the midst of a paradigm shift that makes the old idea of requirements irrelevant.

“Requirements” is a holdover from a time when functional managers and end-users asked IT for a system. IT, responsible for delivery of working software that did something useful, asked the logical question, “What are your requirements?” What we got in response was a series of attributes. Software that possessed those attributes “fulfilled the requirements.” Whether it did something useful for the business wasn’t IT’s problem.

That isn’t how things work anymore, for which we should be eternally grateful. IT and functional managers now share responsibility for making sure business change happens. With this new scope, IT and representatives of every affected part of the enterprise must collaboratively redesign the target business function.

Defining requirements is irrelevant to this process. Instead, we must create a high-level “functional design” that describes new business processes, employee roles, and the technology that will be used by employees in their new roles to implement the new processes.

“High level” is a vague term. In practice, it means allowing no more than seven boxes in any diagram and no more than seven bullets of text in any narrative. When necessary for clarity, you can elaborate each box or bullet with one more level.
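The seven-item, two-level discipline is easy to check mechanically. Here’s a minimal sketch; the outline representation and the function are my own invention, purely illustrative:

```python
MAX_ITEMS = 7   # no more than seven boxes or bullets at any level
MAX_DEPTH = 2   # the top level, plus one level of elaboration

def is_high_level(outline, depth=1):
    """Verify an outline stays within the high-level limits.

    `outline` is a list of (title, children) pairs, where
    `children` is another outline (possibly empty).
    """
    if len(outline) > MAX_ITEMS:
        return False
    if depth >= MAX_DEPTH and any(children for _, children in outline):
        return False  # elaboration below the second level
    return all(is_high_level(children, depth + 1)
               for _, children in outline if children)

design = [
    ("New order-entry process", [("Capture order", []),
                                 ("Approve credit", [])]),
    ("Employee roles", []),
    ("Supporting technology", []),
]
print(is_high_level(design))  # True: at most seven items, two levels
```

A detailed specification would fail this check by design; that’s the point of drilling down only after everyone agrees the high-level design will work.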

Once everyone agrees that this new design will work, you can drill down to as many more levels as necessary to create a detailed specification from which developers can code, testers can test, and trainers can train employees in their new roles.

It might seem that the difference between requirements and design is simply word play. It isn’t: Requirements are attributes; designs describe how things will work. This is far more than a semantic distinction.

It’s a different paradigm.