Before there was Equifax there was British Petroleum. Before British Petroleum there was Enron.

All three were responsible for disasters. And all three are evidence of something every leader needs to embrace:

It’s always the culture.

Sure, skills and experience, tools and technologies, and processes and procedures matter too. For example: Just as you thought it couldn’t get any worse comes the revelation that at Equifax Argentina, an internal system that provided access to customer records had a backdoor, where both login ID and password were “admin.”

Proper security policies and procedures would have prevented this.

Just kidding.

For all I know Equifax Argentina’s security policies and procedures are just fine and dandy. If they’re out of step with the corporate culture, they won’t make any difference. Culture wins every time.

Call it Lewis’s Law of Unnatural Disasters: When something goes terribly wrong you can bet there’s something about the organization’s culture that makes terribly wrong inevitable.

But in engineering your organization’s culture … and yes, culture is something to engineer … you need to consider your chosen solution’s ripple effects if the culture is to be a positive force.

Let’s hypothesize that Equifax Argentina does have security P&Ps that specify what constitutes a suitably secure password — that the fault was a culture that resulted in nobody giving a damn. What cultural trait should its leadership be encouraging to prevent a recurrence?

The obvious one is a culture shaped so the employee handbook is law and everyone obeys it. That should do the trick.

It would. It would also create a culture where jailhouse lawyers are on a constant quest for loopholes that can only be closed by increasing the length of the P&Ps. Eventually, all your employees would need a year of study just to learn what’s in the handbook.

Beyond that, it would lead to a culture where checking off the boxes is what matters, not accomplishing the desired outcomes.

Worst of all, it would result in a culture that combines blind obedience with a complete absence of risk-taking and initiative.

Compare that to a culture that focuses more on outcomes than obedience. Culture is loosely defined as “how we do things around here.” The cultural trait “We don’t put people at risk” wouldn’t just eliminate the admin/admin login/password combo; whoever put it in place would suffer a fate worse than being fired.

They’d be shunned.

But there’s a complication in all of this that isn’t easily addressed.

Enron’s CEO and board chair, Jeffrey Skilling and Kenneth Lay, pleaded the ignorance defense — yes, Enron the corporation was doing awful things, but they didn’t know about them. After Deepwater Horizon exploded, BP’s CEO Tony Hayward expressed a similar level of know-nothingism.

Equifax’s executives haven’t yet pleaded ignorance, but it’s only a matter of time.

Which gets to the complication: They probably were ignorant, and in some important respects they should have been.

The best leaders don’t find ways to succeed. They build organizations that find ways to succeed. They can’t do this without delegating. They can’t do this unless the people they delegate to delegate.

In great organizations, employees at all levels have authority and take responsibility, to degrees that are surprising to those managers who consider any decision not made by themselves or someone higher up the chain of command to be an unacceptable risk.

Or as D. Michael Abrashoff, former Captain of the Benfold and author of “It’s Your Ship,” put it, “I chose my line in the sand. Whenever the consequences of a decision had the potential to kill or injure someone, waste taxpayers’ money, or damage the ship, I had to be consulted. Sailors and more junior officers were encouraged to make decisions and take action so long as they stayed on the right side of that line.”

Sounds great. It is great. But if someone on board the Benfold had done something reckless with Deepwater-Horizon-scale consequences, Captain Abrashoff very likely would have been ignorant, because that’s the whole point: the people in charge not making themselves decision bottlenecks.

Culture is certainly the first line of defense. But those pesky human beings being what they are, it isn’t a perfect, airtight solution.

Leaders also need metrics, controls, and governance mechanisms to provide the guardrails that backstop culture’s lane markers.

But even with these, culture comes first because with the wrong culture, employees will find ways to jigger the metrics, fake out the controls, and game the governance.

What they won’t do without a culture that encourages it is take the risk of telling you something that should be happening isn’t, or that something that shouldn’t be happening is.

It’s always the culture.

We consultants have an easy life. For the most part our techniques are uncomplicated and our advice is, while good, pretty obvious. Even better, most clients don’t want our advice. They either want us to read a script, or they have a dozen reasons our advice is good in theory, but won’t work in the “real world.”

Personally, most of what I do is Undercover Boss except I’m not the boss. In my experience, employees know exactly what’s wrong with the organization, have a pretty good idea how to fix it, and have an accurate bead on why management will never make the repairs.

In the case of information security, it’s usually even easier than that: If companies would just:

> Patch: Now, please.

> Encrypt everything: Too expensive? Net the cost of the time needed to decide what needs to be encrypted and what doesn’t against the cost of encryption. Encrypting everything costs less.

> Rotate keys: Rotate them at least as often as users are required to change their passwords because the data in your corporate databases is more sensitive than the data in individual laptops. What would you do without me?

> Phish: Subject everyone in the company to white hat phishing attacks. Everyone. Frequently. Model your attacks on real-world ones. Explain to employees who click what they fell for and how to spot the next one. Because the bad guys don’t bother trying to crack passwords any more. They just ask for them.
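The key-rotation advice above is mostly bookkeeping: tag every ciphertext with the version of the key that produced it, and re-encrypt anything carrying a stale tag the next time it’s touched. Here’s a minimal sketch in Python; the toy XOR cipher is a placeholder standing in for a real algorithm (a production system would use something like AES-GCM from a vetted library), and the KeyRing class and its method names are illustrative, not any particular product’s API:

```python
import os
from itertools import cycle

def toy_cipher(data: bytes, key: bytes) -> bytes:
    """XOR placeholder for a real cipher. Do NOT use this for actual secrets;
    in practice this would be AES-GCM from a vetted crypto library."""
    return bytes(a ^ b for a, b in zip(data, cycle(key)))

class KeyRing:
    """Numbered key versions: the newest version encrypts, any version decrypts."""
    def __init__(self):
        self.keys = {1: os.urandom(16)}
        self.current = 1

    def rotate(self):
        """Mint a new key version; old versions stay available for decryption."""
        self.current += 1
        self.keys[self.current] = os.urandom(16)

    def encrypt(self, plaintext: bytes) -> tuple[int, bytes]:
        return self.current, toy_cipher(plaintext, self.keys[self.current])

    def decrypt(self, version: int, ciphertext: bytes) -> bytes:
        return toy_cipher(ciphertext, self.keys[version])

    def reencrypt_if_stale(self, version: int, ciphertext: bytes) -> tuple[int, bytes]:
        """On read, silently upgrade records encrypted under an old key version."""
        if version == self.current:
            return version, ciphertext
        return self.encrypt(self.decrypt(version, ciphertext))
```

The point of the version tag is that rotating a key doesn’t force a bulk re-encryption of every record at once: stale records get upgraded as they’re read, and a background sweep can finish whatever is left.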

One more: Add “Don’t store this because we don’t need it and never will” to your company’s master data management practices. I spent much of my spare time over the past week trying to figure out what uses Equifax might have for storing social security numbers in its credit records, and I’ve come up dry. My social security number has no bearing on my creditworthiness.

With this exception: It’s the only form of personal identification that won’t change over time.

The “never will” qualifier deserves a bit of explanation. Once upon a time I worked with a life insurance company that routinely deleted a lot of information about applicants once they became policyholders, because it didn’t need the information anymore.

Until, a few decades later, the importance of customer analytics became apparent.

So “never will” is a balancing act.

Which gets us to: In response to last week’s column proposing SSN 2.0, several correspondents and commenters pointed out that when we who till the soil of corporate IT need to determine whether someone should be allowed into a system, we establish a key value … the user ID … and one or two authenticators, of which passwords are the most prominent.

Social security numbers play both roles — they’re both identifier and authenticator, on the theory that only the holder of a social security number knows what it is.
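The split the correspondents describe is easy to show in code. A minimal sketch, using only Python’s standard library (the record layout and function names are illustrative, not any real system’s schema): the identifier is public and stored in the clear, while the authenticator is a secret the system stores only as a salted hash, so a database leak doesn’t hand out working credentials the way an SSN leak does.

```python
import hashlib
import hmac
import os

def make_account(user_id: str, secret: str) -> dict:
    """Store the public identifier as-is; store only a salted hash of the secret."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 100_000)
    return {"user_id": user_id, "salt": salt, "digest": digest}

def authenticate(account: dict, claimed_id: str, claimed_secret: str) -> bool:
    """Knowing the identifier proves nothing; the secret must hash to a match."""
    if claimed_id != account["user_id"]:
        return False
    digest = hashlib.pbkdf2_hmac("sha256", claimed_secret.encode(),
                                 account["salt"], 100_000)
    # Constant-time comparison to avoid leaking how much of the hash matched.
    return hmac.compare_digest(digest, account["digest"])
```

Notice what the SSN scheme gets wrong: presenting the identifier itself as the secret fails here by design, because identifier and authenticator are different values with different storage rules.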

It’s a quaint perspective, but seriously folks, haven’t we become just a wee bit more sophisticated in the 81 years since the Social Security Administration issued its first batch of cards?

Not to mention since Woolworth became the first and possibly worst identity thief of all time? (The story is well worth reading.)

In an interesting way what we’re looking at is really a common IT problem: A system that elegantly solves a problem is expanded to solve additional related problems. Then it’s expanded again. And with every expansion the system’s architecture becomes another notch messier, until it reaches the point where it’s at risk of collapsing under its own weight.

When the subject is business applications, this means it’s time for modernization, conversion, or a rewrite to a system designed from the beginning to handle the actual scope of the solution.

Here, the original problem was to uniquely identify citizens registered with the Social Security Administration, to which the IRS added taxpayer identification.

Now, the SSN is used by businesses asking the question, “Can we trust this person to hold up their end of the bargain when we sign a mutually binding contract?” It’s the public connecting point for all of a person’s financial records.

Whether my semi-whimsical SSN 2.0 proposal bears any resemblance to what a real solution would look like is anyone’s guess. What I am pretty sure of is that, if your company stores consumer information and doesn’t follow at least the practices described here and last week (no, not “best practices” — call them “barely adequate practices”), it will end up contributing to the problem.