Help! I’m desperate!

Not really. To be more accurate, I’m minorly inconvenienced.

As mentioned a few months ago, I’m looking for an alternative to Quicken (“Plausibility rules,” 3/12/2018), because it deprecated a feature I rely on, presumably to force me to buy an upgrade.

Not one to be bullied into an unwanted expenditure, I’ve been on the hunt for an alternative. Thus far, with just one exception, every personal finance package I’ve found is cloud-based.

Which leads to the question, WHAT????

Look, I’m an open-minded sort, so maybe I’m missing something. Yes, I realize my personal financial data is already in the cloud, assuming we’re all willing to redefine “cloud” to mean “on the web.”

But it’s scattered among a bunch of providers and accounts. If I use any of the non-Quicken personal finance management alternatives I’ve found so far, I’ll be putting it all in one place, just waiting for the next data breach to happen.

There is an exception — a package called GnuCash. I’d use it and be happy, except that the instructions for automatically downloading transactions into it are both impenetrable and, as far as I can tell, don’t … what’s the word I’m looking for? … work.

All of which puts me dead-center in the ongoing debate as to whether data stored behind your corporation’s firewalls are more secure than data stored in a SaaS provider’s data farms.

Now I’m far from an authority on the subject, but I do know what the correct answer to the question isn’t: Yes. I also know it isn’t No.

I know this because, in addition to all the well-known information security basics, the accurate answer depends in part on whether you push your own information security failings onto your SaaS providers.

Here’s what I mean: If I decide to use a cloud-based personal financial management solution, and if I don’t change my password on a regular basis, protect myself from Trojans, phishing attacks, and keystroke loggers, and keep my OS properly patched and up to date, it won’t be the solution provider’s fault if someone borrows my data.

This all scales up to the enterprise: If you use, say, Salesforce.com and do a lousy job of key rotation, or your administrators share a super-user login, or you don’t conduct regular white-hat phishing attacks, or you don’t properly protect PCs from invasive keystroke loggers and all the other prevalent intrusion techniques, it really won’t matter what level of security excellence Salesforce.com has achieved.
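For the record, doing a decent job of key rotation doesn’t have to be exotic. Here’s a minimal sketch of the kind of check that flags credentials that have overstayed their welcome (the key inventory and the 90-day window are my inventions, not anything Salesforce.com prescribes):

```python
from datetime import datetime, timedelta

# Hypothetical inventory of integration credentials and when each was issued.
# In real life this would come from your secrets manager, not a literal dict.
KEY_INVENTORY = {
    "crm-integration": datetime(2018, 1, 15),
    "reporting-etl": datetime(2017, 6, 1),
    "admin-service-acct": datetime(2016, 11, 20),
}

ROTATION_WINDOW = timedelta(days=90)  # assumed policy, not a recommendation


def stale_keys(inventory, now=None):
    """Return the credentials that have outlived the rotation window."""
    now = now or datetime.now()
    return [name for name, issued in inventory.items()
            if now - issued > ROTATION_WINDOW]


if __name__ == "__main__":
    for name in stale_keys(KEY_INVENTORY):
        print("rotate:", name)
```

Nothing fancy. The point is that if nobody in your shop runs even this much of a check, the provider’s security certifications won’t save you.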

Also, “secure” means more than “protected from intrusion and misuse.” With Quicken (or GnuCash) I can easily back up my data to a backpack drive, knowing how I’d restore it if I need to.
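And “easily back up” really is easy when the data lives in a local file. Here’s a minimal sketch, assuming a GnuCash file under my home directory and a backpack drive mounted at /media/backpack (both paths are my assumptions, not anything either package dictates):

```python
import shutil
from datetime import datetime
from pathlib import Path

# Assumed locations -- substitute your own data file and backup drive.
DATA_FILE = Path.home() / "finance" / "books.gnucash"
BACKUP_DIR = Path("/media/backpack/finance-backups")


def back_up():
    """Copy the data file to the backup drive under a timestamped name."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = BACKUP_DIR / (DATA_FILE.stem + "-" + stamp + DATA_FILE.suffix)
    shutil.copy2(DATA_FILE, dest)
    return dest

# Restoring is the same operation in reverse: copy the snapshot you want back
# over the data file. No provider, no support ticket, no faith required.
```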

With a cloud-based service provider I’m willing to take it on faith that they back up their customers’ data in case of some form of catastrophic failure. Recovering to the state just before my most recent transaction download, on the other hand, is something I strongly suspect isn’t part of the service.

For the enterprise equivalent, Salesforce.com is always the SaaS touchstone. It recommends customers make use of their own backup and recovery tools, or else rely on third-party services.

But of course, your own backup and recovery tools are exactly as vulnerable as anything else inside your firewall, while third-party alternatives add yet another potential point of security failure you can’t directly control.

KJR first mentioned the cloud more than ten years ago (“Carr-ied away,” 2/4/2008), and yet the cloud continues to perplex CIOs.

From business cases that are always either more nuanced than “the cloud saves money” or else are wrong … to an impact on application development that’s much more significant than “recompile your applications in the cloud and you’re done” … to COTS and SaaS-based application portfolios whose integration challenges put the lie to cloud nativity as the uniform goal of all IT architects … to the ever-harder-to-untangle questions surrounding cloud-level vs internal-firewall-based information security …

If you’re looking for simplicity inside all of this complexity, good luck with that. You’re unlikely to find it for the simplest of reasons: An organization’s applications portfolio and its integration are direct reflections of the complexity of the organization itself.

Modern businesses have a lot of moving parts, all of which interact with each other in complex ways. Inevitably this means the applications that support these moving parts are numerous and require significant integration.

Which in turn means it’s unlikely the underlying technology can be simple and uniform.

And yet, when I need an application that can automatically download transactions into a personal financial database, there’s a depressing uniformity of vision:

“Put it in the cloud.”

Sigh.

Three threads, one conclusion:

Thread #1: In a recent advertorial (“Stop Using Excel, Finance Chiefs Tell Staffs,” Tatyana Shumsky, 3/31/2018), The Wall Street Journal proved once again that, as someone once said, if you ignore the lessons of history you’re doomed to repeat the 7th grade.

Dan Bricklin invented the electronic spreadsheet back in 1979. It was immediately and wildly popular, for some very simple reasons: It was incredibly versatile; you could use it to think something through by literally visualizing it; and, when IT responded as it usually does to requests for small solutions — not a good enough business case — users could ignore IT and solve their own problems, right now.

The Wall Street Journal’s story tells the usual tales of spreadsheets gone wild, with their high error rates and difficulties in consolidating information. What were those fools thinking, using Excel for <insert Excel-nightmare-case here>!?!

I was nowhere near the place and I can tell you exactly what they were thinking. They were thinking they had a job to do and the alternatives were (1) Excel, and (2) … uh, Excel.

The business case for the solutions extolled in The Wall Street Journal story was that the Excel-based solutions caused problems. Had users not solved their problems with Excel first, they’d still have no business case.

When Excel is the problem you can be sure the pre-Excel problem was much bigger.

Thread #2: One of my current consulting areas is application portfolio rationalization. It’s usually about enterprise applications that number in the hundreds, but sometimes clients want to consolidate desktop applications that, in large enterprises, easily number in the thousands, not including all of the applications masquerading as Excel spreadsheets.

It’s a shocking statistic, and a support nightmare!

Only it isn’t a shocking statistic at all. A typical Fortune 500 corporation might have 50,000 or more employees. With 50,000 employees, what are the odds there aren’t at least a couple of thousand different processes that might be improved through automation IT will never get around to?

It isn’t a support nightmare either. For the most part the applications in question are used by a dozen or fewer employees who are almost entirely self-supporting.

Support isn’t the problem. Lack of control is the problem. And, in highly regulated industries, lack of control is a real problem corporate compliance needs to solve. It needs to document not only that a given business function’s outputs are correct, but that its processes and supporting tools ensure they’re correct.

On top of which, information security needs to ensure applications with gaping holes are kept off the network, and that applications stay properly patched so that as new vulnerabilities are detected, they’re addressed.

All of this is certainly harder when each business function solves its own problems, but it’s hardly impossible.

And it’s much easier when IT is an active partner that helps business functions solve their own problems.

Thread #3: Once upon a time I was part of a team that redesigned our company’s CapEx governance process. We hit upon a novel idea: that our job wasn’t to prevent bad ideas from leaking through. It was to recognize good ideas and help them succeed.

It turned out we were on target. What we found was that bad ideas that needed screening out were few and far between. Good ideas explained badly? We saw plenty of those.

Tying the threads together: Large enterprises have lots of moving parts, which means small problems are real, worth solving, and too numerous for IT to handle on its own. Users engage in “rogue IT” to make their part of the business more effective, because they can and they should. IT ought to find a way to help their good ideas succeed instead of assuming they’re all pursuing bad ideas that have to be stopped.

The KJR solution: create a Certified Power User program (CPU — catchy, isn’t it?). Certified Power Users will understand the basics of normalized design so they can use MS Access instead of spreadsheets when they have a database problem to solve. They’ll know how to evaluate solutions professionally, so they don’t buy whatever looked flashy at a trade show. They’ll also know how to keep solutions patched, to minimize vulnerabilities.

And, they’ll keep an inventory of the small solutions they create and share it with IT.

In exchange, they’ll have administrative privileges for their PCs, and those of the users they support.
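As for “the basics of normalized design,” here’s a minimal sketch of the difference it makes. The tables and columns are invented for illustration, and SQLite stands in for whatever database a Certified Power User might actually reach for:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The spreadsheet way: one flat table, with customer details re-typed (and
# eventually contradicting each other) on every order row.
conn.execute("""CREATE TABLE orders_flat (
    order_id INTEGER, customer_name TEXT, customer_phone TEXT, amount REAL)""")

# The normalized way: each fact lives in exactly one place, so a customer's
# phone number gets corrected once instead of on every row it was pasted into.
conn.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name TEXT,
        phone TEXT
    );
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        amount REAL
    );
""")
```

The syntax isn’t the point. The point is that a fact recorded once can’t contradict itself, which is most of what goes wrong in the spreadsheets-gone-wild horror stories.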

When you’re trying to persuade, “Let us help” is a more powerful message than “No you can’t.”