
History, horoscope, hands-on: making analytics matter


Peter Coffee, February 24, 2015
Summary:
Peter Coffee, VP for Strategic Research at Salesforce, tracks the evolution of analytics, from horoscope to hands-on.


I once heard an enterprise CIO say that at his company, no project was allowed to be named for the technology that it deployed – but only for the business goal that it achieved. There could be no project called “ERP upgrade,” for example, but it would be acceptable to call that project “Supply chain acceleration” – with an attendant shift in the focus toward the “why,” not the “what.”

I like that rule, because the name of a project often implies the definition of its success. Merely installing a new piece of kit in the data center should not be cause for medals and champagne, unless there’s tangible and significant benefit to the business that’s paying the bill. This brings us to “Business Analytics.”

There are any number of reasons to invest in analytics: cost reduction, sales effectiveness, quality improvement, customer retention, regulatory compliance…the list could go on at great length. However, as in the worst old days of massive investment in “data processing” or “management information systems” – or even today’s “information technology” – the sheer cost, complexity and skills-intensiveness of legacy analytics have often led to confusing the means with the end.

An ancient cartoon shows a suited-and-tied executive at his desk, ordering an assistant: “It does data processing, word processing and list processing. Get me some data, some words and some lists.”

Historically, the place to find things to analyze has been in the stacks of traditional by-products of doing business: in the accounting ledgers, the production records, the (increasingly automated) archival processes that once spewed forth multi-inch-thick stacks of pajama paper.

All too often, these reports bubbled up from the data-producing activity, rather than being drilled down from the decision-supporting question. It was 1970 when Avis CEO Robert Townsend warned of managers “drowning in ho-hum reports they’ve been conned into asking for and are ashamed to admit are of no value.”

This was the stage of analytics that we might call its History period, and many well-capitalized companies are still making money selling the tools of this kind of information archeology: better telescopes for looking more deeply into the rear-view mirror.

Reading the horoscope

When PCs proliferated on desktops, and data feeds started to become more democratically available, no one should have been surprised when people started to use that data (as many have said) “the way a drunk uses a lamp-post: for support, rather than illumination.”

Tools like the spreadsheet made it terrifyingly easy to goal-seek an outcome and determine what inputs would produce it, ignoring the resulting slide of every assumption toward the optimistic end of its “reasonable estimate” range. Distinguished and colorful industry observer Stan Kelly-Bootle wrote in 1995, “The PC soon blossomed as the Uzi of creative corporate accounting. The What-If moved to Why-Not, indicting the spreadsheet as the chief culprit in the 1980s S&L scandal.”

We might call this the Horoscope period of business analytics, when adding complexity enabled people to fine-tune data in ways that made it seem to say whatever they wanted to believe.
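To see how easily the arithmetic runs backwards, here is a minimal sketch of the goal-seek pattern – a toy model of my own, not anything from an actual spreadsheet or scandal: pick the profit you want to report, then solve for the growth assumption that delivers it.

```python
# Hypothetical toy model of spreadsheet-style "goal seek": given a desired
# outcome, search backwards for the input assumption that produces it.

def projected_profit(growth_rate, base_revenue=1_000_000, margin=0.12, years=5):
    """Toy projection: compound revenue growth at a fixed margin."""
    return base_revenue * (1 + growth_rate) ** years * margin

def goal_seek(target, lo=0.0, hi=1.0, tol=1e-6):
    """Bisection search for the growth rate that yields the target profit."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if projected_profit(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# "We need $250,000 of profit. What growth assumption gets us there?"
rate = goal_seek(250_000)
print(f"Required growth rate: {rate:.2%}")  # the assumption the answer demands
```

The model is deliberately simplistic; the point is the direction of reasoning, in which the conclusion is fixed first and the assumptions are adjusted until they support it.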

A hallmark of the Horoscope period was the time lag involved in asking follow-up questions. Whether by accident or intention, a selective presentation—frozen in the form of a chart or a report—creates a momentum that has been shown to have literally explosive consequences in incidents such as the Columbia disaster.

What’s needed, clearly, is a move toward mega-scale mobile agility in the second-question, third-question, fourth-question mode of exploration and insight: a growing necessity, in the face of floods of dynamic and irregular data bursting forth from a connected world.

The 'so what?' stage

What will not work is a brute-force approach to squeezing old models of analysis into new channels and onto new devices. Even if LTE-Advanced protocols do allow multi-Gbit/second connections to our smartphones, there will always be good reasons not to distribute—and thereby isolate, and eventually ambiguate—the data sets that most need to be coherent shared truth.

Replacing the mathematics of database query with the mathematics of indexed search, in particular, can bring to these data sets the same once-astonishing scalability that Google brought to Web pages. Distributing the hard work of interactive visualization onto increasingly capable mobile devices will be another key lever for moving new data mountains.
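As a rough sketch of what that shift in mathematics means in practice – hypothetical toy data, not any particular product’s implementation – an inverted index is built once, and each follow-up question is then answered by hash lookups and set intersections rather than by scanning every record:

```python
from collections import defaultdict

# Sketch of the "indexed search" idea: build an inverted index once, then
# answer queries by set intersection instead of scanning every record.
# (Toy data; real systems add ranking, sharding and compression.)

records = {
    1: "late shipment from west coast warehouse",
    2: "customer complaint about late invoice",
    3: "west coast sales exceeded forecast",
}

index = defaultdict(set)
for record_id, text in records.items():
    for token in text.split():
        index[token].add(record_id)

def search(*terms):
    """Return the ids of records containing every query term."""
    sets = [index.get(t, set()) for t in terms]
    return set.intersection(*sets) if sets else set()

print(search("late"))           # {1, 2}
print(search("west", "coast"))  # {1, 3}
```

The cost of each question stays roughly proportional to the size of the answer, not the size of the data set, which is what makes second, third and fourth questions cheap enough to ask.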

Finally, we need to move beyond Kelly-Bootle’s progression from What-If to Why-Not, and arrive at a next stage: “So what?”

Once an insight emerges, how quickly can it be reality-checked against multiple independent sources? How sharply can it be expressed as an incongruity, implying challenge or opportunity? How effectively can that be shared with those who can do something about it? How rapidly can the insight be transformed to action to secure a competitive advantage or avoid a misstep? This is what it means for analytics to become hands-on.

Not the mere statement of facts, but the answering of questions – and the guidance of prompt, competitively differentiating action – is the charge that must be given to anything that wants to be called next-generation business analytics.

 
