Information should be more 'force' than 'mass'

Peter Coffee, November 11, 2015
Summary:
The question must become, not how much do we have, but how much can it do? Salesforce's Peter Coffee asks how information can be more a force than a mass.


When theorists first broached the notion of information as something we could measure, many reacted as if those theorists were trying to put a dimension on pure thought. Information had long been thought to be a state of mind; “to inform” was to alter that state. Yes, information had begun its intellectual life as a verb, centuries before it was a noun.

Today, one has to wonder if Norbert Wiener, Claude Shannon and John W. Tukey—the people who first swung the pendulum of “information theory” in its present direction—might wish that we could show some inclination to swing it back toward center. When we think of information mainly as an asset to amass, “more” seems obviously “better” – but this way lies an unproductive, unsustainable global accumulation of falsehood and noise.

The question must become, not how much do we have, but how much can it do? How much value can it create? How far can it move the world in a better direction? How can information be more a force than a mass?

The new currency?

Information avarice has infiltrated everyday conversation. Beginning with the 1948 coinage of the “bit,” followed by the 1960s’ whimsical extension to the “byte,” our obsession with mass quantities of information has reached the point that we don’t even need to specify what we’re measuring: in many situations, you can say “100 meg” and have everyone in the room know that you mean “100 megabytes,” even when none of them is a computer scientist.

It’s amusing, almost, to realize that when you talk about old-fashioned money, you need to specify that you’re speaking in dollars or euros or pounds – but data has become a universal currency, to the point that a label like “Bitcoin” is intuitively understood (even by people who have no idea how it works).

It’s startling to see Bitcoin’s esoteric enabling technology, the blockchain, as a subject of cover art on this month’s first issue of The Economist. It’s genuinely important to understand the accompanying special report on blockchain’s potential for redefining the semantics of digital trust – or is “digital trust” as dated a phrase as “digital camera”? When was the last time you personally handled, or even saw, a camera that used film?

Has the truth become defined, for all practical purposes, as that which is recorded in a reliable digital database? If so, at the present state of the art, the first adjective in “reliable digital database” is anything but redundant.

It’s Orwellian to think that it’s only true if a database says so, and it’s therefore important to note the most defining distinction of the blockchain model: that it is, by any practical meaning of the word, an incorruptible record. And it’s about time. We may be on the verge of seeing trustworthiness become an attribute of the data structure itself, instead of being effectively in escrow with whoever currently holds it.
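
To make the distinction concrete, here is a minimal sketch in Python of the underlying idea: a hash-chained record in which each entry commits to the one before it. The structure and names are invented for illustration, not Bitcoin's actual protocol, but they show why tampering with any earlier entry is detectable from the data structure itself.

```python
# Minimal hash-chained ledger sketch (illustrative only, not Bitcoin's protocol):
# each record carries the hash of the previous one, so altering any earlier
# entry breaks every hash that follows it.
import hashlib
import json

def record_hash(record: dict) -> str:
    """Hash a record's canonical JSON form."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append(chain: list, payload: dict) -> None:
    """Append a new record linked to the hash of the previous record."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    record = {"prev_hash": prev, "payload": payload}
    record["hash"] = record_hash({"prev_hash": prev, "payload": payload})
    chain.append(record)

def verify(chain: list) -> bool:
    """Recompute every link; any tampering shows up as a broken hash."""
    prev = "0" * 64
    for record in chain:
        if record["prev_hash"] != prev:
            return False
        if record["hash"] != record_hash({"prev_hash": record["prev_hash"],
                                          "payload": record["payload"]}):
            return False
        prev = record["hash"]
    return True

ledger = []
append(ledger, {"from": "alice", "to": "bob", "amount": 10})
append(ledger, {"from": "bob", "to": "carol", "amount": 4})
print(verify(ledger))                        # True
ledger[0]["payload"]["amount"] = 1000        # quietly rewrite history...
print(verify(ledger))                        # False: the chain no longer verifies
```

In a real blockchain, distributed consensus adds the further guarantee that no single party holds the only copy; the trust lives in the shared structure, not in a custodian.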

The late Ronald Reagan has been widely quoted for his admonition “trust, but verify,” but many would be surprised to learn that Reagan was quoting a Russian proverb when he invoked that idea as part of an arms-control agreement with what was then the Soviet Union. That virtuous circle of ideas may become manifest tomorrow in a global chain of consensus.

When we can more fully trust data, we can let it begin to take action. The Internet of Things could quickly become an overwhelming tsunami of zettabytes unless we can let algorithms transform that data into actions, so this is clearly a process that we need to get started – soon. But then again, that’s just the beginning. What ethical framework will inform those actions? (Ahem. Note the usage.)
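
As a back-of-the-envelope illustration (the device, threshold, and readings below are invented), letting data take action can be as simple as a stream of readings passing through a rule that triggers responses, rather than being archived wholesale:

```python
# Sketch of acting on sensor data instead of hoarding it: readings stream
# through a rule and only trigger actions; nothing is stored along the way.
from typing import Iterable, Iterator

def act_on_readings(readings: Iterable[float], limit: float = 80.0) -> Iterator[str]:
    """Yield an action for each reading that crosses the limit; drop the rest."""
    for value in readings:
        if value > limit:
            yield f"throttle device: reading {value} exceeded {limit}"

for action in act_on_readings([72.0, 81.5, 79.9, 90.2]):
    print(action)
```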

If you didn’t know that there is already a journal entitled International Review of Information Ethics—I admit, I just now discovered it myself—then at least you have probably seen popular discussion of whether an autonomous vehicle should be able to decide to kill its driver, if that would save a larger number of other people. This is not an IT conversation, but it’s definitely a diginomic discussion.

What are you going to do with that?

At the beginning, Wiener and others were considered fringe thinkers for their notions that adding information was the equivalent of subtracting chaos; that we could describe a piece of data as “informative” to the extent that it reduced our uncertainty about The Way Things Are or The Way Things Will Be.
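
That notion has a precise, if toy-sized, expression in Shannon's entropy formula: the information a report carries is the uncertainty it removes, measured in bits. The four-outcome example below is mine, chosen only to keep the arithmetic obvious.

```python
# Toy illustration of "information as subtracted uncertainty":
# entropy before an observation minus entropy after it is the information gained.
import math

def entropy(probabilities):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

before = entropy([0.25, 0.25, 0.25, 0.25])   # four equally likely outcomes: 2 bits
after = entropy([0.5, 0.5])                  # a report rules out two of them: 1 bit

print(f"Information gained: {before - after:.1f} bit")   # 1.0 bit
```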

Two decades later, Robert Townsend (then CEO at Avis Rent A Car) was considered an iconoclast when he talked about “Computers and Their Priests” – saying that a manager must be “ruthlessly hard-nosed about…‘What are you going to do with that report?’ ‘What would you do if you didn’t have it?’”

Four decades farther on, we’re looking at the possibility of being able to collect just about anything, from anywhere, at any time. There has never been more need for a discipline of “What customer experience are you going to create or improve?” “What customer value are you going to enhance?”

Those who force that issue now will write the rules, and win, at what could be transformative positive-sum games that are soon to begin.
