A mandate for trust-first design

Peter Coffee, January 19, 2015
Summary:
Salesforce's Peter Coffee argues the design case for a rights-centric mantra of “establish permissions, then construct secure containers, and only then collect and seek value from data.”


The roots of “economic” are Greek words for “house management” (oikos and nemein, respectively). To call a Web site “diginomica,” if anyone were going to do that, would therefore suggest that something is being managed: not the literal fingers or toes of the Greek root δάκτυλος (more or less, “dactylos”), but the kind of stuff that Nicholas Negroponte was talking about when he used the expression “Being Digital.” Which was…what?

Negroponte was not talking about computation, nor was he talking about connection. He was talking about the enabling revolution of representing every kind of information, and eventually every interaction, in a form that is countable; storable; transmissible; and transformable. Thanks to Norbert Wiener, we settled long ago on the provably cheapest way of doing this—in the form of bits—but then we started treating that foundation as out of sight, therefore out of mind.

Computers were initially slow and expensive, so it was hugely exciting when they became faster and cheaper; connections once were limited and intermittent, so it was even more exciting when connection became ubiquitous and continuous. Today, however, faster computers and more pervasive connections are not the most important enablers and encouragers of broader and richer use of new technology.

The greatest frictions today stem from people’s growing, and more than slightly justified, suspicion that these systems are getting better and better at doing the unintended and the destructive. In a survey of 1,600 consumers in the UK conducted late last year by KPMG, “70 percent suggest that with the marketplace flooded by inter-connected devices, it’s too easy for things to go wrong.”

The foundation of diginomics is the ability to speak of everything as a form of data – and yet, look at how poorly we treat that data as we build new “diginomic” systems. We send each other unencrypted emails, using protocols that invite the most casually informed malefactor to misrepresent origins and identities. We store data in what amount to uncovered boxes in public places, hoping that no one will think to look at what’s been left in plain sight.

A new approach

The way we build IT systems today is like building household electrical systems with bare copper wires, then adding the insulation later on. Trying to add the protections after we make the connections means we operate, at first, in a state of danger, and face greater costs and complexity over the full life cycle than we would have endured if we'd built in the safety technology from the beginning. It also increases the chances of error or negligence, as some obscure corner of the system either escapes our notice or proves too much trouble to go back and finish in the endgame.

It’s one thing to be casual about the loss or leakage of data when we’re dealing with bits that represent money, or even bits that represent reproducible objects like a book or a pair of shoes. If someone spoofs your credit card, I can issue you a refund immediately even while still pursuing the miscreant. If someone spoofs your shipping address to divert the delivery of physical goods, I can re-ship another unit to you while tracing the original. These are correctable errors.

In contrast, the present and accelerating Cambrian explosion of new kinds of data—representing everything from how we drive to how our hearts beat—will inundate all manner of institutions and service providers with data that exhibits the so-called “Streisand Effect”: “an attempt to hide, remove, or censor a piece of information has the unintended consequence of publicizing the information more widely.”

Today, information comes into the world like a newborn baby, naked and defenseless and ignorant of the threats that surround it. Think of a register in a microprocessor, or a telemetry frame being sent by a connected device, or a field in an unencrypted database that’s accessible by an unprotected URL.

We rely, far more than we realize, on sophisticated systems of encryption to provide even minimal protection against all manner of abuse or manipulation of the systems that enable modern life. As long ago as 1997, a distinguished group of infosec experts warned that we all invisibly rely on these technologies “to protect burglar alarms, cash machines, postal meters, and a variety of vending and ticketing systems from manipulation and fraud.” When politicians demonize encryption by associating it with terrorism, they fail to recognize the disproportionate damage that weakening it would do to civil society, while only slightly inconveniencing its enemies.

What’s needed is not timid retreat, but vigorous advance toward an expectation that a robust container and an auditable privilege management system will be part of the birthright of any new-born data asset. We can no longer tolerate a 1950s (or even 1990s) tech-centric mantra of “collect, then compute, then (perhaps) protect.”

We need a rights-centric mantra of “establish permissions, then construct secure containers, and only then collect and seek value from data.”
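By way of illustration only, here is a minimal Python sketch of that ordering. Nothing here is a prescribed design: the Permission and SecureContainer names, their fields, and the "purpose" check are invented for the example, and the encryption primitive assumes the third-party cryptography package's Fernet. The point is simply that the record cannot exist outside an encrypted, permission-bound container, and no value-seeking code can read it for a purpose its subject never granted.

```python
"""A sketch of 'establish permissions, construct containers, then collect'.

Illustrative assumptions throughout; not a reference implementation.
"""
from dataclasses import dataclass
from cryptography.fernet import Fernet  # symmetric encryption for the "container"


@dataclass(frozen=True)
class Permission:
    subject: str          # whose data this is
    purposes: frozenset   # what the data may be used for
    expires: str          # date after which the consent lapses


class SecureContainer:
    """An encrypted, permission-bound box that data must be born into."""

    def __init__(self, permission: Permission):
        if not permission.purposes:
            raise ValueError("refusing to create a container with no granted purpose")
        self._permission = permission
        self._key = Fernet.generate_key()
        self._payload = None

    def collect(self, raw: bytes) -> None:
        # Data is encrypted the moment it arrives; it never sits "naked".
        self._payload = Fernet(self._key).encrypt(raw)

    def read(self, purpose: str) -> bytes:
        # Every read is checked against the permission granted at birth.
        if purpose not in self._permission.purposes:
            raise PermissionError(f"'{purpose}' was never granted by {self._permission.subject}")
        return Fernet(self._key).decrypt(self._payload)


# Establish permissions, then construct the container, and only then collect.
grant = Permission("driver-42", frozenset({"billing"}), "2016-01-01")
box = SecureContainer(grant)
box.collect(b"odometer=18232;route=home-to-office")
box.read("billing")        # allowed
# box.read("advertising")  # raises PermissionError: never granted
```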

Anything less would be undiginomic.

More from Peter Coffee:

‘Ugly’ + ‘confusing’ doesn’t equal ‘smart’

Outcomes should outweigh activities

Tech of the future versus apps of the past

 
