Don't do the wrong thing better
Summary: Sometimes, doing the old thing morphs into doing the wrong thing. That's the point at which doing the old thing better is bad, says Salesforce's Peter Coffee.
Computers get faster and cheaper. Network bandwidth grows. Databases get bigger. Apps become more mobile-friendly. All of those statements are true, but they are also dangerously distracting – because building systems based directly on these obvious trends will give us better versions of what no longer makes sense, and of what people no longer demonstrably desire.
When I look back at my efforts, during the past year, to respond to various audiences’ requests for a vision of what’s to come, the most important theme that comes up again and again is “stop defaulting to doing the old things better.” Sometimes, doing the old thing morphs into doing the wrong thing: at that point, doing the old thing better is bad – a diversion of resources today, and a shadowing of the opportunity and necessity to rethink basic models for success tomorrow.
Let’s start with the first assertion, that computers get faster and cheaper, usually mislabeled as Moore’s Law. Moore actually extrapolated transistor counts, not computational speed in any given mix of tasks: it’s been more than a decade since Greg Papadopoulos observed that it had become a case of badly diminishing returns to throw all those cheapening transistors at making our spreadsheets recalculate more quickly.
Greg, who was then CTO of Sun Microsystems, delivered remarks at the October 2003 Microprocessor Forum (I was there) in which he noted that the step-up to Intel’s Pentium 4 had improved business benchmark performance by only 48% while increasing power consumption by 273%. Greg predicted that future transistor cost reductions would go into greater numbers of processor cores on a die, with each core on the order of complexity of an early-stage Pentium, as the most cost-effective way to get more stuff done – rather than devising increasingly elaborate computational engines for single tasks.
With even wrist-wearable devices now sporting four-core processors, the trend toward concurrent-core code is no longer in doubt – but it’s far from obvious that we have yet learned to teach a next generation of programmers, or give them the appropriate tools, to write the best possible code for such an environment.
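The shift Greg anticipated – getting more done by spreading independent work across many modest cores, rather than making one elaborate core faster – can be sketched in a few lines. This is an illustrative toy, not a benchmark: the prime-counting workload is a made-up stand-in for any CPU-bound task, using only the Python standard library.

```python
# Toy sketch of concurrent-core thinking: divide independent work
# across all available cores instead of relying on one faster core.
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    """Naive prime count below `limit` -- a deliberately CPU-bound stand-in task."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    chunks = [10_000, 20_000, 30_000, 40_000]   # independent work items
    with ProcessPoolExecutor() as pool:         # defaults to one worker per core
        results = list(pool.map(count_primes, chunks))
    print(results)
```

The hard part, as the paragraph above notes, is not the mechanics but the design discipline: the speedup only appears when the work items really are independent, which is exactly what our tools and teaching still struggle to make routine.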
What Greg did not emphasize in 2003, and what was quite widely under-predicted, was the rate at which pervasive wireless connections would make us far more interested in putting a processor in everything: in driving computation out to every edge and corner of the kingdom, so to speak, instead of building the castle towers taller.
Note that in the same month as Greg’s remarks at the Microprocessor Forum, the International Telecommunication Union’s official blog said we were in “what may be remembered as the year that Wi-Fi burst onto the wireless scene” – and it’s fun to read that blog post’s cautious comments that “not everyone is convinced that Wi-Fi is the next big thing.” No, today’s era of Wi-Fi everywhere was not obvious to everyone, and LTE was still six years away from its first commercial introductions in Norway and Sweden.
Moore inside-out
From an engineering viewpoint, it really did (and still does) seem like the obvious thing to build fatter network pipes, to bring more data to massively Moore-enabled citadels of processing power. The more difficult thing, from a cognitive and an architectural point of view, is instead to disperse the computation: to use those networks, not so much for movement of static data toward the center, as for distribution of active knowledge toward the edge.
If we turn the Moore-driven model inside out, we’ll have better ways to make local decisions more quickly and more intelligently, in applications ranging from autonomous vehicles to assistive medical technology. We’ll be shifting from Moore’s model of “cheaper transistors are better” to a Metcalfe’s-Law model of “networks become more valuable in proportion to the square of their number of points of participation.”
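The contrast between the two models can be put in rough numbers. The toy comparison below (arbitrary "value" units, chosen only to show the shape of the curves) models Metcalfe's square law via the count of possible pairwise connections, n(n−1)/2, which grows as the square of n:

```python
# Toy comparison: linear, Moore-style value growth vs. Metcalfe's
# network value. Units are arbitrary; only the curve shapes matter.
def linear_value(n, unit=1.0):
    """Value that simply tracks the number of nodes."""
    return unit * n

def metcalfe_value(n, unit=1.0):
    # Number of possible pairwise connections among n participants:
    # n * (n - 1) / 2, which grows in proportion to the square of n.
    return unit * n * (n - 1) / 2

for n in (10, 100, 1000):
    print(n, linear_value(n), metcalfe_value(n))
```

Going from 10 participants to 1,000 multiplies the linear measure by 100, but the pairwise-connection measure by more than 11,000 – which is the arithmetic behind preferring more points of participation over taller castle towers.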
If we have more connections, to more points of intelligence and not merely of dumb input/output, it can quite suddenly seem idiotic to build traditional databases that are ever more massive (and expensively defended) baskets of priceless eggs. Everything that’s wrong with highly centralized databases looks not so much challenging as perverse, when we realize how quickly it’s become possible to construct highly dispersed, resilient, defense-in-breadth systems based on blockchain.
Thinking of blockchain as “that thing that Bitcoin uses to pay for your coffee” is about as dumb as calling the Internet “that thing Amazon uses to sell books by mail”: we’re talking about an opportunity to streamline and strengthen business processes, some of them notoriously complex and fragile (and opaque) today, and it’s attracting the commitment of everyone from Wall Street to Big Blue.
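The core idea that makes those dispersed systems trustworthy is small enough to sketch: each block commits to a hash of its predecessor, so altering any record silently breaks every later link. The toy below shows only that tamper-evidence property; real blockchains layer distributed consensus and replication on top of it.

```python
# Toy hash chain: the tamper-evidence mechanism at the heart of a
# blockchain. Each block's hash covers its data AND the previous
# block's hash, so rewriting history anywhere invalidates the chain.
import hashlib
import json

def _digest(data, prev_hash):
    payload = json.dumps({"data": data, "prev_hash": prev_hash},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev_hash": prev_hash,
            "hash": _digest(data, prev_hash)}

def chain_is_valid(chain):
    for i, block in enumerate(chain):
        if block["hash"] != _digest(block["data"], block["prev_hash"]):
            return False                      # block was altered
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False                      # link to predecessor broken
    return True

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("trade: A pays B", chain[-1]["hash"]))
chain.append(make_block("trade: B pays C", chain[-1]["hash"]))

print(chain_is_valid(chain))          # True: the intact chain verifies
chain[1]["data"] = "trade: A pays Z"  # tamper with one historical record...
print(chain_is_valid(chain))          # False: verification now fails
```

Because every participant can run this check independently, no single central basket of eggs has to be defended – which is what makes the business-process applications above plausible.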
Not speculation, but active and startlingly successful testing, has already produced results described as “a 100% success rate” in “structured test cases to assess the network’s functionality, resiliency and data privacy”: a report this month asserted that “regulators could view in ‘real time’ a wide range of financial events including trade details, counterparty risk metrics, and exposure to reference entities.”
With computation dispersed, with networks transporting dynamic intelligence instead of just static petabytes, and with truth in processing too broadly shared to be vulnerable to any single attacker, we can start to think about “applications” as the experiences we enable – instead of just the paper forms that we digitize, or the simple workflows that we automate. Instead of going to an Uber app to ask for a car, we’ll start to see an Uber button in any context in which we might realize we need to go somewhere – or have a bot make us a contextually appropriate offer to get us there.
This is not your big brother’s app dev of wrapping a data silo in some business logic, decorated with a user interface. This is experience design, delivered with the person and the community as the focal entities, and with people’s devices becoming relevant only as dynamic portfolios of interacting views. It’s inside out, upside down, and immensely valuable to do.