In a future of unknowables, digital emergence has barely begun

Peter Coffee, June 29, 2021
Summary:
As life moves beyond the pandemic with increased freedoms and capabilities, Peter Coffee of Salesforce ponders what the future holds for technology and the supply curve.

(Image: Question marks © Arek Socha - Pixabay)

Heading into a period that most would agree we can call "post-pandemic", people and organizations will need to make big bets on three unknowables. First, to what degree will people return to pre-pandemic behaviors (for example, going on cruises and attending live events)? Second, to what degree will pandemic acceleration yield lasting change (for example, ordering on-line for curbside pickup rather than shopping in retail stores)?

Third and most interesting, though, is the unknown of what new behaviors will emerge when the pandemic's step changes (connection capacity for secure remote work, acceptance of contact tracing and other monitoring activities, connected-device capabilities to minimize risks for field service personnel) are compounded by suddenly increased freedom to go places and do things. What "emergent" behaviors are likely to result from this last category of changes, and what new business models (or creative destruction of old business models) may result?

When I try to share my thoughts on this situation, I have to deal with a growing challenge of my own. After roughly a third of a century of writing and speaking about tech, business, and public policy, I hate to repeat myself — but it's getting harder, year on year, to see something and say something about it without thinking, "Haven't I said that before?" It would be helpful if I could pull out a Men in Black neuralyzer and use it on myself, resetting my brain's loop counter to zero, in hopes of focusing on the context of now — seeing not what's familiar enough to dismiss as "this again?", but what's newly emergent from combinations of established things plus new catalysts.

When I say "emergent", I mean something that's obvious only after it's happened; a behavior that would have been difficult to predict by merely understanding individual ingredients before they were mixed. You could study hydrogen and oxygen in depth, but totally fail to predict the dangerous and deadly properties (and useful applications) of dihydrogen oxide (better known as water). You could know a lot about the behavior of drifting molecules in the near-vacuum of space, but have no idea that those molecules would physically (rather than chemically) interact to give us sound waves when they were in a more crowded setting like a planet's atmosphere. "Sound? Where did that come from?"

In a more complex but more familiar example, you could know a lot about cars, drivers, mobile telephones, and GPS satellites but still fail to predict that Uber and Lyft and uncountable similar services would emerge from their combination. We see now that anyone with access to a car, and knowledge of how to drive it, can almost instantly find paying passengers and take them places, aided by navigation that once demanded laborious mastery of a city's streets and enjoying what would once have been an impossible god's-eye view of its traffic. This is not a predictable combination of the known characteristics and behaviors of the components. It's emergent.

It might seem that our attack on this problem would require ever-deeper insight into possible interactions of anything with everything, but this strategy does not scale. In the "Hunted Earth" novels of science fiction author Roger MacBride Allen, there is a passing reference to a "Knowledge Crash" that's variously defined as (i) time to become usefully educated exceeding employable life span and (ii) cost of acquiring education exceeding resulting addition to lifetime income. We're not there yet — but with rules of thumb like "tuition rates will increase at about twice the general inflation rate", it is only a matter of time. 
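
To see why it's only a matter of time, here is a minimal sketch in Python of how tuition compounds away from general prices when it inflates at twice the general rate. The 2.5% inflation figure and the starting value are assumptions for illustration, not forecasts:

```python
# Illustrative only: the 2.5% general inflation rate is an assumption,
# as is the "twice the general rate" rule of thumb quoted above.
general_inflation = 0.025
tuition_inflation = 2 * general_inflation

real_tuition = 1.0   # tuition measured in inflation-adjusted dollars
years = 0
while real_tuition < 2.0:  # count years until the real cost doubles
    real_tuition *= (1 + tuition_inflation) / (1 + general_inflation)
    years += 1
print(f"Real tuition doubles in roughly {years} years")  # ~29 at these rates
```

At those rates, the real cost of an education doubles about every three decades; keep compounding, and Allen's second definition of the crash eventually triggers.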

Let's look in the opposite direction. As I noted here almost three years ago (see? I've said this before), there's a useful thought experiment about teaching a robot to deal with a complex environment: it yields the insight that superior "intelligence" may actually be founded on mastery of what to ignore, rather than stemming from an exceptional ability to notice more. "What is needed," observes Daniel Dennett, "is a system that genuinely ignores most of what it knows, and operates with a well-chosen portion of its knowledge at any moment."

Thinking, in the sense of applying logic and deduction and inference rules, turns out to be easy to express in code; knowing, in the sense of having enough understanding of an environment to make decisions even possible (let alone wise), is much more difficult; but doing, which depends on quickly (and apparently, unconsciously) ruling out most of the possible actions that one might take, is a research problem we have barely begun to take on.

"I'm hungry." "Have you tried unlocking the back door?" "No, how would that help?" "I don't know, but have you tried it?" If unlocking the back door means that the person who's trying to bring in the groceries from a shopping trip can now enter the house and make dinner, then there's an emergent thing here. How many more of these are waiting for us to find them? How do we prune the search? 

This brings us to the trigger for this whole train of thought, which was a recent comment in a business publication that "every company is now a technology company." I won't bother linking to the specific quotation, because something like this has been said by uncountable people over the past few years. (It's a crude measure, but Google search traffic on the phrase "tech company" has tripled in just the past decade.) Is it accurate, and if so is it useful, to say that we're all tech companies today? Or might that be irrelevant, or even worse, misleading?

I ask this because the opposite of trying to master every imaginable interaction is to zoom out, rather than diving in deeper: to widen the view and see if there are a small number of potent principles, rather than an overwhelming number of possible low-level interactions, that might guide our strategies. When people assert something like "every company is now a technology company," one of my principles leads me to ask "How might that be wrong, and if so why might that be important?"

What is the defining characteristic of a "technology company"? I suggest that it is not the use of technology, or even a dependence on devising distinctive technology to create competitive advantage. I suggest that it's a much higher-level principle, in the spirit of Peter Drucker's observation that innovation is "a term of economics rather than of technology." It's about the shape of the supply curve.

You've almost surely seen supply and demand curves in a basic course in economics — they're usually drawn to show the demand for something decreasing as its price goes up, while the supply on offer grows — and the intersection of the curves shows us the "market-clearing price." Application is subject, though, to both approximation and exception — the originator of these ideas, Alfred Marshall, warned in 1890 that "We cannot guess at all accurately how much of anything people would buy at prices very different from those which they are accustomed to pay for it." In other words, draw the curve with a fat marker rather than a sharp pencil, and don't obsess about its exact shape. Good to know.
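
For readers who want that picture in concrete terms, here is a minimal sketch with invented linear curves; the coefficients are illustrative, not drawn from any real market:

```python
# Textbook linear supply and demand; every number here is made up.
def demand(price):
    return 100.0 - 4.0 * price   # buyers want less as price rises

def supply(price):
    return 10.0 + 2.0 * price    # sellers offer more as price rises

# The market clears where the curves cross: 100 - 4p = 10 + 2p.
clearing_price = (100.0 - 10.0) / (4.0 + 2.0)
print(clearing_price, demand(clearing_price), supply(clearing_price))
# 15.0 40.0 40.0  (quantity demanded equals quantity supplied)
```

Marshall's warning amounts to saying that coefficients like these are trustworthy only near the prices people already pay; far from that neighborhood, the straight lines are a guess.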

As to the exceptions to these curves, some industries follow different rules. The marginal cost of a ton of steel or a barrel of oil goes up with increasing volume, as one is forced to rely on lower-grade materials or less convenient drilling locations. The cost of making one more copy of Microsoft Windows, though, is tiny and essentially constant — while the cost of offering one more Intel chip is determined by semiconductor math, dividing the cost of a chip fab by expected lifetime output, so that the average cost (not just the economy-of-scale marginal cost) per chip may logically go down with growing volume. Fast technology cycles, and flat or even downward-sloping supply curves, drive fundamentally different marketplace behaviors, as I noted in 2005. Darn it, can't I be original?
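
That "semiconductor math" fits in a few lines. A minimal sketch, with a made-up $10 billion fab cost and a made-up $20 per-chip variable cost (neither figure comes from Intel):

```python
# Stylized chip economics: a large fixed cost amortized over volume,
# so the average cost per chip falls as output grows. Figures assumed.
FAB_COST = 10e9           # assumed fab construction cost, dollars
VARIABLE_COST = 20.0      # assumed per-chip variable cost, dollars

def average_chip_cost(lifetime_volume):
    return FAB_COST / lifetime_volume + VARIABLE_COST

for volume in (100_000_000, 500_000_000, 1_000_000_000):
    print(f"{volume:>13,} chips -> ${average_chip_cost(volume):,.2f} each")
#   100,000,000 chips -> $120.00 each
#   500,000,000 chips -> $40.00 each
# 1,000,000,000 chips -> $30.00 each
```

Run the same loop for a steel mill or an oil field, where the per-unit cost term rises with volume, and the supply curve tilts the other way.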

Consider another example of a flat supply curve. Does Uber have to worry about paying more per vehicle if it needs to ramp up capacity quickly? No, because its drivers supply the cars. Does it need to spend more on driver training as it goes deeper into the talent pool? No, because its drivers are licensed up front and app-supported ever onward. Does it pay a surge price for its use of roads at peak hours? You're kidding, right? How flat can a supply curve get? Is this how disruption happens, when flattening the supply curve becomes an opportunity that generates a strategy?

Not everyone gets to experience this. It's dangerous to say, preeningly and self-promotionally, "Behold our modernity, we're a technology company!" Really? Ask WeWork, for example, what happened when it decided it wasn't in the real estate business.

Are you using technology to flatten your own supply curve, creating the kind of competitive advantage that changes the game? And are you thinking hard enough about the root causes, and the non-obvious mechanisms, of the combinations of changes to come — after the pandemic's digital accelerations have had their full effect? I'll be watching for your emergence.
