It wasn’t much fun in ’21, but there’s plenty to do in ’22
- Peter Coffee of Salesforce digs into the roots of technology culture norms, and looks at the potential for change this year.
What's strange about the word "technology" is that it's an "ology" that people seem to think is about the objects that people make – when it's far more interesting as the study of the behavior that leads people (and groups of people) to make those things.
Think about it: if I showed you a picture of a pile of rocks, would you say "Oh, that's a geology"? If you visited a zoo or a botanical garden, would you say you had spent the day "at the biology"? And yet, people will call a microchip or a space-based telescope examples of "technology" – when really, the objects only become interesting when they enable (and to some extent, induce) behavior change.
Commit non-random acts of technology
There are two kinds of behavior in question: we might loosely classify them as causes and effects. First, there's the behavior that makes up what I'll call "acts of technology." This follows from what I wrote here on diginomica almost four years ago, when I wanted to explore our obligation (as "techies") to pass along the "maker" capability – and inclination – to a next generation. (I'm using that word "maker," sometimes seen in the phrase "maker culture," in the sense of Tim O'Reilly and Dale Dougherty's Make magazine – first published in 2004, and called "a central organ of the maker movement" by The Economist in 2011.)
Is it easier today to make things? Certainly, there are now many more people who can afford to acquire or rent a 3D printer, CNC machine, or other once-industrial equipment – now both cheaper and scaled down to backyard-shed (or even desktop) size.
There would be no "culture," though, and therefore little basis or scope for an "ology," if we did not also see an emerging and thriving community of behavior around those machines (as well as others like the Arduino micro-controller). "If you are not sharing your designs, you are doing it wrong," The Economist reported in a quotation attributed to MakerBot's then-CEO, Bre Pettis.
Sharing, it must be clearly noted, does not always mean giving it away: "Arduino lets other firms copy its designs, for example, but charges them to use its logo," The Economist observes. Where does this lead? That question introduces a second set of behaviors, the effects of such acts of technology.
Pandemics and platforms and processes
Industrial economies used to tend toward a monolithic design: ownership of patents attracted capital investment for factories, leading to concentration of market share among a small group of producers with high barriers to entry. What we're seeing now is a massive decentralization, where capital barriers are low; much of the intellectual property is non-proprietary; and well-defined interfaces, both hardware and software, encourage platform economies in which many players can each develop distinctive ways of adding value to each other's work.
These trends are amplified by the fact that they are unfolding as we enter another pandemic year: "I didn't realize 2020 was gonna be a trilogy," someone observed on Twitter as last year was ending. As people move beyond short-term accommodations, and start to think about enduring changes, we should notice that many of the widely noted attributes of "next normal" – reduced inclination to come to an office to work, greater readiness to explore alternatives to traditional employment, reduced reliance on traditional credentials and degrees – are aligned with "maker culture" capabilities and interests.
Please don't think I'm saying that this is an inevitable process. This does not just happen merely because it can. It's a set of behaviors that needs to be nurtured, not merely enabled. The inclination to make is a precious thing, and it is not being introduced and energized as well as it might – and I get triggered when I see this challenge going unrecognized.
I'm not a fan, for example, of the phrase "digital natives" when it's used to suggest that people who grew up with the Internet are ipso facto better equipped than their elders to grasp and grow what's yet to come. It seems to me that there is an obvious counterargument: that those "natives" grew up with an opportunity to consume, without needing to participate in constructing, the mechanisms and the content of the technology that's deeply embedded in their daily lives.
When the machine wakes up
The simplest example is the very first encounter that a person has with a personal computing device: Apple II and first-generation IBM PC machines booted directly to a BASIC-language interpreter, effectively waking up to ask, "What do you want to program me to do?" Today's machines, if they can be said to have a wakeup question, ask the user something more like "What do you want me to show you?" or "Whose ideas do you want me to download and demonstrate?"
It is utterly amazing that there are websites like WolframAlpha that can answer a question like "What are the odds of getting a full house in poker?" – showing you what they think that question means, as well as more than one version of a solution. At the same time, though, it's a challenge to educators to figure out how to develop the mental muscles needed to answer the next generation of questions, for example those yet to be properly asked (let alone answered) about how to undertake computations on quantum processors.
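That poker question, as it happens, is a nice self-contained exercise in exactly the kind of counting reasoning worth passing along. Here is a short Python sketch of one standard solution – pick the rank and suits of the triple, then of the pair, and divide by all five-card hands:

```python
from math import comb  # binomial coefficient: "n choose k"

# Count the full houses: pick a rank for the three of a kind (13 ways)
# and 3 of its 4 suits, then a different rank for the pair (12 ways)
# and 2 of its 4 suits.
full_houses = comb(13, 1) * comb(4, 3) * comb(12, 1) * comb(4, 2)

# All possible 5-card hands dealt from a 52-card deck.
total_hands = comb(52, 5)

print(full_houses, total_hands)            # 3744 2598960
print(f"{full_houses / total_hands:.6f}")  # 0.001441
```

That works out to roughly one full house in every 694 hands – the sort of answer a person could once only reach by building the reasoning themselves.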
An "ology" is a "doctrine, theory, or science" – but people seem to be noticing, lately, that we have been letting technology's outputs happen to us, instead of coming to thoughtful and equitable agreements on how we want them to serve us. We won't change that situation by following paths of least resistance that lead to bigger, faster, or even more "intelligent" or "quantum" versions of the systems and processes that we're used to thinking are the ways that stuff gets done.
What kinds of change are we talking about? Stewart Butterfield, writing in 2013 as the co-founder of a startup called Tiny Speck – which became Slack, which became bigger, which got bought by my own employer Salesforce last year – described that challenge to his team, saying, "We are asking a lot from our customers. We are asking them to spend hours a day in a new and unfamiliar application, to give up on years or even decades of experience using email for work communication (and abandon all kinds of ad hoc workflows)…to switch to a model of communication which defaults to public; it is an almost impossibly large ask. Almost." If we're not making "almost impossibly large asks" of ourselves and our teams, how is anything amazing supposed to happen?
Let's make something of this
As we enter 2022, I invite the readers of diginomica to dig into the roots of "technology" and commit intentional acts that enable change – and then to take on the far greater challenge of asking for, and achieving, that change for themselves, their organizations, their customers and their communities.