Readers know I've been perpetually disappointed by the lack of creativity in virtual events. As a rule, the bigger the event, the more passive/impersonal the so-called "experience."
I wasn't exactly looking forward to Web Summit 2020 - about as massive a tech event as you can find. But the organizers proved me wrong (Web Summit already has their Lisbon 2021 site up for next November).
Web Summit managed to pull off a virtual event at scale, while creating small niches and networking opportunities that stirred the pot, in the best of ways. I was so focused on the rare chance to have vigorous roundtables with media colleagues - and network with fellow attendees across the globe - that I missed some notable sessions.
One was Open web = open world?, where a panel of tech luminaries took on a volatile/essential topic. Change is afoot. Big tech is falling under regulatory pressure, and social media supersites are taking unprecedented heat for their algorithmic/editorial/data decisions. But that doesn't mean a better Internet magically emerges.
If anything, I'd argue that online supersites are doing more to divide society than to create a democratic/open model. Instead of a foundation for dialogue, we've built an algorithmic dystopia of obsessive consumerism and political radicalization, with filter bubbles hostile to any fact that undermines their divisive premises. Boy, I need some holiday cheer, don't I?
Dries Buytaert, founder and CTO of Drupal and Acquia, weighs in
No one better to talk to than Dries Buytaert, founder and CTO of both Drupal and Acquia (the latter billed as the "open digital experience platform for Drupal"). Buytaert, who appeared on the Web Summit panel, is not what I'd call a naive optimist. Nor was it his job to cheer me up. But talking to someone who acts on these issues is always heartening. So, after Web Summit concluded, Buytaert and I did just that. How did the panel go? As Buytaert told me:
It was a panel discussion with Matt Mullenweg, the WordPress founder, and Mitchell Baker, who is a founder and CEO of Mozilla. All three of us have been doing open source and open web-related things for a couple decades. It was awesome to talk to them about the future of the web, and why it's important to fight for the openness of the web.
One pressing issue: push back against the dominance of the walled gardens. Buytaert continues:
A bit of the conversation was around, "Can the open web even still win?" There's such a strong force with the Facebooks, and the Googles - these handful of walled gardens. So much of the web has moved within those walled gardens right now. In general, the three of us felt that is a shame, because you lose so much. You lose so much of the serendipity and the creativity - because every Facebook page looks the same, right? Versus every website can look different.
Then there are the privacy implications, and the perils of trading convenience for data. I tried out my "web citizenship" stump speech on Buytaert. I believe that the algorithmic dumbing down of the web will continue, as long as individual users expect to be spoonfed whatever engages them in the most easy-reader interface possible.
The pervasive influence of big tech - do we need algorithmic oversight?
One hallmark of the classic open web: we took on the obligation of being educated super-users. We sought out spirited debates; we searched out obscure websites that resonated. We logged in and posted thoughtful blog comments, rather than compulsively retweeting something nice about our employers. Just as voters need to be well-informed for democracy to work, an open web depends on true citizens of the web - not just rabid consumption of products and video bites. Buytaert's reaction?
I haven't thought about it that way, but I do like that analogy. It's almost like the open web is the commons. And, you know, just like with any commons, we all have to contribute to the maintenance and the well-being of the commons. I think that's what you're getting to with citizenship, and so that definitely resonates with me. We all have to do our part to keep our shared worlds safe, secure, fun, and healthy.
Buytaert sees historical parallels: as a commons grows, it becomes difficult for volunteers to maintain at scale. Government support can falter or fade; commercial alternatives rise. Now we're seeing the dark side of online empires that exceed the power of most governments. And that's where Buytaert diverges a bit from my emphasis on personal responsibility. He's got a different question on his mind:
One thing that came up in our panel discussion was: do we need algorithmic oversights? And it ties back into: should the government have a role in this? And my stance was: yes. Because this is not new. Obviously, Google and Facebook, they're feeding content to the masses, as you pointed out, but the risk is in the bias in their content. They're actually able to, you know, manipulate is maybe too strong of a word, or maybe not.
But they're able to steer society's mindset based on what the algorithm actually decides. And there's a lot of examples of that already, right? It's more than the web too. People go to jail because of the results of DNA tests, which is software algorithms analyzing your DNA. How do we know that these DNA tests are correct? How do we validate that these algorithms that impact society, that they are correct with a capital C?
Buytaert used a historical analogy:
The parallel I made was with the Food and Drug Administration. If you go back a couple hundred years, everybody could literally make medicines in their garage, and sell them to other people. Obviously, that was a bad idea. And so the government stepped in and created the Food and Drug Administration, or similar organizations in Europe. And they provided some governance around it to make sure that the impact on the rest of the country wasn't a disaster.
One important/tricky point that Buytaert is making here: he isn't just talking about regulating big tech companies. He's talking about some type of oversight over the algorithms themselves.
This begs the question, should we do the same thing with algorithms? I think our algorithms are comparable to food and drugs, in that they do need some oversight. Then I suggested to the panel, if you look at what the FDA does, if I buy a chocolate bar, I can literally see on the back of the chocolate bar - there is a nutrition label. In ten seconds, I can get a sense of, "How bad is this thing?"
Yes, we grant websites access to our data, but we do it via elaborate terms and conditions that a layperson would struggle to wade through. Buytaert asks: is there a more transparent solution?
Should we have a similar kind of system for algorithms, where I can actually go to a site, and I can see in ten seconds, "Here's how my data is being used?" Right now, you have to go through these thirty-page user policies.
How do we spur organizations to action?
I'm pretty hostile to walled gardens, and to algorithms that exert outsized control over what we see and consume. But demonizing big tech is simplistic. It does a disservice to how these problematic structures are built - often with good intentions, though I'd argue the profit motive has spurred the worst aspects of viral content, alongside whatever genuine innovations emerged. As Buytaert puts it:
That's the hard part about this. The first two points would seem like bashing these big technology giants a little bit, but I don't think they operate from malicious intent. For example, there was an article a couple years ago about how Google search results were borderline racist.
I don't believe that's what Google wants. I'm sure they were ashamed of that, and that they were quick to address it. So again, I don't want to bash them... These organizations have done so many great things - Twitter, Facebook, Google... I think Google and the others have done more good than bad. So again, I don't want to be the guy that's going negative Nelly about these companies, because we need a nuanced view here. You don't want to go back in time, either. We want those global platforms that give people a voice - that's important.
Okay, that's the predicament. But besides regulatory oversight, what else can be done? And what is the role of organizations - such as those that read diginomica? Given Buytaert's extensive history with open source via the Drupal CMS, it's no surprise he sees a role for open source here.
We're big proponents of open source, obviously. Acquia also has a customer data platform, which is actually relevant in this, because it allows organizations to manage GDPR, for example, and put in place good data retention policies.
If your enterprise wants to be part of the solution, start by tackling user data transparency:
Most organizations are kind of a mess when it comes to their data. For organizations, the first good step is to get control over all the data you have, and then have the tools to help you manage your data. So we're trying to be an enabler. I mean, it's up to the companies themselves, obviously. We can't tell them what to do, but we can give them the tools to be better. What's the word? Better maintainers of data.
My advocacy of web citizenship stems from this: we can't be passive bystanders, waiting for benevolent supervisors or watchdogs to intervene.
But the role of regulation must be reckoned with. An individual can't do anything about a discriminatory algorithm. Regulating algorithms brings a slew of complicated legal and intellectual property problems - a reality not lost on Buytaert. I worry about regulation. I worry about the digital literacy of lawmakers (the notorious technical ignorance that's tainted the debate over security backdoors being one glaring example). Buytaert supports a regulatory push, but he acknowledges the concern:
I do feel like our governments have let us down on this; they're late to the party. We've been talking about this for so long, and I expect more. It's also why, like you, I'm a little bit nervous. I don't know if that was the right word, but I'm worried about that government influence, because they don't have the best track record so far.
I also worry about ideological groups abandoning the giant sites to form their own extremist filter bubbles. That's not the open web either. If one aspect of the open web is embracing that "super-user" ability to seek out information, the other is intellectual curiosity - the willingness to challenge your own positions. If Facebook splintered into a handful of ideologically homogeneous filter bubbles, that wouldn't help society at all.
It's not a pretty picture, nor one of holiday cheer. But it's good to think about how we can bring ethical sensibilities into our work aspirations - rather than having irrelevant side conversations while the sand slips through the hourglass. Yes, politics ruins the water cooler. But if our organizations can impact these issues, perhaps that can be a motivating force. Hopefully we can spark that motivation on these pages in 2021.
Updated December 17, 7:00am PT, with minor tweaks for reading clarity. I plan on adding a few more links to the piece as well.