Regulation or hesitation – how to manage the internet? The story so far...
- Summary:
- A wide-ranging debate in London raises questions that apply on a global scale.
That was one of the themes debated by a panel of academics, business leaders, and policymakers at a Westminster eForum event in London, ‘Next Steps for Online Regulation’.
One discussion stream looked at the road travelled so far and at what progress, if any, has been made. It was chaired by Baroness Kidron, Member of the House of Lords and Chair of the 5Rights Foundation, an organisation that articulates children’s rights online and believes that kids remain children until they reach maturity, not merely until they receive their first smartphones.
So what are the issues?
Big Tech rising
We’ve all become aware of the power of a small number of online platforms over every aspect of our lives. So much so that hostile nations and so-called bad actors have sought to use them to hack our democratic processes. Or are they just accessing the same open platforms legitimately, to express alternative views? Increasingly, systems that seek to bring clarity to our lives are stranding us in an ethical fog. Baroness Kidron observed:
Rather than ‘on and off’, we live in a world where we are augmented and mediated by technology. [...] Even those who are offline could rightly claim that their lack of access is a defining aspect of their lives. [...] The power imbalance between the gatekeepers of the digital world and the public is profoundly unhelpful for society and for the economy.
In short, the stranglehold of a few companies over public discourse is undermining social norms, she said, which is why we need a “robust authority” to anticipate the future.
Until the digital world retrofits UN-defined principles such as children’s right to privacy onto its platforms, it is not fit for purpose, she added. Indeed, she believes the mega-success of a handful of companies is a roadblock to change and innovation in this field, echoing a view expressed recently by the Web’s prime mover, Sir Tim Berners-Lee.
Elaine Quinn, Director of Corporate Affairs at Nominet, joined Kidron in lamenting the loss of TimBL’s vision of an open, fair, and equitable Web, and said that tackling the problem demands collaborative solutions. Kidron added:
Nonetheless, it is already a victory that we are gathered here today, convinced that something should be done. We need to be ambitious and creative in carving a path to a world we want, particularly for the flourishing of children.
Andrew Puddephat, Chair of online safety charity the Internet Watch Foundation, made an impassioned plea to protect children online, explaining that the horrific abuse of minors for adult gratification needs to be tackled in new, more interventionist ways. Instead of looking at why illegal porn exists, authorities need to consider why men – up to 100,000 of them in the UK alone – seek out abusive material.
There is “a problem with men” and how they behave online, he said. The solution is to disrupt demand, rather than try to stop content being produced to satisfy it, Puddephat suggested. ‘Counselling avatars’ that explain the harm caused by looking at illegal material could be part of the solution. Illegal porn aside, he added:
It’s not for us to decide what members of the public should and shouldn’t see. We should cherish and value the internet. For me the question is how do you manage the sewage? [...] I see no reason why internet companies shouldn’t be regulated. But regulation should be outcome-based and not tell companies what to do, and companies should provide a mechanism and accountability for how they are fulfilling those challenges. [...] But unfortunately, you often penalise the good actors and let the bad actors escape.
In an angry aside, Puddephat claimed that the IWF’s attempts to identify abused minors and trace the source of illegal material are now being hampered by GDPR – spurring Kidron to defend the regulations.
Europe waves goodbye
A planned highlight of the conference was a keynote from Julian King, European Commissioner for the Security Union, but in the event he was unable to be there in person. Instead, he appeared – impeccably dressed and with EU flags fluttering behind him – in a pre-recorded video from Brussels, in which he set out the case for forging a united, secure Europe online, with “credible disincentives” for bad actors, and moves to improve law enforcement on electronic evidence, cyber sanctions, and cyber attribution.
But as videos of terrorist atrocities become clickbait for publishers and other attention-seekers, many people now ask where the dividing line is between content that’s illegal and content that’s ‘merely’ harmful – especially to children and other vulnerable people. (Aren’t we all vulnerable?)
Nominet’s Quinn observed that legal but harmful content is much harder to deal with than material that breaks the law when it’s created or distributed. The challenge is how to define what’s unacceptable: is the duty of care loose or specific, and is it a matter for the public or private sectors?
Yet terrorist content is widespread and has had a role “in every attack on European soil”, continued King. So far, internet companies’ attempts at self-policing have not made sufficient progress, so Europe moved last autumn to force platforms to take down such content within an hour of it appearing. Member states need enforcement powers to ensure this happens, he said.
What’s the intent?
So should we censor or censure? Or neither? And what about context and intent? After all, one man’s search for violent titillation online may be another’s legitimate research into human rights abuses. Any URL might reveal something shocking, but not why it was viewed or published in the first place, which is a real challenge for blanket surveillance systems.
But some are actively using the internet to subvert democracy, said King; in his view, there is now a clear and established pattern of attempted influence on our democratic processes. As a result, we need to better protect elections, and set up a rapid alert system online to map and react to organised attacks.
Genuinely fake news abounds, and internet platforms must step up and take action on a code of practice against disinformation, while being transparent about sponsored content, establishing clearer rules on bots, and providing greater clarity on algorithms, he said:
It’s not about censure, but about transparency. Unfortunately, Facebook and Google, the information they provide has been patchy and progress has not been sufficient. We’re saying, ‘Come on, you need to go faster and further in these efforts, and to meet the commitments you signed up to in these processes.’ We do need to look at regulation in this area.
Protecting the internet is also about securing our digital infrastructure, he acknowledged – a point also made by Ofcom’s Group Director of Strategy and Research, Yih-Choung Teh, who observed that 70 percent of IoT devices are manufactured in China and lack basic security controls.
Referencing 5G, Huawei, and China, King said the debate shouldn’t be about any single company or country, but about Europe strengthening its digital resilience. Do we want to continue seeing our own cutting-edge technology being sold off? he asked, rhetorically. And do we want to see a single dominant supplier reaching critical mass in any territory? He said:
We are committed to maintaining a digital world that is open, free, and builds in security, rather than bolts it on somewhere down the track.
But the unanswered question is why Europe is failing to hothouse internet giants of its own, rather than trying to break up the ones that already exist.
The question of belief
Another challenge to the concept of an open, global internet is that there is no universal set of human beliefs – an issue largely absent from the conference. In the West, there’s widespread legal acceptance of, for example, gay rights, women’s reproductive rights, gender and racial equality, religious freedom, and atheism. But in other countries, those concepts are actively condemned – and in some instances, even regarded as capital crimes or terrorist acts.
And, of course, we have our own share of people who want to hang and flog others too, and some regard their own propensity for hatred – and their implicit love of ritualised violence – as a bizarre form of patriotism.
In the West, we perhaps believe that our socially liberal, tolerant beliefs will naturally dominate in some utopian future of equality for all. It’s a beautiful idea, but China, Russia, Saudi Arabia, and others, disagree. Indeed, it’s questionable whether the current US government shares those beliefs either, while Brexit has revealed the UK to be far from united on a huge number of issues.
At the centre of all this sit a handful of social networks designed and managed by dysfunctional Californian geeks. Those platforms have become Harry Potter-style sorting hats for people’s beliefs and preferences, with both sides believing themselves to be in Gryffindor, and their opponents to be in Slytherin – a process amplified by algorithms that trawl our 'Likes' to sell us advertising. 'Hey man, let’s all get on,' say the owners, as they walk among the corpses counting their cash.
Some internet companies are pushing customers towards encrypted communications, partly – they say – as a reaction against government attempts to open back doors and access customers’ conversations. Or are they simply deploying encryption to shut the regulators out from their own private business? Answers on a postcard, please.
The big questions are: should private companies be our moral or legal arbiters, when they primarily represent their shareholders? Isn’t the legal dimension the proper function of government? And how can the future be brighter than it looks today?
My take
At heart, this entire debate was about friction. The digital world is all about removing friction from everyday tasks, though many seek out analogue alternatives because they find pleasure in a slower, deeper, more handmade world – and are prepared to pay for such experiences (imagine!).
The challenge for regulators is how much friction can be introduced into online platforms before they break, as we all seek out instant gratification and, increasingly, value content by the speed at which it moves, rather than whether it is accurate, useful, or good for us. We’ve all become lazy button pushers – dumb, pleasure-seeking apes on a disintegrating rock in space.