British Government launches ‘scaremongering’ Saatchi PR blitz over end-to-end encryption concerns

Derek du Preez, January 18, 2022
Summary:
Privacy campaigners argue that the government is trying to scare people into giving up valuable end-to-end encryption features that support online privacy.

(Image by Ramon López Calvo from Pixabay)

The British Government's PR campaign with M&C Saatchi, which targets plans that would remove back-door access to digital messages, is clearly seeking to play to the public's worst fears. It launches with a website featuring a ticker claiming that 14 million suspected child sex abuse reports could be lost every year if social media companies implement end-to-end encryption across all their online chat features.

The move is thought to be in response to Facebook's (now Meta's) plans to deploy end-to-end encryption across Facebook Messenger, bringing it in line with other messaging apps such as WhatsApp (also Meta-owned) and Signal. 

The M&C Saatchi campaign, which is backed by the Home Office, is thought to have cost taxpayers £534,000, according to a freedom of information request by Rolling Stone. 

End-to-end encryption (E2EE), however, plays a critical part in the online security toolbox: only those who send and receive a message can read its contents - not governments, law enforcement or the app owners. Privacy is at its core, which is why it has broadly been popular with the technology industry, privacy advocates and consumers. 

The best way to think about it is the Post Office analogy. In the physical world, where letters are sent between two parties, the ‘sorting office' becomes a valuable place for government and law enforcement to identify bad actors (child abusers, criminals and so on). In the online world, however, this ‘sorting office' is no longer a physical location where security checks and balances can be introduced - it sits in the digital sphere and would be open to interception by cyber criminals the world over. 

In other words, if there is a back-door into the digital sorting office for governments (with all the good intentions in the world) to catch child abusers, that back-door exists for *everyone* to take advantage of. That is why E2EE is seen as the better option - and there is currently no good technological alternative that caters both to the privacy needs of users and to the concerns of governments wanting to intercept criminal communications online. 
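To make the principle concrete, here is a minimal sketch using the PyNaCl library (purely illustrative - the library, keys and message are assumptions for this example, not anything referenced in the campaign). Alice encrypts with her private key and Bob's public key; only Bob, holding his own private key, can decrypt. Anything intercepted in transit - the digital ‘sorting office' - is just scrambled bytes.

    # Illustrative sketch only, using PyNaCl (pip install pynacl).
    # Alice and Bob each hold a private key; they exchange only public keys.
    from nacl.public import PrivateKey, Box

    alice_private = PrivateKey.generate()
    bob_private = PrivateKey.generate()

    # Alice encrypts using her private key and Bob's *public* key.
    sending_box = Box(alice_private, bob_private.public_key)
    ciphertext = sending_box.encrypt(b"Meet at the usual place at 6pm")

    # In transit, the platform (the digital 'sorting office') sees only this:
    print(ciphertext.hex())  # scrambled bytes, unreadable without a key

    # Only Bob, holding his private key, can recover the message.
    receiving_box = Box(bob_private, alice_private.public_key)
    print(receiving_box.decrypt(ciphertext))  # b'Meet at the usual place at 6pm'

    # A third party holding the ciphertext but neither private key cannot
    # decrypt it; attempting to do so with the wrong key raises CryptoError.

The point the sketch illustrates is that there is no middle box holding a spare key: the service provider never has the material needed to read the message, which is precisely what any ‘lawful access' mechanism would have to change.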

The highly emotive campaign is being supported by a number of charities and organizations, including Barnardo's, The Lucy Faithfull Foundation, The Marie Collins Foundation and SafeToNet. The campaign website currently states: 

Child sex abusers use social media platforms to exploit children and share images and videos of children being abused with other offenders.

Right now, some social media companies can detect child sexual abuse material being shared on their platforms and report it to law enforcement. This plays an important part in stopping child sex abusers, and these companies deserve to be praised for this.

But some are planning to introduce end-to-end-encryption, which scrambles messages so that only the sender and receiver can see what is being shared.

This means they will no longer be able to detect child sexual abuse on their platforms and therefore won't be able to report it.

If these plans go ahead an estimated 14 million reports of suspected child sexual abuse online could be lost each year. This could have a catastrophic impact on child safety.

The campaign itself is urging social media companies to confirm that they will not implement E2EE until they have the technology in place to ensure children will not be put at greater risk as a result. It states that it is not opposed to E2EE, as long as it is implemented in a way that does not put children at risk. 

However, whether that is technologically possible remains to be seen (heavyweight security experts suggest it probably isn't, but more on that later). 

In October 2020, a joint statement was also signed by the governments of the UK, Australia, Canada, India, Japan, New Zealand and the United States that called on tech companies to work with governments to find solutions to this challenge. It states: 

We, the undersigned, support strong encryption, which plays a crucial role in protecting personal data, privacy, intellectual property, trade secrets and cyber security. It also serves a vital purpose in repressive states to protect journalists, human rights defenders and other vulnerable people, as stated in the 2017 resolution of the UN Human Rights Council. Encryption is an existential anchor of trust in the digital world and we do not support counterproductive and dangerous approaches that would materially weaken or limit security systems.

Particular implementations of encryption technology, however, pose significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children. We urge industry to address our serious concerns where encryption is applied in a way that wholly precludes any legal access to content. 

The governments in question called on technology companies to:

  • embed the safety of the public in system designs, thereby enabling companies to act against illegal content and activity effectively with no reduction to safety, and facilitating the investigation and prosecution of offences and safeguarding the vulnerable

  • enable law enforcement access to content in a readable and usable format where an authorisation is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight

  • engage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions

However, these requests for access undermine the fundamental principles of E2EE as they currently stand (even the technology companies themselves can't see the content of messages sent using E2EE). 

‘Technological cakeism'

Unsurprisingly, privacy campaigners have come out strongly against the Home Office's latest PR blitz. The Open Rights Group, for example, labels it a ‘scaremongering campaign' and argues that undermining trust in secure messaging will make the public less safe - rightly pointing out that weakened encryption helps online predators, criminals, blackmailers and scammers. 

Jim Killock, Executive Director of the Open Rights Group, believes that the M&C Saatchi campaign is being used to soften up public opinion ahead of amendments to the Online Safety Bill, which he argues would allow the government to install backdoors into E2EE messaging apps. Killock said: 

This crass campaign shows how desperate the government is to scare people into supporting the ill-conceived Online Harms Bill. If the government weakens encryption, it will only help predators, criminals, blackmailers, and scammers. The Online Safety Bill is not designed to prosecute criminals, but to delete egregious materials online. Their campaign is a shameful distraction tactic and wholly misleading.

The Internet Society's Director of Internet Trust, Robin Wilton, also told Rolling Stone magazine: 

The Home Office's scaremongering campaign is as disingenuous as it is dangerous. Without strong encryption, children are more vulnerable online than ever. Encryption protects personal safety and national security.  What the government is proposing puts everyone at risk.

However, at the core of the issue is the technology itself. The most balanced response I've read on E2EE comes from Ciaran Martin, Professor at the University of Oxford and former Head of the National Cyber Security Centre. In a paper published on the topic, Martin argues that when it comes to E2EE, a compromise is very difficult to find, likening the government's demand for both strong security and guaranteed access to ‘technological cakeism'. He states: 

One of the legitimate points about those concerned about end-to-end encryption is the fact that decisions which could affect huge swathes of intelligence and law enforcement capability is decided by the governing committees of gigantic American companies rather than national democratic legislatures. 

I have no answer to that, other than to say it cannot be dealt with other than as part of the wider debate about the power of Big Tech.

But let me, instead, turn the argument on its head for the purposes of the encryption debate.

The government is, in effect, demanding that the tech industry does something to keep access open (at the same time, of course, as demanding the highest possible levels of cyber security). The industry, backed by most of the relevant expertise, is saying that what the government is demanding is simply not possible.

Some experts say, in effect, that the government is arguing not against a policy decision, but against mathematics. The government's response is simply to assert that no, you are wrong, it is possible, and you should go away and do it.

Surely though, the onus is on the government, not the industry, to set out clearly and transparently how they believe these two seemingly irreconcilable objectives can be met in the same regulatory package?

As 2022 approaches, surely there are better ways to spend the time than ordering Facebook not to emulate the rest of the industry. Instead, surely they should set out detailed technical options for scrutiny and debate about how the two objectives can co-exist across technology as a whole, and try to win support for them?

In the end, Martin's personal view is that, despite sympathy for intelligence and law enforcement, he "cannot see how we, as a free, open and increasingly digital dependent society, would gain from such a decision". He adds that "if a suitable technical compromise solution that commands widespread industry and expert confidence cannot be reached, then security must win, and end-to-end encryption must continue and expand, legally unfettered, for the betterment of our digital homeland". 

My take

Emotive campaigns to win over the public through fear are incredibly patronizing - the public are more than capable of understanding both sides of this argument. As Martin states, throwing out E2EE isn't the answer in an online world that needs greater privacy, security and protection. The risks are too high. 
