UK’s Online Safety Bill - not robust enough to tackle illegal content, nor does it protect freedom of expression

Derek du Preez, January 24, 2022
Summary:
MPs on the Digital, Culture, Media and Sport Committee issue a double blow to the government’s landmark Online Safety Bill.


Hailed by the British Government as a commitment to make "the UK the safest place in the world to be online", the Online Safety Bill has been dealt a double blow by MPs on the Digital, Culture, Media and Sport (DCMS) Committee today in a new report that argues the draft legislation is neither robust enough to tackle illegal content, nor sufficient to protect freedom of expression. 

In other words, in its current form, the Online Safety Bill leaves internet users in the UK with the worst of both worlds. And this goes to the heart of the legislation's problem: governing what is harmful online is often subjective, and we need to be very careful about who we give the power to control content moderation. 

There are of course some clear-cut cases - child pornography, for instance - and those need to be addressed. But there is a real risk of scope creep with legislation that intends to monitor all forms of expression online. 

In its current form, according to the Committee, the Online Safety Bill also ignores types of content that are technically legal but often extremely harmful - such as ‘breadcrumbing', where child abusers leave signals online for fellow abusers to find content, or the use of deepfake technology to ‘nudify' images of women online. 

Meanwhile, the current legislation would give the Secretary of State powers to decide what is ‘legal but harmful' - potentially leaving decisions about what should be taken down at the whim of whoever is in charge at the time. The report released today by the Committee argues that some powers given to the Secretary of State should be removed, including the power to modify Codes of Practice and to give guidance to the communications regulator Ofcom. 

Protecting freedom of expression online is critical. As the report notes, this can have consequences for our democratic institutions. It states: 

We recommend that, in addition to the factors listed above, the definition for content that is harmful to adults should be further clarified to explicitly account for any intention of electoral interference and voter suppression when considering a speaker's intentionality and the content's accuracy, and account for the content's democratic importance and journalistic nature when considering the content's context.

As such, the Committee is calling on the Government to introduce a ‘must balance' test that takes into account whether freedom of speech has been adequately protected when decisions to remove content are taken. 

Is it even workable? 

And then of course there are the practical problems too. How does a single nation influence what is and isn't allowed on the Internet? For example, the European Union - of which the UK is no longer a part - is currently developing its own legislation (in the form of the Digital Services Act). If there is too much divergence between the EU's legislation and the UK's legislation, which do you think the tech giants are going to listen to? 

It would be more sensible, in my mind, to take an international, diplomatic approach to creating such legislation, in order to ensure that the UK's intent to moderate illegal content is not ignored. There's a risk, as privacy campaigners the Open Rights Group note, that otherwise the Online Safety Bill will "simply break the Internet for British users". 

However, the Committee does recommend that Ofcom be given the power to conduct "confidential auditing or vetting" of a technology company's systems to assess their operation and outputs in practice. Alongside the power to request generic information about how content is disseminated by means of a service, the government, it notes, should include a non-exhaustive list of specific information that may be requested, subject to non-disclosure, including: 

  • The provider's objectives and the parameters for a system's outputs (such as maximising impressions, views, engagement and so on);

  • Their metrics for measuring performance and references of success;

  • The datasets on which systems are developed, trained and refined, including for profiling, content recommendation, moderation, advertising, decision-making or machine learning purposes;

  • How these datasets are acquired, labelled, categorised and used, including who undertakes these tasks;

  • Data on, and the power to query, a system's outputs, including the ability to request or scrape information on said outputs given particular inputs.

The Committee is also recommending that the online safety regime should require providers to have designated compliance officers, in order to "bake compliance and safety by design principles into corporate governance and decision making". 

This all amounts to the government seeking to get under the hood of how technology companies run their moderation systems, which are likely skewed towards engagement rather than protection. The technology companies in question, whilst making promises of investment, have been incredibly resistant to sharing their system secrets. 

Commenting on the report, Chair of the DCMS Committee Julian Knight MP said:

In its current form what should be world-leading, landmark legislation instead represents a missed opportunity.

The Online Safety Bill neither protects freedom of expression nor is it clear nor robust enough to tackle illegal and harmful online content.

Urgency is required to ensure that some of the most pernicious forms of child sexual abuse do not evade detection because of a failure in the online safety law.

These are matters of important public debate to which we will return as the Bill makes its way through Parliament.

My take

I wrote last week about the government's campaign to sway public opinion towards using the Online Safety Bill to limit end-to-end encrypted messaging services, which highlights how this legislation could creep into important privacy matters. Illegal and harmful content online should of course be tackled, but I'm incredibly wary of granting the government legislative superpowers that don't sufficiently protect freedom of expression. Independent checks and balances need to be introduced, with significant control and power given to individuals. Rushing this legislation through could have untold consequences down the line, and we need to be very careful. 
