Communications regulator Ofcom faces multiple challenges in making the UK the ‘safest place in the world to be online’

Derek du Preez, February 21, 2024
Summary:
According to MPs on the influential Public Accounts Committee, Ofcom has a delicate balancing act ahead of it as it grapples with the practicalities of the Online Safety Act.

(Image by Robinraj Premchand from Pixabay)

One of the British Government’s key pledges is to make the UK the ‘safest place in the world to be online’, following the introduction of the Online Safety Act in October 2023. The legislation is controversial: proponents hope that it will lead to a safer environment for children who are consistently exposed to harms online, whilst detractors argue that it is unworkable and fear that it could open the floodgates for the government to insert workarounds to encryption. diginomica analyzed the likelihood of the Act’s success in a recent piece, after a number of children’s champions spoke out about what needed to be done. 

And this week MPs on the influential Public Accounts Committee (PAC) outlined in a report both the opportunities ahead for the UK with the introduction of this new legislation, and the practical hurdles that lie ahead for communications regulator Ofcom. As the report notes, 68% of child internet users in the UK (aged 13–17), and 62% of adult users (aged 18+), indicated that in 2022 they had experienced at least one potential online harm in the previous four weeks. Harmful content could include anything from child sexual abuse material and terrorist content to online fraud and the encouragement of self-harm.

The Act itself places new duties on search engines, firms which host user-generated content, and providers of pornographic content, with the aim of minimizing the extent of illegal content and content that is harmful to children. Providers failing to meet these new duties will be accountable to Ofcom, in its new role as the UK’s online safety regulator. 

However, the PAC report highlights a number of areas that Ofcom will find challenging in its attempts to make the Online Safety Act workable, including: funding, managing the public’s expectations, and introducing new automated systems in order to help it deal with complaints. 

Commenting on the Committee’s findings, Dame Meg Hillier MP, Chair of the Committee, said:

Expectations are understandably high for firm guardrails in the hitherto largely unregulated online world. We know that around two thirds of UK children and adults say they experienced at least one potential online harm in a month in 2022, according to Ofcom, which is to be commended for how swiftly it has moved to take on its new responsibilities.  It must now continue to be proactively frank with the public over what the Online Safety Act does and does not empower it to do, lest confidence in the new regime be swiftly undermined.

Firm detail on fees for industry, enforcement, automated monitoring and a range of other issues must now be locked in. No other country has introduced equivalent online safety regulation. Ofcom now needs to capitalize on its early progress. It must also accelerate its coordination with other regulators both at home and overseas, in the recognition that it is at the forefront of a truly global effort to strike the right balance between freedom and safety online.

A difficult path to online safety

Whilst the Committee highlights that the UK is world leading in the nature of this regulation and its ambitions, it does a comprehensive job of pointing to a number of areas where Ofcom will struggle to achieve what it has been tasked with. 

For instance, whilst praise is given to the regulator for preparing in advance of the legislation and quickly implementing systems to tackle key problem areas - such as illegal harms and protecting children from pornography - the Committee notes that, according to its own roadmap, as of December 2023 Ofcom had only issued 10 of the 54 documents it needs to produce to implement the full regime. 

Further to this, full implementation has slipped by a year from 2025 to 2026 (a good indicator that things could slip further, particularly given we are in an election year and who knows what will happen if there’s a change in government). 

Second to this, but closely related to the first concern, is that there’s a good chance that the public may be disappointed (and grow disillusioned with the whole agenda) if people cannot quickly see improvements to their online experience or understand how their complaints are acted on. 

The PAC explains that the regulatory regime will not be fully implemented until 2026, which means that there is a risk to public confidence if there are not tangible changes to people’s online experiences. Part of the problem here is that Ofcom has yet to implement a system that provides feedback to individuals who complain to the regulator about illegal or unsafe content they discover online. 

Individuals are required to complain to service providers in the first instance, but if they remain concerned that’s the point at which they can complain to Ofcom. A contact center has been set up to review such complaints, but as yet it is not able to act on them individually. Instead, Ofcom will review complaints alongside its normal monitoring data to decide if further action is needed - but even if action is taken, it doesn’t have a feedback mechanism to let individuals know what impact their complaint has had. 

Equally, Ofcom estimates that there could be 100,000 or more service providers subject to the new regulation, with many of these being small businesses and/or based overseas. Ofcom is planning to use automated processes to identify and collect monitoring data on the compliance of most service providers, but doesn’t yet have these processes or systems in place (so the plan to do so is still very much theory). 

Ofcom has had some early success by engaging with an overseas service provider directly. For example, a service provider that hosts content encouraging suicide agreed to block site access for UK users after being contacted by the regulator. But if an overseas service provider decides not to comply, the situation becomes more difficult for Ofcom, which must then rely on its powers to fine (including up to 10% of a company’s global revenue) and other enforcement powers. 

The Committee is urging Ofcom to urgently finalize its automated compliance monitoring systems and clarify its enforcement approach with service providers where engagement has not proved possible. 

According to the report, Ofcom also has a funding challenge, having not worked through the detail of how fees levied on industry will work, including how it will recover the set up costs and cover the ongoing costs of the regime (quite critical, you’d think, if it wants to make a success of the programme). The PAC states: 

Delays to the Online Safety Bill’s passage through parliament mean that introduction of the fee regime has been pushed back from 2025–26. Fees will now begin in 2026–27, covering Ofcom’s ongoing costs, with recovery of set-up costs starting in 2027–28. Until the fee regime starts, Ofcom’s costs for online safety will be met from existing arrangements whereby costs are funded from Wireless Telegraphy Act 2006 receipts which would have otherwise gone to the Exchequer. 

Ofcom has yet to establish the details of how the fee regime will operate, such as the payment thresholds, fee structure and period for recovery of set-up costs. Establishing details of the regime is set to take some time and Ofcom may not recover all its set-up costs until 2032–33. In establishing the fee regime, Ofcom aims to balance competing objectives of fairness and proportionality, administrative efficiency, and recovery of costs-only. It also plans to provide transparency to industry about the annual fees.

My take

As is often the case with ‘big idea’ legislation, particularly as it relates to regulating the Internet, more often than not the government campaigns for some voter-friendly ideas (e.g. protect children online), without thinking through the practicalities of how that will actually work. That’s now Ofcom’s job and one that will involve steep, practical challenges. 

No one is arguing that the intentions of the Online Safety Act aren’t good or noble, but regulating content online is notoriously challenging (particularly when carried out in isolation, without a global/regional consensus). Whether or not service providers based in the US will listen to the UK, which is no longer part of a much larger trading bloc, remains to be seen. 

And it’s becoming clearer that this challenge will only intensify as we experience more AI-generated content online, with deep fakes becoming easier and cheaper to produce. 

I’d argue that Ofcom’s biggest opportunity is to work with other international regulators and inspire them to collaborate with the UK and align on values and principles. Working together to tackle harmful content online - a good goal - will be easier with others lined up behind you.

 
