
UK Online Safety Act – children’s champions speak out

Chris Middleton – February 15, 2024
Summary:
Child protection is full of heroic individuals who are committed to standing up for young people – often after tragedy strikes. But who or what could give them a louder voice with tech giants?

(Image of a group of teenagers looking at their phones, by Przemysław Trojan from Pixabay)

The murder of trans teenager Brianna Ghey in 2023 shocked Britain and highlighted an increasingly hostile online environment – not just for LGBTQ+ people, but for young people in general, and for anyone who is vulnerable. Brianna was lured to a meeting with her 15-year-old murderers in a premeditated attack, for which they were jailed in December.

Writing in the Guardian newspaper this week, a year on from her daughter’s death, Esther Ghey blamed technology companies for “the mess” of the internet. She said Brianna would be alive today were young people not able to access violent material online, and were platforms like X not welcoming to pro-anorexia accounts and other extreme content (Brianna was hospitalized for an eating disorder in 2022).

The Guardian report added:

[Esther Ghey said] tech bosses were also culpable when it came to the wave of anxiety and mental health problems affecting children, which she said had led to ‘a complete lack of resilience in young people’.

She said tech companies should reflect not just on Brianna’s murder, but also ‘the amount of young people that have taken their own lives’ as a result of their harmful experiences online.

According to the Samaritans support group, four children in the UK kill themselves every week, while US website DoSomething.org reports that 37% of all teenagers have been bullied online – most more than once – a figure that rises to half of LGBTQ+ teenagers. Interestingly, both perpetrators and victims, overall, are more likely to be girls, notes the organization, exploding any belief that such bullying might be rooted in a macho or tech-bro culture.

Unfortunately, the harsh environment for people like Brianna Ghey is increasingly reflected in Parliament. At Prime Minister’s Questions last week, Rishi Sunak was criticized for making a jibe about trans people while Esther Ghey was present in the House of Commons – a moment of appalling insensitivity in support of a cheap political point. 

A week later, the Prime Minister had still not apologized, but did invite Ghey to Downing Street to discuss online safety. Away from an increasingly scabrous Parliament, however, Brianna’s “amazing, unique and joyful” life was celebrated by a 1,000-strong vigil in her home town of Warrington. 

Brianna dreamed of being “TikTok famous”, her mother explained, but she believes social media did her daughter more harm than good, preventing her from finding “her tribe” in the real world. So, without action from US Big Techs, Esther Ghey warns that more children will die or face severe mental health problems, while society becomes “less and less empathic”. 

She told the Guardian:

They’re the ones that have created this technology. They’re the ones that have got us into this mess in the first place. And I think that it’s their responsibility to get us out of this mess. I think that they’ve got a moral responsibility to protect young people and to protect society in general, but especially our young people.

Where does responsibility lie?

One suggestion – made by Ghey and others in recent weeks – is that under-16s should be prevented from accessing social media. A well-intentioned idea that, I would argue, overlooks the fact that for anyone who has grown up with the internet and smartphones, their devices are not mere gadgets, but lifelines to a world of communication and data. Indeed, for most young people, their devices are their networks. 

To withdraw access, therefore, seems likely to be seen by teenagers as adults punishing the young, rather than owning their own mistakes. This would deepen the sense of injustice that many young people feel about the state of the planet, the economy, and their rising debts and uncertain prospects.

Even so, children’s welfare online is front and centre of the UK’s new Online Safety Act, which was the subject of a Westminster policy eForum last week. 

It is fair to say that the Act has not been well received by everyone in the tech community, with fears expressed that it may – ironically – encroach on everyone’s online safety by casting encryption as the enemy in a political drama, rather than the enabler of online trust and secure transactions. Indeed, encryption can also protect vulnerable people, such as whistle-blowers or people in oppressed minorities and communities.

That aside, what was the eForum’s view on the Act’s broad aims?

The first observation from the event is that, while online bullying may not be the preserve of boys and men, women seem to be leading the charge against an IT world that is not only overwhelmingly male, but also seemingly unwilling to act to keep users safe. Eleven of the event’s 14 speakers were women, inverting the usual mix at tech conferences – a welcome change of atmosphere and perspective. 

Very much in the hot seat going forward is Jessica Smith, Principal for Online Safety at the UK’s communications regulator, Ofcom. She said that clear rules for the protection of children online will be set out by early 2025.

Then she explained:

Ofcom is not designated as a content regulator, but instead is the overall overseer of the trust, safety systems, and processes that services now need to put in place in order to meet their duties of care to their users. 

There are three main parts of the regime. The illegal content duties, which govern what services need to do in order to manage the risk of illegal content on their services and protect their users from it. 

There are other duties around content that is harmful to children. This is where if services are likely to be accessed by children – and that includes pornography – then they need to take steps to protect children from content that is harmful to them. 

And then there are the duties that fall on what we call categorized services, which are the largest and also most risky social media and search services, which have particular duties around transparency, and offering more choice to their users about what content they do and don't engage with.

She added:

The way we intend to deliver our objective is by establishing standards for effective risk management and good practice, by driving industry improvements through engagement, and by having our comprehensive regulatory rules and blueprints for industry action. And when that is not happening, by holding industry to account.

In particular, she said, Ofcom wants online services to have appropriate governance and accountability processes to assess the risks to all users, and particularly to children. 

However, she acknowledged that users have responsibilities too:

We want UK users to be better informed about what the risks to them are from online services, and how they can take steps to protect themselves. But also, what the platforms’ responsibilities are towards them as well, so that when they report harmful content, they can expect action to be taken, and to know what is and isn't allowed on those services.

Even so, platforms should clearly, consistently, and transparently prioritize user safety, she said:

It should be harder for extremists to disseminate terrorist content or violent content. It should be harder for fraudsters to use online services to scam vulnerable users. And it should be harder for abusers to find and disseminate CSAM [child sexual abuse material], and for children to interact online with adults that they don't know. 

Services should also take steps to prevent children being recommended egregious suicide or self-harm content as part of their normal social media feed.

Few would argue with any of that. 

Scare tactics aren’t the best option 

One speaker who knows more than most about online harms to children was Lorin LaFave, Founder of the Breck Foundation. 

The organization is named in honour of her 14-year-old son Breck Bednar, who was groomed online and murdered in 2014 by an 18-year-old he met through a gaming forum. Not only that, but LaFave's daughter was subsequently harassed and taunted by online attackers with disturbing messages and photos concerning her brother’s murder.

Like Esther Ghey, LaFave is a children’s champion, turning her own loss and trauma into a positive campaign to prevent harm to others. She explained:

I believe that if I had heard someone like me speaking about the signs of grooming and exploitation, then I would have been better able to save Breck from being groomed and lured to his death at only 14 years old. 

Since speaking with thousands of children in schools over the past decade, and being a strong advocate for educating and empowering children to be more digitally resilient, I’ve learned that we need to keep them and other vulnerable users at the heart of this conversation today, and going forward.

LaFave argued that scare tactics are not the best way to make vulnerable users feel involved. (Arguably, many may be attracted to an online world that allows them to express themselves freely in ways they perhaps cannot so easily in real life):

Positive discussions surrounding online safety are what we need to help young people to avoid ignorance and fear. We want them to have the knowledge and the skills they need to keep safe.

At the Safer Internet Day event at BT Tower [last week] – where the theme was ‘inspiring change’, how perfect is that? – we heard from children who said they wanted the Act to be more accessible and understandable for them. 

We need to ensure that children understand how the Act is affecting and assisting them. It shouldn't feel like a secret that only certain adults can decipher. And so, the more that we can help them be a proactive part of the solution, the more responsible they will be.

She added:

This is true for parents too, who say they feel a bit in the dark. They hear time and time again in the media how the Act will better protect their children. But for them to be engaged, they need to have a simplified summary or a ‘quick read’ of what the Act is doing.

Given that the Act runs to over 300 pages of dense text, that is good advice. And it applies to young people too, she said:

One of the issues with children keeping themselves safe online is they often don't have the sense of how what they are taught in school relates to what their real online world looks like. 

They are repeatedly told to block and report harmful content or abusive users. But at the end of the day, they don't feel seen or heard, and say there is no point in even trying – because nothing ever happens, and nothing changes.

Being let down

Sadly, that experience is familiar to many adults too. Countless people, including me, have reported hateful homophobic, antisemitic, Islamophobic, racist, misogynistic, or fascist content on platforms such as X and received an instant response that a post was not in breach of guidelines. On one recent occasion, a complaint I made was upheld – but only three months later, during which the tweet remained online. The subtext: don’t waste your time, says Big Tech.

LaFave continued:

[Young people] feel let down and they may give up trying [to report abuse], thus making their online worlds more precarious. When we work so hard to educate children on what is right and wrong on the internet, we need to also ensure they're listened to when they do act on our warnings and recommendations.

An excellent point. When LaFave’s daughter and her friends experienced horrifying attacks related to Breck’s murder, they “struggled to get the tech giants to listen”, she said.

There were pictures of coffins, Breck’s actual grave, and accounts of his rape and murder that she did not know about – she had been too young at the time to hear such sexual and sadistic content. 

The criminal continued to open new profiles to reach her, and the messages blamed and slandered Breck. But her school was proactive and immediately brought her to the local police station. We tried to explain the situation to the police, but the department had no working knowledge of how the apps and platforms that children use actually work. It felt as if they just shrugged their shoulders in confusion. 

However, the biggest setback was that the investigation took years to gather the data to find out who had sent these harassing and upsetting images, because the tech companies made it as onerous as possible.

My take

Aside from the issues raised by the Online Safety Act itself – including any risks it might, inadvertently, create in terms of undermining public trust in encryption – perhaps the key issue is this: who or what can be a children’s champion online?

The experiences of people like Esther Ghey and Lorin LaFave reveal that it often takes a tragedy to highlight the failings of the tech-enabled world – failings that are often then thrown back in the face of vulnerable people. 

This is particularly true at a time when everybody shouts and few people listen, and an economy of hate is flourishing among people who can make a name for themselves by attacking and belittling others.

Trying to make platforms that have millions, or billions, of users pay attention to the problems of an individual is an overwhelming challenge, of course. So, perhaps what is needed is an organization with the equivalent power of an Ofcom to be young people’s champion online. But at international, as well as national, level. 

A platform that can give them a voice as loud as a tech giant or a billionaire CEO – one that can’t be ignored. The UN Convention on the Rights of the Child mandates that ratifying states must act in the best interests of the child. So, why not tech giants?
