Government should force social media companies to hand over data for research

By Derek du Preez, January 17, 2020
Summary:
A new report released by the Royal College of Psychiatrists also argues that the government's planned tax on technology firms needs to go further.

Social media companies should be forced to hand over their data to universities for independent research into the risks and benefits of social media use, argues a new report published by the Royal College of Psychiatrists (RCP). 

The RCP recommends that the government use its recently announced independent regulator for online safety to compel social media companies to hand over anonymised data, to better understand the benefits and harms of social media use for children and young people.

The report, released today, also argues that the government's planned ‘Turnover Tax’ on technology firms should go further and be applied to the international turnover of social media companies. This revenue could be used to fund research and training for clinicians, teachers and others working with children and young people.

The government recently released its Online Harms White Paper, which is under consultation. The White Paper set out plans for a new independent regulator to ensure online companies protect their users and face tough penalties if they do not comply.

It proposed a mandatory ‘duty of care’ for companies, with the regulator given enforcement tools not only to issue fines, but also to block access to sites and potentially impose liability on individual members of senior management.

However, the RCP wants to see the government go further, forcing the handover of anonymised data, stating that an “understanding of the content with which children and young people are engaging is essential”. 

The RCP states that the challenges posed by social media to the mental health of children and young people have “exploded in recent times”, citing the tragic suicide of teenager Molly Russell, who died after viewing harmful content online. Her father, Ian, backs the College’s report.

Ian Russell, who also authored the report’s foreword, said: 

Two years ago Molly’s suicide smashed like a wrecking ball into my family’s life. I am in no doubt that the graphic self-harm content and suicide-encouraging memes on Molly’s social media feeds helped kill her.

Without research using data from social media companies we will never know how content can lead our children and young people to self-harm or, in the most tragic cases, take their own lives. The government must enact these calls from the Royal College of Psychiatrists.

Key recommendations for government

The RCP outlines a number of recommendations for the government to consider when thinking about the impact of social media on young people’s mental health and establishing its new online harms regulator. These include: 

  • Ensuring the regulator urgently reviews and establishes a protocol for the sharing of data from social media companies with universities for research into the benefits and harms of social media use for children and young people.

  • Urgently reviewing the ethical framework for using digital data. The RCP states that the same standards need to apply as in other areas of research.

  • Funding a follow-up to the NHS Digital prevalence study to examine the impact of social media on vulnerable children and young people over time.

  • Instructing the regulator to establish a levy on tech companies proportionate to their worldwide turnover, to be used to fund independent research and training packages for clinicians, teachers and others working with children and young people. The gaming and social media industries should also be required to adopt social responsibility measures similar to those in the gambling industry, emulating its duty of care practices (e.g. personalised behavioural feedback and stop messages) on gaming and social media platforms.

  • Enabling the regulator to undertake a joint review with the UK Gambling Commission of the regulation of loot boxes, in line with other countries that have recognised loot boxes as a form of gambling.

  • Undertaking a consultation in 2020 on a yellow card warning system, similar to that used for medicines, allowing professionals and potentially parents, carers and young people to report harms caused by social media and gaming companies.

  • Prioritising the strictest enforcement of data protection law, in particular the ‘age appropriate design’ provisions of the UK Data Protection Act 2018, for services targeting and/or popular with children, including requiring services to default to assuming that users need child protection until explicit action is taken to opt out.

Dr Bernadka Dubicka, chair of the child and adolescent faculty at the Royal College of Psychiatrists and co-author of the report, said: 

As a psychiatrist working on the frontline, I am seeing more and more children self-harming and attempting suicide as a result of their social media use and online discussions. 

We will never understand the risks and benefits of social media use unless the likes of Twitter, Facebook and Instagram share their data with researchers. Their research will help shine a light on how young people are interacting with social media, not just how much time they spend online.

Self-regulation is not working. It is time for government to step up and take decisive action to hold social media companies to account for escalating harmful content to vulnerable children and young people.

Recommendations for technology companies

In addition to the recommendations made to government, the College also made some recommendations to technology companies. These include:

  • Social media platforms should flag engagement with risky content and offer a free, direct hotline for at-risk or vulnerable individuals.

  • Social media companies should provide user-configurable controls (not in the cloud) that can block incoming content of the young person’s choosing, with ‘full safety measures on’ by default, and provide feedback on content users are planning to send.

  • Social media companies should promote and contribute to mental health charities in their home countries to support vulnerable individuals.

  • Gaming companies and social media platforms should regularly fund research related to their products, conducted by independent external bodies, and should regularly provide user data to academic institutions for research purposes.

  • Funding media literacy awareness campaigns.