This is the view of the House of Lords Communications Select Committee, which has released its much-anticipated report, ‘Regulating in a digital world’.
The Committee warns that regulation of the internet is currently fragmented across more than a dozen regulators, and cautions against the power of ‘large tech companies’ that have effectively become gatekeepers to the internet.
The chairman of the committee, Lord Gilbert of Panteg, said:
“The Government should not just be responding to news headlines but looking ahead so that the services that constitute the digital world can be held accountable to an agreed set of principles.
“Self-regulation by online platforms is clearly failing. The current regulatory framework is out of date. The evidence we heard made a compelling and urgent case for a new approach to regulation. Without intervention, the largest tech companies are likely to gain ever more control of technologies which extract personal data and make decisions affecting people's lives. Our proposals will ensure that rights are protected online as they are offline while keeping the internet open to innovation and creativity, with a new culture of ethical behaviour embedded in the design of services.”
The Committee states that whilst the internet has not become a “lawless Wild West”, a large volume of activity occurs online that would not normally be tolerated offline. The misuse of personal data, abuse and hateful speech, for example, make the case for further regulation “compelling”, according to the Committee.
In addition, the digital world has become dominated by a small number of very large firms, which enjoy a “substantial advantage”, operating with an unprecedented knowledge of users and other businesses. The Committee notes that “without intervention the largest tech companies are likely to gain more control of technologies which disseminate media content, extract data from the home and individuals or make decisions affecting people’s lives”.
The report recommends the development of a comprehensive and holistic strategy for regulation. A new Digital Authority should be created to oversee this regulation, with access to the highest levels of government to facilitate the urgent change that is needed.
The Committee believes that the following 10 principles should be used to guide the development of regulation online:
- Parity: the same level of protection must be provided online as offline
- Accountability: processes must be in place to ensure individuals and organisations are held to account for their actions and policies
- Transparency: powerful businesses and organisations operating in the digital world must be open to scrutiny
- Openness: the internet must remain open to innovation and competition
- Privacy: to protect the privacy of individuals
- Ethical design: services must act in the interests of users and society
- Recognition of childhood: to protect the most vulnerable users of the internet
- Respect for human rights and equality: to safeguard the freedoms of expression and information online
- Education and awareness raising: to enable people to navigate the digital world safely
- Democratic accountability, proportionality and evidence-based approach
The Committee’s report states:
“Responses to growing public concern have been piecemeal, whereas they should be continually reviewed as part of a wider strategy. A new framework for regulatory action is needed. We recommend that a new body, which we call the Digital Authority, be established to instruct and coordinate regulators. The Digital Authority would have the remit to continually assess regulation in the digital world and make recommendations on where additional powers are necessary to fill gaps. The Digital Authority would also bring together non-statutory organisations with duties in this area.
“Effective and timely policy-making and legislation relies on decision-makers being fully informed. However, the speed at which the digital world is developing poses a serious challenge. The Digital Authority should play a key role in providing the public, the Government and Parliament with the latest information. To ensure a strong role for Parliament in the regulation of the digital world, the Digital Authority should report to a joint committee of both Houses of Parliament whose remit is to consider all matters related to the digital world.”
A warning about platforms
The Committee also takes note of the impact digital platforms are having on competition and on users. For example, it states:
“Digital markets pose challenges to competition law, including network effects which result in ‘winner-takes-all’, the power of intermediaries, and consumer welfare in the context of ‘free of charge’ services. The largest tech companies can buy start-up companies before they can become competitive. Responses based on competition law struggle to keep pace with digital markets and often take place only once irreversible damage is done. We recommend that the consumer welfare test needs to be broadened and a public interest test should be applied to data-driven mergers.”
In addition, it notes that in the EU illegal content is regulated by the operation of the general law and by the e-Commerce Directive, which exempts online platforms from liability unless they have specific knowledge of illegal content. However, at nearly 20 years old, this directive was developed before platforms began to curate content for users. The Committee states that it is no longer adequate and that platforms are not doing enough themselves. The report states:
“Self-regulation by online platforms which host user-generated content, including social media platforms, is failing. Their moderation processes are unacceptably opaque and slow. We recommend that online services which host user-generated content should be subject to a statutory duty of care and that Ofcom should have responsibility for enforcing this duty of care, particularly in respect of children and the vulnerable in society. The duty of care should ensure that providers take account of safety in designing their services to prevent harm. This should include providing appropriate moderation processes to handle complaints about content.”