The British government has been warned that the UK is currently suffering a "pandemic of misinformation" which, if allowed to flourish, will result in the collapse of public trust and see democracy "decline into irrelevance".
That's the view of the Lords Committee on Democracy and Digital Technologies, which has urged the government to act and introduce new powers to inform citizens, boost accountability and transparency, as well as educate the nation.
Chair of the Committee, Lord Puttnam, states that the situation is "serious" and that in the digital world, our belief in what we see, hear and read is being distorted to the point at which we no longer know who or what to trust. On that basis, he argues, the prospects for building a sustainable society are non-existent.
The report released today takes aim at Facebook and Google, which it states are flying under the radar, operating outside the rules that govern electoral politics.
However, the Committee notes that online platforms are not inherently ungovernable and should be bound by the same restraints that apply to the rest of society. The hope is that if this is done well, technology can become a "servant of democracy rather than its enemy".
The Committee urges that the government introduce its Online Harms legislation within a year (rather than by 2024, which is the current timeline), in order to introduce new powers and tackle these challenges to democracy.
Lord Puttnam said:
We are living through a time in which trust is collapsing. People no longer have faith that they can rely on the information they receive or believe what they are told. That is absolutely corrosive for democracy.
Part of the reason for the decline in trust is the unchecked power of digital platforms.
These international behemoths exercise great power without any matching accountability, often denying responsibility for the harm some of the content they host can cause, while continuing to profit from it.
We have set out a programme for change that, taken as a whole, can allow our democratic institutions to wrestle power back from unaccountable corporations and begin the slow process of restoring trust. Technology is not a force of nature and can be harnessed for the public good. The time to do so is now.
The report adds that platforms like Facebook and Google are hiding behind black box algorithms, which choose what content users are shown. The Committee states that it is "plain wrong" for these platforms to take the position that they are not responsible for harms that may result from online activity.
The Committee's report makes a number of wide-ranging recommendations that aim to strengthen democracy and introduce a level of governance and control over online information. There are 45 recommendations in total, which can be found here, but some of the key ones include:
Ofcom, the UK's communications regulator, should be given the powers to undertake periodic audits of the algorithmic recommendation systems used by technology platforms, including access to the data used to train those systems. The Committee recommends that regulators need access to "all data from these platforms" and that, because the details of what data is needed will change as technology develops, these powers must be suitably broad.
Experts should cooperate, through a regulator or a committee on political advertising, to develop a code of practice for political advertising, along with appropriate sanctions, that restricts fundamentally inaccurate advertising during elections or referendums.
Ofcom should create a code of practice on misinformation. If a piece of content is identified as misinformation, it should be flagged as such across all platforms.
The government should empower Ofcom to sanction platforms that fail to comply with their duty of care in the Online Harms Bill. These sanctions should include fines of up to 4% of global turnover and powers to enforce ISP blocking of serially non-compliant platforms.
An independent ombudsman for content moderation decisions should be created, so that members of the public can appeal if they feel let down by a platform's decisions.
The government should facilitate a large-scale programme to evaluate digital media literacy initiatives. The Department for Education should also review the school curriculum to ensure pupils are equipped with the skills needed in a modern digital world. Critical digital media literacy should be embedded across the wider curriculum, based on the lessons learned from that review.
Ofcom should require large platforms to user test all major design changes to ensure that they increase rather than decrease informed user choices.
Lord Puttnam said:
Our Report addresses a number of concerns, including the urgent case for reform of electoral law and our overwhelming need to become a digitally literate society. We must all become better equipped to understand the means by which we can be exploited, and the motives of those doing so. Misinformation can pervert common sense to the point at which it is easy to forget the fragile foundations upon which so many of our freedoms are built - until they become threatened.
With so many of those freedoms curtailed in lockdown it seems possible that as we regain them, we may wish to contribute more fully towards reimagining and reshaping our future.
I think it would be hard for anyone to deny that misinformation online, and the control of information by a select few platforms, is having a direct impact on trust. Even those who are savvy to online misinformation can struggle with the overwhelming surge of deliberately false content. Until recently, the view of the platforms has been to place the responsibility on users to be clued up. But that's not working: people continue to be influenced, views are hardened at the hands of targeted lies, and sensible debate diminishes. Democracy is fundamental to the success of a functioning society, and we shouldn't wait until it has been completely broken to try to rebuild it. The government should be taking these recommendations very seriously and acting with urgency.