What can government learn from COVID digital tech? Lots, according to the Ada Lovelace Institute
- Summary:
- Digital tech was rolled out at great speed at the height of the pandemic, to greater or lesser effect. But there are long-term lessons to be learned from the crisis that governments around the world need to take on board.
During his appearance at London Tech Week yesterday, British Prime Minister Rishi Sunak made several references to the work of the Vaccine Taskforce set up by the UK Government during the COVID pandemic, citing its agility and autonomy as a model for the new AI Taskforce set up by the current administration.
That might well prove to be a good move if the findings of a new report from the Ada Lovelace Institute into the use of digital technologies by governments during the pandemic are to be taken at face value. Framing COVID as the first global health crisis of “the algorithmic age”, the Institute urges governments around the world to learn lessons from how digital tech was leveraged to deal with that crisis.
The Ada Lovelace Institute was established by the Nuffield Foundation in early 2018, in collaboration with the Alan Turing Institute, the Royal Society, the British Academy, the Royal Statistical Society, the Wellcome Trust, Luminate, techUK and the Nuffield Council on Bioethics. Its mission is to ensure that data and AI work for people and society.
Tapping into the experiences of 34 national governments, the report - Lessons from the App Store: insights and learnings from COVID-19 technologies - contains recommendations across four cross-cutting themes – effectiveness, public legitimacy, inequalities and governance – and identifies outstanding questions for further research. According to Andrew Strait, Associate Director at the Ada Lovelace Institute:
COVID-19 was the first pandemic of the algorithmic age. New technologies were deployed rapidly, driven by the very real urgency of the situation. However, it is important that policymakers look back and learn the many lessons we have identified in our research.
Speed
The report looks back at how contact tracing apps and digital vaccine passports were rolled out at maximum speed. While these were seen by many as essential in managing an unprecedented public health crisis, they also attracted criticism and concern around issues such as privacy, surveillance, equity and social control. The report notes:
These technologies were rolled out rapidly at a time when countries were under significant pressure from the financial and societal costs of the pandemic. Public healthcare systems struggled to cope with the high numbers of patients, and pandemic restrictions such as lockdowns resulted in severe economic crises and challenges to education, welfare and wellbeing.
Governments and policymakers needed to make decisions and respond urgently, and they turned to new technologies as a tool to help control the spread of infection and support a return to ‘normal life’. This meant that – as well as guiding the development of technologies – they had an interest in convincing the public that they were useful and safe.
With the COVID crisis largely under control in many countries, there are clear lessons that can be learned from the deployment of tech as part of the fightback against the disease - and some of that tech could be repurposed, argues the Institute, noting that a number of countries have already adapted it for use as new health data and digital identity systems.
Questions
But the Institute remains concerned about what it terms “evidence gaps that indicate where evaluation and learning mechanisms fell short” when digital tech was used in response to COVID-specific needs. So, going back to those four themes of effectiveness; public legitimacy; inequalities; and governance, regulation and accountability, the report asks some hard questions.
For example, on the subject of effectiveness, it asks whether COVID tech worked and concludes:
Contact tracing apps and digital vaccine passports were – necessarily – rolled out quickly, without consideration of what evidence would be needed to demonstrate their effectiveness. There was insufficient consideration and no consensus reached on how to define, monitor, evaluate or demonstrate their effectiveness and impacts… The evidence is inadequate on whether COVID-19 technologies resulted in positive change in people’s health behaviours (for example, whether people self-isolated after receiving an alert from a contact tracing app), either when the technologies were first deployed or over time.
Similarly, it is not clear how the apps’ technical properties and the various policies and approaches impacted on public uptake of the apps or adherence to relevant guidelines (for example, self-isolation after receiving an alert from a contact tracing app).
But did people accept COVID technologies, given that public legitimacy was crucial to their uptake? Conclusion:
The lack of targeted public communications resulted in poor understanding of the purpose and technical properties of COVID-19 technologies. This reduced public acceptance and social consensus around whether and how to use the technologies.
Lessons
All that - and more - being the case, what are the lessons that need to be learned here? The Institute produces a number of key recommendations, including:
- Investing in research and evaluation from the start, and implementing a clear evaluation framework to build evidence during deployment, as well as defining criteria for effectiveness using “a human-centred approach that goes beyond technical efficacy and builds an understanding of people’s experiences”.
- More effective communication of the purpose of using technology in public crises, including the technical infrastructure and legislative framework for specific technologies, to address public hesitancy and build social consensus.
- Harmonizing global, national and regional regulatory tools and mechanisms to address global inequalities and tensions and avoid entrenching and exacerbating societal inequalities.
- Creating specific guidelines and laws to ensure technology developers follow privacy-by-design and ethics-by-design principles, and that effective monitoring and evaluation frameworks and sunset mechanisms are in place for the deployment of technologies.
As Melis Mevsimler, Visiting Senior Researcher at the Ada Lovelace Institute, puts it:
Our research provides clear lessons for governments and policymakers deciding how to use similar technologies in future as part of a “people-centred” approach that goes beyond technical considerations and genuinely improves people’s lived experiences. We must continue investigating the evolution of COVID-19 technologies, so that we can decide on the appropriate role of these technologies in our societies, now and in the future.
My take
While I’m sure we all want to put memories of the COVID crisis behind us, the pandemic wrought changes on society that we’re still processing. Here at diginomica, we’ve written a great deal about the shift in working patterns and behaviors, a debate that’s still ongoing and will be for some time yet. This latest report from the Ada Lovelace Institute is a weighty insight into another crucially important topic. COVID may be essentially behind most of us, but the idea that it’s the last time we’ll have to deal with such a health crisis is ludicrous. There are lessons to be learned that perhaps we just didn’t have time to take on board at the time. But it’s not too late to learn them now. Worth a read.