Suppression vs surveillance - is it healthy if COVID-19 leads the public to compromise on privacy for a greater good?

By Stuart Lauchlan, October 8, 2020
Summary:
Track-and-trace apps are being rolled out in the name of suppressing and managing COVID-19, but there remains the potential for any surveillance tech to be abused. Is the public ready to compromise on privacy for the greater good?

(Image credit: Pixabay)

Are we living in a brave new world where governments can justify the use of surveillance technology for the greater good, in this case track-and-trace apps for COVID-19 management? And at what point do there need to be new government protocols that shape the role private sector companies are playing in the creation and operation of such systems?

Two highly relevant questions for the pandemic age as governments around the world roll out, with varying degrees of success, national COVID tracking systems. The first has set off privacy alarm bells since the very first mention of track-and-trace, while the second's importance is elevated by the state's dependency on third-party tech providers.

The two topics formed the spine of an interesting debate at the recent GovTech Summit 2020 as a number of expert speakers sought answers and reassurance on the theme of how COVID-19 has changed public attitudes to surveillance. 

Debate moderator Ria Thomas, Managing Director of consultancy Polynia Advisory, had an illustrative example to offer when she recalled a recent trip to a local restaurant in the UK:

I was really amazed to see people who were walking in, being asked by the restaurant to scan the NHS QR code and how many of them didn't have the app downloaded, but didn't hesitate at all to download the app then and there so that they could scan the QR code and then come into the restaurant. 

What this prompted in Thomas's mind was the question of whether the public perception of being in a state of emergency at a time of national need means that people are more willing to engage with tech that could also be used for surveillance.

Context matters

Sir Mark Rowley, Executive Chairman of Hagalaz, a consultancy specialising in crisis leadership advice for both specialist police/military contexts and wider commercial sectors, takes the view that it's actually easier to build trust in 'peacetime' than in 'wartime'. The public has a tolerance level around surveillance, shaped by the various factors that 'legitimise' it in society - for example, people are comfortable with the idea of surveillance on terrorists or paedophiles, but wouldn't be if the same tech were used to monitor shoplifters. Context is everything.

What’s different with the current crisis is that surveillance is being introduced into healthcare in a manner that hasn’t been seen before, he argued: 

What we're trying to do now, which is a wicked and difficult problem for government, is to start to deploy surveillance for health purposes, in a way that is completely beyond anything that has been done before. And you're trying to do it for the first time in the middle of a crisis where many, many thousands of people are dying. That is a really difficult ask for anybody and it's not surprising that the debate and the public opinion is going to be bumpy. What we've got to try and do is to take those parallels and those lessons [from other forms of surveillance] and think about, how do you rapidly try and build that trust, build an understanding both about the threat and the methods used, and all the safeguards around it?…People sort of get that it's necessary, but they're bound to be nervous and want to understand the detail.

The tech and the tools used need to be proportionate to the threat level, he suggested, if the public is to be convinced to use it: 

If this is a singular threat, then the tools and tactics need to be singular to that. I guess we might want them on the shelf for future threats if pandemics are going to happen from time to time as the experts seem to say, but it might not make sense to have them continue to be deployed, as opposed to being readied in the wings or deployed in a crisis. I think that that goes with public trust - understanding what's the threat, what are the reasonable tools and tactics that the state can use and what are the safeguards and systems around it that will mean that it's done properly and fairly?

Oversight

And there needs to be proper regulatory checks and balances in place of course, which is where bodies like the UK's Information Commissioner's Office come in. The current commissioner, Elizabeth Denham, is keen to emphasise the active role her organisation has taken in the development of the UK's NHS track-and-trace app:

We worked alongside of the Department of Health and Social Care when they were developing the app. We don't approve, we don't give a pass to any technology, but we do act as a critical friend. And in this case, we did. We were pleased with the development of the app because it built privacy in - anonymous data, voluntary use of the app, as well as control over re-use or function creep by preventing a back end database. 
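Denham's point about building privacy in by "preventing a back end database" reflects the decentralised model used by exposure-notification apps. As a rough illustration of the idea - a simplified sketch, not the actual NHS or Google/Apple implementation; the key sizes, hash construction and rotation scheme here are assumptions - each phone generates its own keys and broadcasts only short-lived, unlinkable identifiers:

```python
import hashlib
import os

def daily_key() -> bytes:
    """A random daily key, generated and kept on-device."""
    return os.urandom(16)

def rolling_id(key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier from the daily key.
    The identifier changes every interval, so anyone listening cannot
    link the broadcasts back to a single person or device."""
    return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]

# Phones broadcast rolling IDs over Bluetooth and locally store the IDs
# they hear. Only if a user tests positive are their daily keys published;
# other phones then re-derive the IDs locally and check for matches.
# No central database of contacts is ever built.
key = daily_key()
ids = [rolling_id(key, i) for i in range(3)]
assert len(set(ids)) == 3  # each interval yields a distinct identifier
```

The point of the design is that matching happens on the handset, which is what limits the "function creep" Denham mentions: there is no server-side contact graph to repurpose later.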

What has changed as the crisis has rolled on is an awareness that the implications are longer-term than might first have been realised. Denham explained:

At the very beginning, when our office and others were looking at the privacy implications of the test-and-trace system of the application, we were focused on how will it be decommissioned at the end of the day? That was a real focus. I think now we know the data and digital is going to be a foundation that's used to mitigate the risks of the pandemic as time goes on, so we know some of the use of these technologies will be with us for some time. 

Another issue arising from Thomas's restaurant experience was the role and responsibility of businesses and service providers who are now required to collect health data. Thomas didn't have her phone on her that evening, so the restaurant staff had to take her contact information down on a piece of paper. That practice, which remains enormously common despite the release of the NHS app, raises a number of questions about privacy and data security. As Denham observed:

I would say that the security of the data that's collected in the QR code is more secure than the information about you and your contacts written down on a piece of paper. We know that the businesses don't actually have that data [that’s collected in the app]; they have the ability through the system to notify you if there is some kind of an event that happens at the restaurant. 

But businesses, such as pubs, restaurants, cafes etc, do have to be transparent with their customers, she added: 

They have the responsibility to secure the data, but they also have the responsibility to collect limited amounts of information for the purpose. Those responsibilities have not changed. What has changed in the context of COVID is the public/private partnerships in order to mitigate the risks of the pandemic. I think, for example, of supermarkets having to collect data on vulnerable individuals that were shielding, so that they could actually provide food to them. That's government data in a sense going to a business in order for the business to be able to provide the service. Again, our office was very involved in that, to give advice to make sure the least amount of information was collected for the purpose, and that the data was deleted at the end of the programme. All of those same principles and those same provisions are there.

Tech responsibility

Then there are the tech businesses that are actually providing the apps. Denham is in no doubt about what’s demanded of them: 

They have the responsibility of privacy-by-design.

It’s a point echoed by Nima Elmi, Head of Government Affairs at the World Economic Forum, who emphasised the need to think beyond national boundaries: 

The reality is 96% of European nations have data protection and data privacy laws, but when you look at the global landscape, only 43% of the less-developed nations have similar regulations in place. Due to the borderless nature of technology, it does mean it's even more crucial that we can adopt fundamental principles that should frame the purposes by which technologies can collect, use, monetize and disseminate our data. Without this, there is the potential to exploit the most vulnerable communities and the existing digital divide that many speak of. I do think that there is an important basis for us to be able to think through if the principles that we're advocating for from a European perspective are just as applicable around the world, or can we distil those principles down so that we get to the minimum global standards.

There needs to be multi-stakeholder participation here, she argued: 

[Trust] is a really, really important issue because, ultimately, where we see that there is misinformation, it erodes trust. It's important for corporates to be as transparent about their privacy-by-design positions or focus on human-centred design as it is for governments…What we're seeing is that businesses are at the forefront of technology innovation, but at the same time they need to be able to foster trust by being transparent.

In countries where they have deployed contact tracing apps, like Ireland and Germany, and shared the source code before they did the launch and were very good at informing the general public about the parameters of the capabilities, particularly when it comes to surveillance, they have had a greater uptake. Because they are being very transparent at the outset and saying, 'Look, we're not trying to do some backdoor activities where we're tracking you and monitoring you'. So I do think it is important when apps are developed and governments are then leveraging those third party apps, that they're able to be super transparent and foster that trust in the same way. It's a two way street. 

The corporates who are developing these apps also have an ethos themselves and I think this is particularly important with the Google/Apple API, which has had a lot of back and forth. A company like Apple, that has been very vocal about privacy by design… that sets a barometer for governments if they are working with those companies. It also gives the corporates the power to check, where they've agreed something at the outset and then the governments change their position later on, for those companies to be able to come back to this.

My take

Public trust and confidence is critical in tackling crises such as COVID-19. As the months have gone on, we've seen plenty of examples of how that public trust can be eroded by political shortcomings - see Hancock, Matt in the UK for an ongoing prime example - to negative effect. I've been using the NHS app and regard it as a necessary 'evil' if I want to be able to go about some form of normal life at present. And like Thomas, I've noticed how willing others appear to be to use it, which is a hopeful sign.

That said, trust is fragile and all it will take is one data breach or abuse to ruin everything. With that in mind, it is of course 'encouraging' in the UK to know that Dido Harding, the woman in charge of track-and-trace, is the same person who was in charge when TalkTalk exposed the data of 157,000 of its customers. In that context, this week's spreadsheet (Han)cock-up, when test results of 16,000 people didn't get processed, almost looks like Harding is on a taxpayer-funded learning curve to improvement! Ahem...

But the important point here is that any privacy concerns I have are less around planned state surveillance and more about the rewarding of the inadequate and the inept as they rise without trace - pardon the pun! - to positions of phenomenal responsibility managing my data. That's where this will go wrong.

Final word to the ICO’s Denham on why all this really matters: 

If we lose public trust in this pandemic or any other situation, we know that there will be a hit on the public's uptake and participation in these important services and these important provisions. So, public trust is fundamental and privacy is a big part of that.