Sleepwalking to China - COVID-19 response risks surveillance state result, warns UK biometrics commissioner
- Summary: The fight against COVID-19 could have longer-term consequences of an unwelcome nature unless there's more transparency.
As the UK begins road-testing a centralised contact tracing app to help tackle COVID-19 – rather than the decentralised solution being adopted in other parts of the world – might the UK sleepwalk towards becoming a biometric surveillance state like China?
It sounds like a question asked by dystopian alarmists and conspiracy theorists, but the risk of this happening has been aired by none other than the UK’s Commissioner for the Retention and Use of Biometric Material, Professor Paul Wiles.
At the end of January, Wiles completed a report for the UK Home Office on the use of facial recognition and other biometric technologies by the police service, just as news was breaking of the seriousness of China’s virus outbreak.
Yesterday – in what may be his final speech as Commissioner before he steps down at the end of June – Wiles told online delegates at a Westminster eForum conference on digital identity that the two issues risk becoming linked during the pandemic:
I concluded my discussion of the technologies with a point that, at the time, I was cautious to make: that decisions about the use by governments of artificial intelligence and biometrics involve a politically strategic choice about what kind of future social and political world we want to create.
It was clear to me that these new technologies are going to lead to a new social and political framing of the world we live in. And they may even be the basis for rebuilding our economy, if it emerges badly damaged from the pandemic.
Chinese example
Referring to China’s national social credit system, which became compulsory this year, Wiles said:
I pointed to the example of China, as a country that has already made its strategic political choice and is actively seeking a technological lead in this area as a basis to develop global power and influence – a new and previously unattainable form of social control over the lives and thinking of its citizens.
My caution at the time [of writing the report] was because I knew I could be accused of shroud-waving and inviting the response that, ‘We are not China and we would not make the same choices’. But that misses the point, which is that if we do not want to create a future in the image of China, then we urgently need to decide what kind of future we do want instead.
I feel less cautious now in making these points, because the pandemic has underlined them. China has demonstrated how effective such technical means of control can be.
Concerns have been raised that biometric technologies should not be deployed without proper consideration of their operating model, and of whether that model sustains our civil liberties, said Wiles. There is also ongoing debate about which uses of the technology might be acceptable in an emergency – and which remain acceptable once that emergency has abated:
My office will have to make a report on the consequences and changes before Parliament considers any extension. But in reality, our current uses of the new technologies emerge out of a series of separate pragmatic decisions by government in partnership with the commercial sector… In examples of major technical transformations, laissez-faire pragmatism has quickly had to be governed by public decision-making.
The message here is that legislators need to step in quickly during the recovery to ensure that rigorous public debate, transparency, and regulation take the place of any decisions that have been made ‘on the hoof’, in terms of central surveillance of the population and of individuals. That’s particularly important if citizens find themselves excluded from services via some form of coronavirus passport system.
But…
But there’s a problem. According to Wiles, his report stressed that new technologies are developing at a speed that politics, government, and legislators have “simply not kept up with” – though that is hardly a new problem:
It has been knocked back even further by the current emergency. Even though not all of government is dealing directly with COVID-19, it seems to have paralysed other thinking, just as the Brexit debate did for years before.
Beyond this national paralysis, the wider context is that the UK has failed to develop proper, rigorous, standardised methods for trialling and evaluating new technologies before it deploys them at scale:
There is a danger that instead of choosing how to deploy new technologies based on evidence, we fall under the spell of technical wizardry that claims to provide easy solutions to problems.
Public trials methodology is well embedded in science and its governance, but not in many other areas. Every other area of application – for example, policing – needs a standard trials methodology. Without that, we run the risk of deploying technologies that have unforeseen or harmful effects, or we fail to develop the necessary decision-making framework.
We have to test claims made, in good faith, by technology developers. The point is not hostility to developers, or to dampen technical development, but to extend the development process into real-world applications with the same rigour.
Wiles argued that these problems should be resolved by always applying a public interest test, rather than solely measuring partial (eg: police) or commercial interests. He continued:
Biometrics – at any rate, the ones the police are most interested in – depend on analytics that often use data about individuals, and sometimes reveal highly personal aspects of our biology or social behaviour. That means that some uses of new technology will intrude into individual rights, including the right to privacy.
Intrusions into individual rights have to be balanced against the wider public interest: what lawyers usually refer to as ‘proportionality’. Such an approach is well established in policymaking and in public law, at least in western democracies. And it is reflected in transnational governance, such as the European Convention on Human Rights.
That is why the decisions we make now in an emergency must be revisited once it is over, and not be allowed to drift into a different context.
My take
Public interest tests cannot be made by partial or vested interests on citizens’ behalf. But public sentiment is not best read in opinion surveys, said Wiles; it needs to be informed by deep, transparent, open debate. Such a debate should be instigated and led by a minister in a cross-party environment, but to date this has been “largely missing”.
Greater openness and transparency will be sorely needed in the months ahead. Failure to provide them could mean biometric technology being deployed in this government’s interests, and not the public’s. That would be a dereliction of duty and a failure of public service.