Back in June as organizations and governments began talking openly about getting people back into workplaces and IT vendors rolled out technologies intended to support this, diginomica noted that a critical factor here would come down to a basic question of trust:
Your employer says it's safe to return, that precautions have been put in place and there's a new health-centric working model at play. But still, are you entirely convinced that the promised return to the office environment can be achieved safely?
A few weeks later, ongoing research from Salesforce, which has its own Work.com solution to enable a safe return to the workplace, found that there were still signs of a hefty trust deficit on the part of employees:
According to the firm's latest update to its ongoing general population survey of 3,548 adults from the United States, the UK, France, Germany, Brazil, and Australia, barely over half of those polled appear to trust their employers to maintain a safe and healthy workplace - 56% agree they can trust them, 17% disagree, leaving a sizeable minority needing to be convinced. It's the same story when respondents are asked whether they feel they could trust their employer to respond quickly and effectively to an emergency - such as a localized COVID outbreak - with 53% saying they do and 10% saying they don't.
So I was interested to listen in to the latest debate around the topic as part of Salesforce's Leading Through Change online program, kicking off with a highly pertinent point from the firm's Chief Legal Officer Amy Weaver, who commented that while tech solutions were looked to as enablers, there were indeed wider questions:
Just as quickly as the idea of these tools caught our imagination, it became clear that even in the midst of an unprecedented health crisis, trust matters. Because no matter how good a tool is, people won't use it unless they trust it. The debate between privacy and safety, between convenience and potential bias, is playing out in real time around the world, from certain countries using geo-location services to track people exposed to the disease, to the controversial use of algorithms to help calculate A-level test score results in the United Kingdom.
These issues are no longer just lofty and somewhat esoteric legal concepts. The privacy and civil rights questions that are raised are novel, cutting edge, and most of all, they are deeply, deeply personal. And they present an incredible opportunity to show that this does not need to be a zero sum game, that it's possible to build and maintain trust, while exploring new technologies to fight the pandemic.
Jules Polonetsky, CEO of the Future of Privacy Forum, a non-profit that focuses on privacy leadership and scholarship, concurred, observing:
I think we saw some people maybe plunging in very quickly, thinking that tech would solve the problem and Silicon Valley was going to invent solutions. It wasn't surprising to see a little bit of a backlash. This isn't about technology, this is about communicating with the public. This is about understanding what do the public health authorities need, what do epidemiologists need? Now that we've had a little bit more experience, you're starting to see a more careful nuanced process. The leading platforms are supporting apps that don't require location [data]. Do I really need your location? I just need to know that someone who got a diagnosis was near other people. By using cryptography, by using Bluetooth to detect other devices, I think we've settled in large part on a system where people who download these apps can be comfortable that they're going to get an alert and they'll know that they were in proximity to somebody else. But it took a while to get there, and there's some lessons from some of the snafus.
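The decentralized approach Polonetsky describes - broadcasting short-lived identifiers over Bluetooth instead of collecting location data - can be sketched roughly as follows. This is a simplified, hypothetical illustration, not the actual Apple/Google Exposure Notification protocol, which uses dedicated key-derivation schemes and Bluetooth Low Energy advertising; the function names and key sizes here are illustrative assumptions.

```python
import hashlib
import secrets

def daily_key() -> bytes:
    # A random key that never leaves the user's device unless they
    # choose to share it after a positive diagnosis.
    return secrets.token_bytes(16)

def rolling_id(key: bytes, interval: int) -> str:
    # Derive a short-lived broadcast identifier from the daily key.
    # Without the key, observers can't link identifiers to a person.
    return hashlib.sha256(key + interval.to_bytes(4, "big")).hexdigest()[:16]

# Alice's phone broadcasts rotating IDs over Bluetooth (one per ~10-minute
# interval); Bob's phone passively records the IDs it hears nearby.
alice_key = daily_key()
bob_observed = {rolling_id(alice_key, 42)}  # Bob was near Alice in interval 42

# If Alice is later diagnosed, she publishes her daily key. Bob's phone
# re-derives her identifiers locally and checks for overlap with what it
# heard - no location data is ever collected or uploaded.
rederived = {rolling_id(alice_key, i) for i in range(144)}
exposed = bool(bob_observed & rederived)
print(exposed)
```

The design choice worth noting is the one Polonetsky highlights: the matching happens on the user's own device against published keys, so the system learns only "you were near a diagnosed person", never where either party was.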
Chief among the snafus is an underlying reality, he suggested, one that's not always been appreciated pre-COVID:
I think the challenge is understanding that, around the world, privacy and data is a human right. Human rights doesn't simply mean you can clear your cookies or make sure you've opted in before you send somebody an email. Human rights means, this is really something about yourself and your dignity and your ability to be a free person. I'm not sure a couple years ago that this was fully accepted in a lot of the business world…But I think today...this really is about everyone's rights as a human.
To that end, the question needs to be about proportionality, he added:
We saw apps that were collecting location [data] and maybe didn't really need that location. The principle is do you really need this, because if you collect it and you don't need it, we're going to get worried. We see college students who are being asked to sign up for apps that will monitor where they are and whether they're going off campus and you think to yourself, [this is] a little bit heavy handed. None of us want to be surveilled. We want to understand what's going on. We want to understand that if data is being collected, it is indeed the minimum. And most important, we want to recognize that it's not going to be used in a way that's adverse to us. We want to understand that we are in control, and that we, the community, the school, are bought in to the way the data is being used.
Earn my trust
Trust needs to be earned and built, said Victoria Espinel, CEO of BSA, the Business Software Alliance, citing three aspects of this process:
First is being straightforward and transparent about how and why decisions are made. That doesn't necessarily mean everyone's gonna be happy with every decision, but they should understand how and why you got there. The second is listening to concerns, really listening to them, really trying to understand them and addressing them if you can. The third is to stick to what you say…Generally and perhaps even more so at this time of COVID, people need to know that their data is going to be protected and kept safe.
That's why the BSA and Salesforce have both put out privacy principles that are very specific to this time of COVID. Some of those are really common sense, straightforward things like, try not to collect personal identifying information, be thoughtful about how you minimize the amount of data that you collect, put time limits on how long you keep it. But a big part of that is transparency. That means telling people that their data is being collected, telling them why it's being collected and how it's going to be used - and then to stick to what you say. The guiding principle for all of this has to be, no surprises.
One thing that would help with bolstering trust levels in the US would be a Federal-level data protection law to replace the current mish-mash of 'zip code regimes'. This has been a topic of fierce debate for years, fuelled of late by the success of the European Union's GDPR, but progress towards a country-wide solution in the US has been slow. It's a major issue, agreed Polonetsky:
We know if we go to the doctor that health information is protected because we have a health privacy law, but we don't have protection for that information if the same data about my health is provided, maybe to an employer or to somewhere else. We're one of the only democracies in the world and we will soon maybe be the only major economy in the world that doesn't have a comprehensive privacy law.
More than ever we need data moving. We need it moving around the world, because if we're going to have clinical trials and we want to make sure vaccines work, we need to understand that they work with different kinds of populations. So we've become more dependent than ever because of COVID and we end up having this gap where some laws cover some data, other sorts cover other data. I'm hoping that we'll see Congress, despite everything else that's going on, prioritize putting in place comprehensive privacy laws, so people can know that if they share data, companies, governments, organizations are bound not only by good policies and good practices, but there's a law backing it. It'll accelerate trust if we can promise people that they're protected by law.
Espinel is understandably sceptical of any immediate progress on this front, but looks beyond November's Presidential election for a fresh push:
The pandemic has made people realize how necessary data is in order to solve major societal challenges, in this case, trying to address COVID-19, trying to find a cure or vaccine for the disease itself, but also trying to ameliorate all the negative economic impacts of COVID-19. We can't do that without data and that data has to be protected. It makes no sense not to have a system of data protection in the United States, where the level of protection is completely dependent upon where in the country you happen to be located. I think it is entirely possible that this will help underscore the importance to Congress of why it is necessary for the United States to have a comprehensive, high-standard, enforceable privacy law across the United States. In 2021 as the new Congress starts, I'm optimistic that we'll be able to make the case for that.
An excellent discussion that covered a lot of important topics. The ongoing lack of Federal data protection in the US is rightly identified as an enormous inhibitor to trust levels, and if there is to be any silver lining from the COVID cloud, progress on rectifying that deficit would qualify. In the meantime, city centers around the world remain unnervingly empty and the wholesale shift back to the office regime hasn't occurred, despite exhortations from politicians and some business leaders. This question of trust isn't going to be a simple one to answer.