World Economic Forum 2018 - why trust has to be valued higher than growth in the 4IR

Stuart Lauchlan, January 22, 2018
If anything is more important than trust in your corporate values hierarchy, then you're in trouble. But what is trust? An important debate at the World Economic Forum.

There’s a trust shift underway in the Fourth Industrial Revolution (4IR) and that demands both new CEO models of behaviour as well as increased regulation of the tech sector.

Against a macro-political backdrop where 'Fake News' is anything you disagree with and 'alternative truth' can be pitched as a serious political response, the World Economic Forum kicked off today with a debate around the importance of trust, specifically in relation to the technology industry.

It was a wide-ranging conversation among industry leaders and commentators that inevitably threw up more questions than answers, but which exposed some difficult circles to be squared as the 4IR takes hold and the associated technologies become more sophisticated and widely adopted.

The backdrop was set eloquently by Rachel Botsman, Lecturer at the Said Business School at the University of Oxford, who argued that technological innovation in fact accelerates rather than diminishes trust, as convenience takes precedence over wider concerns:

We’re living in an age of trust on speed. So every time we swipe right on Tinder or we accept the [Uber] driver, in many ways convenience is trumping trust. I find that interesting because trust needs a bit of friction. It needs us to slow down and say, ‘Is this person or this piece of technology or this piece of content, is it actually worthy of our trust?’.

The challenge that we have is that technology wants to automate the process and make it more and more efficient. On the one hand as consumers we want convenience and we want that speed. At the same time, companies have a responsibility to build the security and the safety and the trust into the technology, which means building a little bit of friction in to slow down.

This leads to an interesting tension, she added:

We want the empowerment, we want the agency. At the same time we want the platform to take a lot of responsibility for reducing risk and taking responsibility when things go wrong. I don’t think we can have both. In the first six years, whether it be Facebook or Uber, the platforms are allowed to create a lot of value in just matching supply and demand. What’s now happening with regulation is people saying that you can’t just create value from supply and demand and not be responsible for what’s happening on the platform. You can’t have it both ways.

Whose version of truth?

This raised the ongoing question of how digital platforms should be defined - and by extension, regulated. Sir Martin Sorrell, CEO of global advertising giant WPP, picked up on this:

The issue is whether Facebook and Google are technology companies or media companies, and whether they acknowledge, which they have not to date, philosophically that they are media companies. In practice they are starting to. Facebook and Google are hiring people to monitor their editorial content, because the big issue is, can we trust all the content that is on these platforms?

Botsman added an important proviso to this argument when she posed the question about where responsibility should lie:

We have to ask ourselves what we want. Do we want Facebook to be the arbiter of truth? Do we want a group of regulators to decide what news is trustworthy? Or do we want we the users to decide?

The problem with calling for more regulation is that the established structures and models are increasingly no longer relevant or fit for purpose, she added:

Trust [used to flow] in a top-down, hierarchical fashion with a very linear relationship between the company and the consumer. What’s happening is that’s being blown up in many ways. We have a distributed form of trust that now flows sideways, directly to individuals. That’s a real challenge for regulators.

A lot of regulation is designed for traditional institutions, designed to think in this hierarchical, top-down way. When you talk to regulators about how you regulate platforms, how do you break up these network monopolies, the way regulators think is, ‘Where is the center of accountability?’. That’s a real problem when you have millions of users who are essentially the product.

Regulate me!

Nonetheless there’s a need for more regulation, countered Salesforce CEO Marc Benioff, something that might have struck others on the panel as an unusual request. Google, which has faced multiple regulatory inquiries, was represented by Alphabet CFO Ruth Porat, while Uber’s new CEO Dara Khosrowshahi, also present, has battles to fight on a worldwide front just to get permission to operate.

But Benioff insisted that the time was ripe for the technology industry to be subject to more regulatory controls in order to curb excesses, similar to, for example, the financial services sector of ten years ago:

That is the point of regulators and government - to come in and point true north. In the tech industry we’ve been pretty clear of those regulators pretty much the whole life span. But we’re seeing signs, especially this year, especially when you see what happened at the elections, what happens with social networks, and especially when you see CEOs who fully abdicate their responsibilities and say, ‘I had no idea that was happening, I’m so sorry’.

The signs are pointing to more regulation. If CEOs won’t take responsibility, then I think you have no choice but for the government to come in. Some of this technology is so powerful and so sophisticated and so deep and multi-dimensional, that even these companies don’t understand how it’s being used in nefarious ways. But when the CEO comes out on stage and says, ‘Oh gee, there’s no way that could happen’ and then two weeks later it’s ‘Oh yes, there’s a little bit there’. And then there’s another reveal and another reveal.

That’s a message that could have been aimed at Khosrowshahi’s predecessor as Uber CEO and Benioff was scathing in his assessment of the unfolding issues at that firm in the wake of the whistleblower revelations by Susan Fowler, which effectively brought about ‘regime change’ internally at the company:

It became a crisis of trust because she said there was no trust in the culture. It was a culture only built for speed and growth at any cost and that the CEO was setting that tone from the top. As that frame was set by her and amplified by many people in Silicon Valley, then more evidence started to build, even to the point where there was the movement among the user community which was ‘Delete the Uber app’.

Trust has to be the number one priority for any CEO in the 4IR, insisted Benioff:

If it’s not, something bad is going to happen to you. What is the most important thing to you in your company - is it trust or is it growth? If anything trumps trust, you’re in trouble. What is the most important thing to you? Really look at your value system. What’s number one? If everything is important, nothing is important. You have to choose what is really important to you. We’re in a new world and trust better be number one.

For his part, Khosrowshahi, while refraining from direct commentary on his predecessor, acknowledged the importance of trust between company and user, pitching this as something that regulators could promote and enforce:

My ask of regulators would be to be harder in their ask of accountability so that a CEO knows that his or her job is to know what’s going on at the company. And that if they don’t and they get caught, then they’re out!

It’s totally about trust, agreed Botsman, adding another important caveat - the oft-pitched aspiration of greater corporate transparency isn’t the same thing as trust. She declared:

You’ve actually given up on trust if there’s a need for things to be transparent! I don’t understand the call for organisations to be transparent. What we need to believe is that they’re trustworthy. The call should not be for more transparency. It should be that we understand the intentions of these companies and that we believe that the intentions of these companies are compatible with ours as individuals and ours as society.

My take

An important conversation at a difficult time. The ongoing controversy around potential Russian interference in the US Presidential elections last year is a high-profile case in point about the potential for negative influence of the sort of technologies that will define the 4IR, technologies that will also raise more and more difficult questions about the kind of society we want to live in.

The case for greater regulation seems to me an obvious one in theory, but I have concerns about how we put it into practice. As Botsman points out, the current mechanisms aren’t necessarily appropriate in the fast-changing, tech-enabled world in which we live.

That doesn’t mean we shouldn’t be gearing up to tackle the problem. But I am concerned that we don’t, as yet, have enough tech-savvy policymakers and regulators to make the best decisions. I still cringe every time a government minister steps up to the microphone to demonstrate their tech ignorance by demanding pointlessly that ‘something must be done’ about social media platforms or end-to-end encryption. On this front, the tech industry must step up and help educate the lawmakers.

There is also the question of whose judgment we count upon. Personally, I don’t want a society in which ‘truth’ is determined by Sarah Huckabee Sanders or by whatever talking points are currently being aired on Fox and Friends. I don't trust a word either of them says. But then again, there are those for whom their output will be gospel truth. There’s a need for individuals in society to assume personal responsibility for filtering out the noise and finding their own truth, as well as looking to regulators to set standards. The 4IR is a revolution in which everyone must participate in order for us all to benefit.
