Here’s a little question: do you trust Marc Benioff?
Or rather, do you trust Marc Benioff’s company Salesforce to handle your personal data?
It’s a question posed by the Salesforce CEO himself at a panel session entitled In Tech We Trust at the World Economic Forum in Davos yesterday.
It’s also a question that could be levelled at two of his fellow panelists, Yahoo! CEO Marissa Mayer and Mike Fries, CEO at cable giant Liberty Global, who were joined by world wide web inventor Sir Tim Berners-Lee and the new European Digital Agenda Commissioner Günther Oettinger.
It was an interesting line-up: two US B2C firms, one US enterprise cloud firm, a technology academic and a European politician who’s quite frankly not on top of his brief. As an ensemble it produced some interesting consensus and some equally interesting divisions, particularly around regulation and governance, as we’ll see in part 2 of this article.
But first up, trust was the main theme of the session and Benioff kicked off with the stark declaration that:
Trust is a serious problem. You can see that across the world. Everyone has their personal story or has seen a societal or cultural story. The reality is that we have to step up and get to another level of openness and transparency and that’s not necessarily comfortable for everybody, especially the vendors.
What we see is different organisations taking on different characteristics of trust and also different levels of transparency which will ultimately yield where we are going with trust. Only through radical new levels of transparency are we going to get to radical new levels of trust, which is where we have to get to to make this new world really work.
As an enterprise vendor dealing with enterprise customers, Salesforce has an advantage over consumer firms, such as Yahoo!, he argues:
[Ours] is very much a model of where the consumer companies are going to have to go. The enterprise companies recognise that we can’t do anything without our customer saying OK. In the consumer world, it’s different. Sometimes you know what’s happening, sometimes you don’t.
Benioff illustrated this point by citing his own use of a consumer email service, adding politely that he wasn’t going to name the brand. (As Yahoo!’s CEO was sitting opposite him, I’m guessing it might begin with a G?). He explained:
I don’t know where my email is, I don’t know what country it’s in, I don’t know what laws are regulating it, I don’t even know if the vendor knows where my email is! That’s going to change. You can’t just be searching on the internet, using consumer services, doing various things and you don’t know what’s going on. You’re going to have to have complete and total disclosure.
Whether you’re an enterprise vendor or a consumer vendor we need to all open up a lot more to be able to say exactly where is the data, what’s going on with the data, who has the data and if there’s a problem with the data - whether it’s a security problem or some other issue - there is immediate disclosure and complete and total transparency.
No secrets. Only through that transparency are we going to get to that higher level of trust - and that is not where we are today as we all know.
For her part, Mayer approached the question of trust in terms of trade-offs:
Ultimately trust is about someone weighing trade-offs. How much privacy do I have, how secure do I feel, what are the benefits that I get in exchange for that? You need to have transparency in that world, but you also need to afford the individual choice and control.
The users own their data. They should be able to examine it, take it with them, bring it to other sites and other vendors they trust more. We need a system and a market that helps people to make these trade-offs and these decisions, but they should have control over how they use the system or whether they use the system at all.
That’s all fine in theory, but in practice there are barriers, Mayer admitted:
Overall people sometimes have a difficult time making these trade-offs. Arguably some of the vendors are not being transparent enough or providing enough controls and choice. As we address that, it will make it much clearer for people in terms of the trade-offs they want to make.
When you look at mature industries, we all tell our governments where we live, what we look like in order to drive, how much money we make and how we make it in order to partake in civil services. There are a lot of areas where people already give up a lot of information about themselves, but they ultimately get a lot of benefit.
For his part, Fries said that his company was deliberately missing out on potentially millions of dollars of revenue by not exploiting big data trends, because of concerns around consumer trust:
We probably have access to 50 billion hours of viewing from our 27 million customers and 30 billion clicks a month and today we do nothing - we generate zero revenue from all that information.
There is a big problem with trust today. We’ve all seen this train wreck coming. Consumers have shared everything about their personal lives on the web. Ninety percent feel like they’ve lost control, 85% have tried to do something to protect themselves, but 60% know that they’re in trouble when it comes to sharing that information.
This is not to say that Liberty Global doesn’t want to move in on some of that revenue potential, he added:
Big data is big business for a lot of people. We’re not one of them. We’d like to be. We’re going to find a way to monetise this, but we have some principles. First principle is with consent. We won’t do anything in terms of personalised viewing or using your data for any other purpose unless you approve it. The first question when you log on is ‘Do you want us to use any of your data for personalised viewing?’ Seventy percent say yes and they sure like the fact that we’re asking. It’s not as if we use it first then ask them second. That’s a big difference to social media or other aspects of their internet experience.
He concluded that there is another external factor that complicates matters:
There’s a big disconnect between data protection and data retention, for the purposes of the government. On the one hand you’ve got governments saying that you’ve got to have consumers protected, on the other hand you’ve got to retain that data because we might need it for government or security purposes. There’s a disconnect there that has [consumers] worried.
And that’s where things start to get really interesting…