Mark Zuckerberg, Facebook CEO, has rebuffed calls from UK MPs to appear in person before Parliament to give evidence, following revelations that political consultancy firm Cambridge Analytica allegedly acquired user data from the social network to influence elections.
Zuckerberg has instead opted to send either Facebook CTO Mike Schroepfer or Chief Product Officer Chris Cox.
In a letter to Damian Collins MP, chair of the UK’s Digital, Culture, Media and Sport (DCMS) Select Committee, head of public policy for Facebook UK, Rebecca Stimson, said:
Facebook fully recognizes the level of public and Parliamentary interest in these issues and supports your belief that these issues must be addressed at the most senior levels of the company by those in an authoritative position to answer your questions.
As such Mr Zuckerberg has personally asked one of his deputies to make themselves available to give evidence to the Committee in person.
Both Chris Cox and Mike Schroepfer report directly to Mr Zuckerberg and are among the longest serving senior representatives in Facebook’s 15-year history. Both of them have extensive expertise in these issues and are well placed to answer the Committee’s questions on these complex subjects.
The letter states that one of the selected representatives can be made available to the Committee straight after the Easter Parliamentary recess. Collins had written to Facebook last week after the scandal broke, stating that he hoped Zuckerberg would appear in person, given the “catastrophic failure of process”.
During a Committee evidence session today, Collins urged Zuckerberg to rethink his decision. He said:
I think that it's absolutely astonishing that Mark Zuckerberg is not prepared to submit himself to questioning in front of a Parliamentary or Congressional hearing, given that these are questions of fundamental importance and concern to Facebook users...and to our inquiry as well.
I would certainly urge him to think again if he has any care for people who use his company's services.
Cambridge Analytica and Facebook are at the centre of a scandal over how companies use people’s data, after reports emerged that Cambridge Analytica allegedly used Facebook data to target voters and influence the outcome of the recent US election.
Both Cambridge Analytica and Facebook recently told legislators in the UK that Facebook data had not been used in this way and that Cambridge Analytica did not hold such information.
Cambridge Analytica is also facing a search of its offices and servers by the UK’s independent data protection regulator, the Information Commissioner’s Office (ICO).
Two reports out in the past few days in the UK – one by the Observer and the other by Channel 4 News – have highlighted how Cambridge Analytica and its executive team mined Facebook data to influence voter behaviour, by creating highly targeted, emotive campaigns on social media.
It’s a complex story, but Channel 4 News secretly filmed Cambridge Analytica’s since-suspended chief executive, Alexander Nix, saying that the company does a lot more than “deep digging”, adding that one way to target an individual was to “offer them a deal that’s too good to be true and make sure that’s video recorded”. He also made reference to being able to “send some girls around to the candidate’s house”.
Christopher Wylie, who worked with the company, told the Observer that it amassed the data of millions of people through a personality quiz on Facebook called This is Your Digital Life that was created by an academic (Aleksandr Kogan).
The Secretary of State for DCMS recently told MPs that the ICO is considering how political parties and campaigns, data analytics companies and social media platforms in the UK have used people’s personal information to micro-target voters. As part of the investigation, the commissioner is looking at whether Facebook data was acquired and used illegally.
Tom Watson, deputy leader of the Labour Party and shadow culture secretary, tweeted his dismay at Zuckerberg choosing not to appear in person. He said:
After listening to Christopher Wylie’s devastating testimony to the DCMS Select Committee it’s probably worth saying that this isn’t just cowardly, it’s completely unacceptable. https://t.co/P2N5C8KFqI
— Tom Watson (@tom_watson) March 27, 2018
The EU response
As well as facing pressure from UK regulators, Facebook is also being pressed by the European Commission to answer questions over whether EU citizens’ data were among those improperly harvested by Cambridge Analytica.
EU Justice Commissioner Vera Jourova wrote a letter to Facebook Chief Operating Officer Sheryl Sandberg, which stated:
Have any data of EU citizens been affected by the recent scandal? If this is the case, how do you intend to inform the authorities and users about it?
She added that the statements made by Facebook executives since the scandal broke had not alleviated her concerns. Jourova continued:
This is particularly disappointing given our efforts to build a relationship based on trust with you and your colleagues ... this trust is now diminished.
Jourova said that she expects a reply to her letter within two weeks.
In a recent post on his Facebook page, Zuckerberg attempted to address the furore by citing examples of what the company is doing to restrict access to user data, claiming that many of the problems occurred before the social network implemented stricter controls in 2014. He said:
First, we will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014, and we will conduct a full audit of any app with suspicious activity. We will ban any developer from our platform that does not agree to a thorough audit. And if we find developers that misused personally identifiable information, we will ban them and tell everyone affected by those apps. That includes people whose data Kogan misused here as well.
Second, we will restrict developers' data access even further to prevent other kinds of abuse. For example, we will remove developers' access to your data if you haven't used their app in 3 months. We will reduce the data you give an app when you sign in -- to only your name, profile photo, and email address. We'll require developers to not only get approval but also sign a contract in order to ask anyone for access to their posts or other private data. And we'll have more changes to share in the next few days.
Third, we want to make sure you understand which apps you've allowed to access your data. In the next month, we will show everyone a tool at the top of your News Feed with the apps you've used and an easy way to revoke those apps' permissions to your data. We already have a tool to do this in your privacy settings, and now we will put this tool at the top of your News Feed to make sure everyone sees it.
Zuckerberg’s response to the scandal has been broadly criticised, as it was seen as an attempt at damage control rather than an outright apology for the mistakes that were made. The company has lost over $100 billion in market value since the story broke, and Zuckerberg is all too aware that the affair could lead legislators to impose stricter regulations and controls on the social media giant’s activities.
There’s a growing sense that users are uncomfortable with the lack of insight into how their data is being used, and that Facebook isn’t keeping up its end of the bargain to protect its users. Whilst many will be aware that with a free product like Facebook, you the user are the product, there’s a growing consensus that perhaps things have gone too far.
However, the vitriol around the debate needs to give way to a sensible discussion about what controls and safeguards can be put in place to limit scandals of this nature in future. It’s not that hard, really. Users don’t mind handing over their data if they know exactly what it is being used for. And they want control over, and insight into, how that data is used. If I don’t want my data used in a certain way, I want the ability to stop that. Regulations need to ensure that companies such as Facebook play by the rules expected of a modern, digital company. And if they don’t, Facebook can expect users to flock elsewhere.