Facebook’s mea culpa messaging is showing no signs of letting up as the firm pays its (derisory) $5 billion fine to the US Federal Trade Commission (FTC) and $100 million to the Securities and Exchange Commission for making misleading disclosures.
While the company’s Apologist-in-Chief Nick Clegg bleats to the BBC that the firm has been “rocked to its very foundations” by the events of the past year or so, CEO Mark Zuckerberg has been giving his side of the story, insisting that it’s not just a case of the firm having “agreed” (!) to pay fines:
Even more importantly, we’re making some major changes to how we build our services and run this company. This will require investing a significant amount of our engineering resources in building tools to review our products and the ways we use data. It will also significantly increase our accountability by bringing the process for auditing our privacy controls more in line with how financial controls work at public companies with Sarbanes Oxley.
We’ll have to certify quarterly that we’re meeting all our privacy commitments. And just as we have an audit committee of our board overseeing our financial controls, we will now also have a new privacy committee of our board that will oversee our privacy program and work with an independent privacy auditor that will report to this new committee and to the FTC. We’re asking one of our most experienced leaders in product to take on the role of Chief Privacy Officer for Product, reporting to me and managing our privacy program. We’ll also be more rigorous in monitoring developers who access data through our platform. Together, we expect these changes will set a new standard for our industry.
All of this is a big deal, he insists, and one that’s going to take its toll on Facebook:
This is a major shift for us. We build services that billions of people trust every day to communicate with the people they care about. Privacy has always been important to the services we provide, and now it’s even more central to our future vision for social networking. It’s critical that we get this right, and we’re going to build it into all our systems. It’s going to take time to do this properly, and I expect it will take us longer to ship new products, especially while we’re getting this up and running. I also expect that, just as with the work we have been doing on safety and integrity, we’re going to continue to identify and fix issues as we develop our systems. But our goal is to build privacy protections that are as strong as the best services we provide, and this settlement gives us clear requirements moving forward.
On that last note, Zuckerberg follows up with a by-now familiar refrain - government needs to tell us what to do as we can’t be left to our own devices:
I’ve advocated that I believe the internet would benefit from governments setting clearer rules. I don’t believe it’s sustainable for private companies to be making so many decisions on social issues without a robust democratic process. Either the right regulations will get put into place, or we expect frustration with our industry will continue to grow.
This quarter I spent time in Europe talking with policymakers about how this could work. I met with [French] President Macron to discuss a framework for harmful content. This is an area where I believe there could be an effective public process led by democratically-elected governments in Europe, and perhaps in the US, a process for industry standards and self-regulation.
But Facebook won’t be waiting for this to happen, he added:
We just released our third transparency report, which shows the progress we’re making on many categories of harmful content, including hate speech and graphic violence, but also the areas where we still need to do better, like bullying and harassment. I believe more companies should release transparency reports that enumerate the prevalence of harmful content on their services. This would help companies and governments design better systems for dealing with it. After next year, we’re going to publish these reports quarterly so people can hold us accountable at the same cadence as earnings calls.
We’re also moving forward with our plans for an independent oversight board for decisions on content. I believe it’s important that people can appeal the decisions, and that we’re creating systems that are transparent and that people can trust. We’ve been working with experts on freedom of expression around the world, and we’ve gotten a lot of public input on how this could work, which we published in a report last month. We expect to form this oversight board by the end of this year.
With the 2020 US Presidential Election campaigns now up and running, Facebook will also be stepping up its efforts in “protecting elections”, said Zuckerberg, a point picked up by Chief Operating Officer Sheryl Sandberg:
We are committed to earning back trust through the actions we take. In the lead up to elections around the world, we’re doing all we can to get ahead of threats and develop smarter technology. The European Parliament elections in May were an important test for us. We created an operations center in Dublin to bring our experts together and make decisions quickly. We also worked closely with third parties across the EU, including 21 fact-checking organizations. These investments are starting to pay off and we remain committed to doing everything we can to stop bad actors.
We are also focused on increasing transparency. At the end of June, we made our transparency tools available globally for ads about social issues, elections or politics. These tools show who paid for an ad, how much they spent, and who saw the ad. Helping people understand who’s trying to influence their vote will help us better defend against foreign interference and other abuse. We’re going to continue to make investments to protect our platform, because it’s the right thing to do and it’s good for our business over the long-term.
As I’ve said on more than one occasion - that would bring a tear to a glass eye.
Any sympathetic hearing I might be inclined to afford Zuckerberg’s carefully scripted repentance goes out the window when he talks about agreeing to pay that pathetic rounding error of an FTC fine, as though he had a choice and somehow expects kudos for such a humble act.
Rocked to its very foundations, says Clegg. Maybe so - but does that stem from what it did or from being caught out doing what it did?