I agree with Nick - the cynical reading of Facebook's new 'Supreme Court' throws up a lot of important small print

Stuart Lauchlan, January 29, 2020
Toothless PR gesture or an independent body with real bite? Facebook's new Oversight Board ticks a lot of boxes, but raises a lot of questions.


There’s many a true word spoken in jest. Consider the following from Facebook’s Apologist-in-Chief Nick Clegg in an interview with Wired:

We know that the initial reaction to the Oversight Board and its members will basically be one of cynicism—because basically, the reaction to pretty well anything new that Facebook does is cynical.

As a rule of thumb, it pains me to agree with Clegg on just about anything, but on this occasion he’s right. And the publication yesterday of operational bylaws for how Facebook’s ‘Supreme Court’ will work in practice provides no reason to change that opinion, for now at least.

The Facebook Oversight Board was first touted by CEO Mark Zuckerberg in the early days of the post-Cambridge Analytica scandal ‘mea culpa’ apology drive. The objective of the Board would be to make difficult calls around appropriate and inappropriate content that Facebook doesn’t want to be seen to make off its own bat. Or as Zuckerberg put it:

The purpose of this body would be to uphold the principle of giving people a voice while also recognizing the reality of keeping people safe.

Zuckerberg has demonstrated something of a predilection for outsourcing ‘the difficult stuff’ - see also his plea to government to regulate him for his own good (and everyone else’s) or the recent shameless abdication of responsibility when it comes to policing bare-faced lying by politicians.

That said, anyone who thinks that Zuckerberg is about to loosen his control on the company and its policies is sadly mistaken. The bylaws presented this week tick a lot of the right boxes, but with just enough get-out-of-jail-free cards built in to ensure that at the end of the day it will be Facebook management who will be pulling the strings.

Read the small print

The bylaws themselves are a daunting read - 46 pages of legalese. But the important element - from a public image perspective at any rate - is that the new Board will provide a mechanism for appeal. Users will have 15 days to petition the Board - which will have around 40 members - to review content decisions on Facebook or Instagram. The Board then has a leisurely 90 days to come to a decision, so it’s not exactly going to be moving at ‘internet speed’. After a decision has been reached, there will be a seven-day period in which it should be implemented and enforced.

There will, however, be an expedited review process for cases that are judged to have “urgent, real world consequences”. In those situations, users will only have to wait a month for a decision, which may cause some of Clegg’s cynics to reflect on the definition of the word ‘urgent’.

Heather Moore, one of the authors of the bylaws produced by the Facebook governance team, spins the timescales thus:

That’s the maximum amount of time that the Board will take. Once the Board really gets up and running, we expect to improve its processes. It also may not take that long. It really depends on the type of case that they are reviewing. But there are other provisions in the bylaws that really allow for the board to request research from outside experts and really (fully) consider the matter. Depending on the issue that’s before it and the type of research that they want to commission to really build out their understanding of the issue that’s at play, its impact on the platform, we want it to be able to give them the amount of time necessary to really do that.

Leaving the lengthy decision time aside, there are other problems immediately apparent with the Facebook guidelines, most notably what they do - or rather don’t - cover. There is a whole stack of Facebook product offerings over which the Board will have no jurisdiction, as the small print reveals:

The following types of content are not available for the board’s review, unless reassessed in the future by Facebook: Content types: content posted through marketplace, fundraisers, Facebook dating, messages, and spam. Decision types: decisions made on reports involving intellectual property or pursuant to legal obligations. Services: content on WhatsApp, Messenger, Instagram Direct, and Oculus.

So if you want to post something offensive, choose your platform with care and you should get away with it.

Then there’s the fact that, certainly to begin with, the Board will only take on cases where content has actually been taken down, not when allegedly offensive or inaccurate content is still online. So if you’re a political group that’s posted a blatant lie and the post has been left up, the Board won’t be interested. If you’re that same group and your post has been taken down, then the Board will look into whether that was fair or not. With the US election due in November, that leaves a clear run for vested interests to keep on exploiting Facebook’s indifference to hosting misleading campaign information and propaganda.

If a case is brought successfully and the Board rules against Facebook’s decision, the spin from the company is that everyone, up to and including Zuckerberg, has to listen. But the bylaws confirm that what will actually happen is that Facebook will review each individual decision by the Board on a case-by-case basis. A single ruling on a particular topic will not be considered to have set a precedent for other similar cases.

Still, at least Facebook has put its money where its PR mouth is, committing to spend an initial $130 million over six years to fund this new entity. That is certainly good, but once again there’s a get-out built in. The actual wording is that Facebook will provide funding for “at least six years”. The “operational and procedural effectiveness” of the Board will be reviewed annually, with the inescapable implication that if the Board proves to be a pain-in-the-proverbial, there’s no commitment to take it beyond that six-year term - it can simply be quietly dropped.

The right man for the job?

On a more positive note, Facebook has made a good pick to head up the Board’s administration in the shape of Thomas Hughes, former Executive Director of digital rights activist group Article 19. Hughes will not work for Facebook, but for the Oversight Board as a separate structure. He says of his new appointment:

I see the role of director as a direct continuation of my work thus far over the last couple of decades because the oversight board has been created to ensure the rights of individuals are respected and that there is transparency and accountability in the application of the community standards.

But don’t expect immediate progress, he cautions:

Over the coming months I’ll be focusing on setting up the administration for the oversight board so that the board members can select which cases they would like to hear based on clear and transparent criteria and then be able to effectively and efficiently and confidentially review those cases. This will require us to hire staff and to set up processes and adopt tools to review cases which is an enormous undertaking and will take us many months.

In addition, Hughes says he wants the Board’s make-up to cover a broad spectrum of views:

The Board will be global and will therefore reflect a breadth of perspectives. I’m sure that there will be board members with whose opinion both you or I might not agree. However, this diversity is at the very heart of the Board’s rationale. I’m confident it will mean that we have stronger outcomes as a result.

My take

I agree with Nick - there will be lots of cynics about this venture and I’m one of them.

I hope that cynicism is eradicated by positive results in practice. When Hughes was at Article 19, he was one of 70 signatories to a letter to Facebook warning that it needed to think about the global implications of the power of content on its platform. That letter read:

As the world’s biggest communications platform, Facebook has the power to shape the news and content that we get to see. When content is removed in error, there are consequences for global freedom of expression.

As he starts his new role inside(ish) Facebook, Hughes has that quote front of mind. I don’t doubt his sincerity based on his track record as a digital activist. But it’s still Mark Zuckerberg’s world; we only get to ‘Like’ in it.
