Weekend musing - a simple Kill Switch solution to the mess that is Facebook

By Den Howlett, July 21, 2018
Summary:
Facebook can't seem to catch a break in the court of public opinion but there may be a simple solution to its woes. Bring on the Kill Switch.

Could a user-friendly Kill Switch provide Facebook users the one thing that's missing from the eponymous social platform?

Last week, Facebook got itself in hot water (again) when CEO Mark Zuckerberg stepped onto the Holocaust denier landmine by saying that those folk should not be kicked off Facebook. Outrage ensued across social and traditional media channels, along with some reasoned analysis and suggestions for a fix.

In her weekly missive, Jessica Lessin, CEO and founder of The Information, believes that government - presumably the US government - will inevitably step in and regulate Facebook as a publisher:

Forcing Facebook to be a publisher, which heavily edits and moderates content, may solve some problems. Regulators would be happy because Facebook would be easier to control. There would be fewer PR catastrophes caused by content deemed offensive.

I should say right here that this is the inevitable direction things are headed. Public and regulatory pressure is going to push Facebook into the media company terrain more and more. Its business model sort of makes this inevitable too. As long as it is making money off ads, it is hard to argue that the utility use case is dominant. I may believe that conspiracy theorists have a right to use communication tools, but I don’t think Facebook should be profiting from them.

But we should think through the consequences of Facebook exercising a heavier content moderation hand. That direction will stifle a huge number of views and close off groups of people to tools and communication. Some people, say child pornographers, are so abhorrent, we may not care. But don’t fool yourself. Most cases would fall in grey areas—things that aren’t true or false but are deeply offensive to some.

In other words, Facebook would draw a line and dissenting views would be forced offline to other platforms.

For his part, Josh Bernoff comes up with a different solution that is similar to the way Wikipedia works and is one that relies on Facebook coming up with a credible policy that has teeth:

Facebook should make it possible for individuals to apply to be “citizen fact-checkers.”

If you suggested yourself for this volunteer position, you could mark sites for review. You would have to do work. You would have to identify which elements of the site were false and which were hateful.

In the beginning of your work in this position, Facebook would not give your effort much weight.

But over time, Facebook would review your selections.

Facebook knows which sites are popular with different groups. For example, they know which sites Trump voters like, and which sites liberals prefer.

Facebook would review your selections for balance. If you tended to mark only conservative sites for deletion, your vote would not count. The same applies to people who marked only liberal sites.

But if you were balanced in your selections, your vote would have more weight.

Facebook would remove sites that were marked for deletion by large numbers of citizen fact-checkers who have a balanced perspective, and were able to identify clearly false and hateful elements.
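Stripped to its mechanics, Bernoff's proposal is a vote-weighting scheme: a volunteer's flags only count insofar as their flagging history is balanced across political leanings, and a site comes down once enough weighted votes accumulate. A minimal sketch of that idea - the leanings, weighting formula, and threshold below are my own illustrative assumptions, not anything Facebook or Bernoff has specified:

```python
from collections import Counter

def reviewer_weight(marks):
    """Weight a reviewer's vote by how balanced their flagging history is.

    marks: list of leanings ('left' or 'right') of sites this reviewer
    has previously flagged. Returns 1.0 for an even split, 0.0 for a
    one-sided history or a brand-new reviewer.
    """
    if not marks:
        return 0.0  # new reviewers carry no weight at first
    counts = Counter(marks)
    left, right = counts.get("left", 0), counts.get("right", 0)
    # Balance score: 1.0 when marks are evenly split, 0.0 when one-sided.
    return 1.0 - abs(left - right) / (left + right)

def should_remove(site_votes, threshold=5.0):
    """Remove a site once the weighted votes against it cross a threshold.

    site_votes: one flagging history per reviewer who marked this site.
    """
    total = sum(reviewer_weight(history) for history in site_votes)
    return total >= threshold

# A reviewer who only ever flags one side carries no weight at all:
print(reviewer_weight(["right"] * 10))          # 0.0
print(reviewer_weight(["left", "right"] * 5))   # 1.0
```

The interesting design property is that a coordinated partisan brigade contributes nothing: ten one-sided reviewers sum to a weight of zero, while a handful of balanced ones can clear the bar.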

On their face, both solutions sound credible, if hard to implement, but in my view they represent a uniquely American view of the world as it relates to media.

From what I can gather as a non-American attempting to understand how these things work, freedom of speech as enshrined in the First Amendment to the US Constitution is so sacrosanct that finding an answer to Facebook's problems within that framework resembles one of the Labors of Hercules.

To me, there is a much simpler solution. Forget moderation except as it relates to 'speech' that is clearly designed to hurt others and which is broadly understood under the laws applicable to the territories in which Facebook operates. That, by the way, would solve for Holocaust deniers in Germany, where denying the Holocaust is against the law.

Instead, give users the equivalent of a Kill Switch for content they don’t want to see.

Something I fervently disagree with, or which offends me, turns up? Kill it so I don't have to see it. Let those who want to have conversations I find too whacked out for my taste carry on in their own filter bubble, but give ME the choice as to whether I see it or not.

That way, no-one's freedom of expression is impacted, however crazy I might find it, while I can happily ignore it. We could go one step further and allow people to moderate conversations inside social media.

If that sounds too simplistic, then consider this. If you are reading this now, then you are choosing to read what's said. I don't mind if you disagree in your comments, provided that you're respectful of my POV. Start with offensive language or ad hominem attacks and I'll moderate you out. If you persist, then I'll block you. This is what happens every day in any other kind of responsible media. In a similar vein, the range of publications I read will be largely unique to me. It will include titles with which very few will be familiar. It also excludes titles I find risible. The point is that I have that choice everywhere except Facebook and, arguably, Twitter.

Lessin thinks that we should be careful about what we might lose as the spectre of regulating Facebook comes into view. While making the media argument, she says that comparing Facebook with the New York Times is 'silly' based on a comparison of scale. That argument is illogical, and more so if you think (as many already do) that Facebook is media.

You may be wondering how this proposal jibes with Facebook's renewed algorithmic emphasis on "friends and family." If your News Feed is largely filled with updates from friends, and you can "mute" any post you see from a friend to avoid further updates, isn't that good enough?

No. Whether a Facebook post links to content I find offensive on or off Facebook, a Kill Switch should prevent certain sites and topics from displaying in my feed - even if my most trusted pal pushes the content. And yes, that goes for third-party advertising too. As for friends posting objectionable views of their own, well, even Zuckerberg can't help you choose your friends.
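For what it's worth, the Kill Switch I'm asking for is technically trivial - a per-user block list of sites and topics, applied when the feed is assembled, regardless of who shared the item. A rough sketch of the idea; every name and field below is a hypothetical of mine, not any actual Facebook API:

```python
# Minimal sketch of a per-user Kill Switch: content the user has
# "killed" (by source site or topic) never reaches their feed,
# no matter who shared it. Everyone else still sees it.

def build_kill_switch(blocked_sites=(), blocked_topics=()):
    """Return a predicate that decides whether a post may appear."""
    blocked_sites = {s.lower() for s in blocked_sites}
    blocked_topics = {t.lower() for t in blocked_topics}

    def allowed(post):
        if post.get("source", "").lower() in blocked_sites:
            return False
        if blocked_topics & {t.lower() for t in post.get("topics", [])}:
            return False
        return True  # note: who shared it is never consulted

    return allowed

feed = [
    {"source": "example-news.com", "topics": ["politics"], "shared_by": "best pal"},
    {"source": "recipes.example",  "topics": ["cooking"],  "shared_by": "best pal"},
]
my_filter = build_kill_switch(blocked_topics=["politics"])
visible = [p for p in feed if my_filter(p)]
print([p["source"] for p in visible])  # ['recipes.example']
```

The point of the sketch is that the filter deliberately ignores the `shared_by` field: the kill applies even when my most trusted pal pushes the content, exactly as argued above.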

So c'mon, Zuckerberg, give us the Kill Switch. You don't lose anything. If anything, you gain, because now you really can target your precious ads while simultaneously solving everyone's problems in one go.