A Vanity Fair article sums the position nicely:
In the wake of Donald Trump’s shock electoral victory Tuesday, Mark Zuckerberg is forcefully pushing back against mounting criticism that Facebook enabled the Republican’s rise by failing to police an explosion of fake, pro-Trump news stories that went viral on the social network. “Personally,” he said onstage at a conference Thursday, “I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way—I think is a pretty crazy idea. Voters make decisions based on their lived experience.” He added: “Why would you think there would be fake news on one side and not the other?”
The apologist perspective
Still, fake news stories are frustrating beyond belief. They not only misinform. They also threaten one of the sacred tenets of the news media, which is that reputation for truth matters and that over time readers lose faith in outlets that publish bad information. That equation is broken, and it rattles the news media to its core.
But Facebook can’t fix it because the more you pull back the curtain, the more complicated it gets. Stories from publications that flat out don’t exist are easier to deal with, and Facebook already has a way of doing so that I think makes sense. Users can flag them as false so fewer people will see them. Facebook could go a step further and make sure the fact they have been flagged is very clear to the user, the opposite of the verified blue check mark it uses to mark legitimate businesses. It already does something similar in how it labels parody news sites as satire.
Lessin uses this to get into her favorite topic of a broader media critique, but not before she attempts to position Facebook as a good actor:
We shouldn’t let Facebook off the hook for every problem it creates or exacerbates. But we can’t hold it responsible for each of them either. We’re witnessing the effects of a world where the internet has driven the cost of saying whatever you want to whomever you want to zero, as Sam often says. This is an irreversible trend no company can stop, nor should we want them to.
In specific cases, like harassment and bullying, I think Facebook has an obligation to build safeguards into its products, and the company does. I don’t think misinformation, while malicious, is the same type of issue. As hard as defining harassment is, defining “truth” is way harder and probably impossible. It’s not as easy as figuring out what did or didn’t happen, which, ask any reporter, is rarely easy. It also requires mediating different points of view.
One final thought. The news media’s obsession with fake news scares me for one more reason—it is the latest example of how the industry is constantly pointing its finger elsewhere to explain its failings. Witness the claim “We didn’t miss the real story in this election; it was the polls.” Or, “it’s not that we aren’t serving readers with valuable stories; it’s that print advertising is down, and we can’t control it.”
Not buying it
I have a BIG problem with Lessin's analysis. On the one hand, I have absolutely no problem with her contention that the cost of content distribution is zero. All of us who develop content, in the broadest sense, need to understand this.
But arguing that something is hard is not a reason for sweeping aside criticism. By Lessin's logic, we should not be harsh on Twitter when it is used for abuse. But we're not that easy on Twitter.
During the election, my timeline was inundated with false stories of all kinds. It made separating fact from fiction a process involving huge gobs of friction and a massive uplift in my use of Snopes as the final fact checker.
My bigger problem though lies in Lessin's contention that it is primarily media's responsibility to fact check and that in any event, Facebook has enough 'fact checkers' to counter the impact that false news has on the reader.
Fact checking is a critically important role in a society. But I firmly believe it’s the news media’s job. As fake stories about the Denver Guardian popped up, the actual Denver Post knocked them down. Yes, the fake probably spread faster than the real, but both can reach a lot of people.
I partially disagree. Yes, fact checking is important, but the problem with Facebook's timeline is that it is almost wholly governed by things I see from like-minded people. During the last month of the election I saw a non-stop barrage of content that supported Hillary Clinton and denounced Donald Trump as the spawn of the devil.
Nowhere did I see any form of reasoned debate about the policies underpinning either candidacy, whether from media or from the people who appeared in my timeline.
Neither did I see much attempt at halting the spread of falsehoods. In one sense, it felt like people didn't care as long as their chosen candidate was slamming the other.
In short, all I saw was one endless advert for a single candidate, often wrapped up in visceral terms. It got to the point where I switched browsers and used a blocking tool so that I didn't see much of what was going on. That was far from ideal, but it saved me from the flood of election-related content. In all this, Zuckerberg doesn't get a free pass either. This Tweet popped up in my timeline today:
Facebook is now in the awkward position of having to explain why they think they drive purchase decisions but not voting decisions
— Casey Newton (@CaseyNewton) November 11, 2016
Let that sink in for a moment.
The Twitter comparison
Here's the thing. I don't follow the author of this Tweet but people with whom I am connected do. As a result, the Tweet above popped straight into my timeline. That would not happen on Facebook. As Ben Thompson said in his Stratechery newsletter on election day:
...technology has made objective truth a casualty to the pursuit of happiness — or engagement, to use the technical term — and now life and liberty hang in the balance...
...what is much more disturbing are the revelations that fake news is widespread in Facebook’s news feed; unsurprisingly, given they are human, many Facebook users wish to connect with people and things that confirm their pre-existing opinions, whether or not they are true. Make no mistake, this results in a great business: I have written effusively about Facebook’s financial potential and noted that the News Feed algorithm is a big reason why Facebook Squashed Twitter. Giving people what they want to see will always draw more attention than making them work for it, in rather the same way that making up news is cheaper and more profitable than actually reporting the truth.
And yet it is Twitter that has reaffirmed itself as the most powerful antidote to Facebook’s algorithm: misinformation certainly spreads via a tweet, but truth follows unusually quickly; thanks to the power of retweets and quoted tweets, both are far more inescapable than they are on Facebook.
There are plenty of people I know who have all but abandoned Twitter, and I get that. Unless you are super skilled, the 140-character limit crimps any ability to have meaningful discourse, while Facebook is a superbly efficient platform for that purpose.
But then Twitter is super fast at correcting falsehoods or pointing me in the direction of alternative points of view that I can digest alongside the Tweetstream. In that, I wholly agree with Thompson's assessment that Twitter cannot be allowed to wither on the digital vine.
Let's get back to that Tweet. I've argued on occasion that Facebook is an advertising fraud. To me, sending me false fans is no different to spreading false stories. If anything, the criticism Facebook faces today is only one side of a very difficult problem that Facebook has to solve, while at the same time serving its own interests. How it does this - or doesn't - will have a profound impact on media buyers and those who want to purvey marketing messages. Marketers are in the business of selling trust based stories. If they perceive that Facebook is taking little or no responsibility for that essential job then they'll be gone in a heartbeat.
While Facebook may well be the consumer media distribution channel du jour for much of America and the English-speaking world, it doesn't mean that its domination will last forever. Zuckerberg may well downplay the incidence of falsehood, but that really doesn't hold water when objectively tested against what I and others were seeing in our timelines. Facebook and the 'truth' will live on as a topic of intense debate.
Twitter's role remains important but we should all be very careful about ascribing cause and effect in an election where pretty much everyone failed.
For our part, we will continue to bat for the informed technology buyer, making clear our biases along the way. Therein lies our hope that readers will take what we say with that explicitly in mind, because to avoid that objective reality would be a massive disservice.