Facebook has spent a lot of time trying to convince the world that it should not be regarded as a publisher and therefore cannot be held legally accountable in the way that newspaper or magazine publishers can. But this weekend saw a shift in that stance as CEO Mark Zuckerberg conceded that the firm is sort of a newspaper, sort of a telco - in other words, a hybrid entity if you’re feeling generous, a mutant if you’re not.
Speaking at the Munich Security Conference in Germany, Zuckerberg returned to a by-now-familiar refrain - pitching the line that he needs to be regulated in order for people to be able to trust Facebook. (Other social media miscreants are available.) As to what form that regulation should take, this is where the newspaper/telco line comes into play:
Right now there are two frameworks that I think people have for existing industries. There’s like newspapers and existing media, and then there's the telco-type model, which is 'the data just flows through you,' but you're not going to hold a telco responsible if someone says something harmful on a phone line. I actually think where we should be is somewhere in between.
While waiting for legislators to provide the necessary regulatory framework, he added, Facebook and others will just have to keep on trying. To indicate quite how selfless his firm is in this regard, he noted that the company now employs 35,000 people to review online content:
In the absence of that kind of regulation we will continue doing our best. But I actually think on a lot of these questions that are trying to balance different social equities, it is not just about coming up with the right answer, it is about coming up with an answer that society thinks is legitimate.
Facebook also regularly suspends more than one million fake accounts each day, he said:
The vast majority are detected within minutes of signing up. Our budget is bigger today than the whole revenue of the company when we went public in 2012, when we had a billion users…In an area where we do quite well, 99% of ISIS propaganda etc that we take down, our AI identifies it before anyone sees it.
But he was quick to add that more than 100 billion pieces of content are posted to Facebook every day, meaning that:
It is simply not possible to have some kind of human editor responsible to check each one.
So regulation is needed and Zuckerberg is ready to ‘take the hit’ if necessary, he said:
Even if I’m not going to agree with every regulation in the near term, I do think it’s going to be the thing that helps create trust and better governance of the internet and will benefit everyone, including us, over the long term.
Ahead of a meeting with European Commission officials today, Zuckerberg also had a call to arms for them and other Western authorities - deal with this issue or watch while others encode “authoritarian values” into internet regulation:
To encode democratic values, open values, we’ve got to move quickly before more authoritarian models get adopted in a lot of places first. Where people can share and inform communities, that is a positive force.
We don't want private companies making so many decisions about how to balance social equities without any more democratic process.
The Facebook boss returned to another familiar argument - that social media is ultimately a force for good and encourages the spread of more diverse world views than traditional channels, such as newspapers or TV networks:
Each media outlet has kind of their own editorial view and slant that they bring to things. So the data that we've seen is actually that people get exposed to more diverse views through social media than they were before through traditional media through a smaller number of channels.
A more interesting assertion was his reminder that Facebook users need to take some responsibility themselves for what appears in their accounts. Noting that the average user has around 200 ‘friends’, he suggested there’s not going to be much variation in the views expressed:
People are less likely to click on things and engage with them if they don't agree with them. So, I don't know how to solve that problem. It is not a technology problem, it is a social affirmation problem. The choice of what you see is based on the balance of what you share, rather than by choosing what you see. If your cousin has had a baby, we had better make sure that is near the top…The content that shows up in your news feed is primarily not determined by us. It's things that other people share. You choose who your friends are and what pages and what businesses you want to follow, and then, that's the content that's eligible to show up there.
The confirmation bias issue is a genuine problem and - brace yourselves - one where I have some sympathy for Zuckerberg. To take one example, the outcome of the Brexit vote in the UK was accompanied by howls of protest from those who voted to remain in the EU. The vote was rigged, ran the conspiracy theories - it must have been, because no-one on my timeline was voting to leave! That said, Facebook hasn’t exactly made it simple for users to take more explicit control of what turns up in their news feeds. I pride myself on being pretty on top of my privacy settings, but I still get worrying amounts of unwanted content turning up.
There’s still an awful lot of deflection going on with Zuckerberg’s messaging. He’s saying all the right things - or what Apologist-in-Chief Nick Clegg thinks are the right things - but the proof will come when legislators do start to roll out regulations. The lobbyists in Brussels and Washington will be ramping up their expense accounts at that point.
Oh, and he said he was cool with the idea of paying more tax outside of the US. But I’m not even going to bother with that one…