What can enterprises learn from Facebook's no good, very bad week?
- Facebook just had a terrible, horrible, no good, very bad week. The stakes are high - and the enterprise lessons are plenty. Here's my analysis.
- Inside Facebook’s Hellish Two Years—and Mark Zuckerberg’s Struggle to Fix It All - Wired turned its attention to Facebook via a devastating long form epic that brought Facebook's algos-versus-humans Waterloo into sharp focus.
- Early Facebook and Google Employees Form Coalition to Fight What They Built - Insult-to-injury via a New York Times exposé on early employees/idealists that turned on their own creations.
Combing through these pieces will eat up the better part of your weekend. Either way, here's my enterprise boil-down.
Wired focuses on a two-year period when the human Trending Topics team - now disbanded - was engaged in the thorny endeavor of applying human curation to volatile topics, including, naturally, politics. A couple of team members were fired for perceived or actual press leaks, heightening the internal tension. Wired interviewed 51 current and former Facebook employees. Lessons:
1. Politics can no longer be extracted from the business. It's not just Facebook that's burdened with political curation problems. With social identities a factor in every workplace, online controversies can and will spill over, discrediting the notion of a neutral workplace.
One famous example: the Google employee whose infamous memo eventually got him canned - but not before exposing Google's cultural problems and putting the company in a difficult PR position (the employee is now suing the company).
2. Workers are aware of automation threats, and it affects their morale. The AI narrative still varies wildly from optimistic to apocalyptic. But I find the prevailing view is that AI will help users become more productive - if not superhuman. However: workers are well aware that automation could change their lives, and not necessarily for the better. Via Wired:
Eventually, everyone assumed, Facebook’s algorithms would be good enough to run the whole project, and the people on Fearnow’s team—who served partly to train those algorithms—would be expendable.
3. Techno-optimism is a fragile plank to build a culture on. Prior to Brexit and the U.S. election, tech companies like Facebook could sell their workers on the world-changing wonderfulness of their daily toil. That's a far more compelling narrative than the cynical/fragile "get rich with stock options" career promissory note.
The truth is that globalist sensibilities are facing a fierce challenge from an immigration-wary, nationalistic push in many countries. Tech companies currying favor across borders risk being caught between their globalist ideals/talent strategies and the realities of working with problematic regimes. As Wired noted, this can backfire on employee morale:
The stories varied, but most people told the same basic tale: of a company, and a CEO, whose techno-optimism has been crushed as they’ve learned the myriad ways their platform can be used for ill.
4. "Move fast and break things" might be disruptive - but so is a boomerang when it comes back on your head. Wired documents how Facebook's early ethos of buying or conquering all adversaries came back to haunt it. Attempts to crush Twitter by "newsifying" the News Feed algorithm had all kinds of unintended consequences.
Publishers were brought to heel if they wanted a piece of Facebook's eyeballs. But now we have a reversal of the News Feed algo philosophy to re-emphasize social content. And: Facebook is in hot regulatory water (if Facebook were classified as a media company, it would usher in a range of unwanted financial/regulatory consequences, not the least of which is liability for user content).
5. Veering from your core values is easy and the fallout is huge. Perhaps the most unsettling part of the Wired piece is the portrayal of Facebook's too-late realization that external actors were manipulating the News Feed ahead of the election. Meanwhile, Facebook was obsessed with pleasing external constituents, such as conservative groups accusing the company's platform of political bias. No, there wasn't a smoking gun. But this was clearly a distracted company. Navel-gazing attempts to address News Feed quality issues were the equivalent of baby steps with hurricane season approaching.
Investor and Zuckerberg mentor Roger McNamee arguably could have done more to voice concerns about the strange manipulations he noticed on the platform, months before the election. Nevertheless, nine days before the vote, McNamee finally reached a breaking point. Wired shared part of his letter to Sheryl Sandberg and Mark Zuckerberg:
“I am really sad about Facebook,” [his letter] began. “I got involved with the company more than a decade ago and have taken great pride and joy in the company’s success … until the past few months. Now I am disappointed. I am embarrassed. I am ashamed.”
Early Facebook and Google Employees Form Coalition to Fight What They Built hits on similar themes, but with a broader angle beyond Facebook. The Center for Humane Technology is also concerned with the combined impact of addictive social sites and ubiquitous phones, particularly on children. Says one of the founders:
“Facebook appeals to your lizard brain — primarily fear and anger,” he said. “And with smartphones, they’ve got you for every waking moment.”
Former employees of Facebook join a chorus of dissent. Consider this from The Verge (Former Facebook exec says social media is ripping apart society):
Chamath Palihapitiya, who joined Facebook in 2007 and became its vice president for user growth, said he feels “tremendous guilt” about the company he helped make. “I think we have created tools that are ripping apart the social fabric of how society works,” he told an audience at Stanford Graduate School of Business, before recommending people take a “hard break” from social media.
Palihapitiya's concerns also extend beyond Facebook and election manipulation:
Palihapitiya’s criticisms were aimed not only at Facebook, but the wider online ecosystem. “The short-term, dopamine-driven feedback loops we’ve created are destroying how society works,” he said, referring to online interactions driven by “hearts, likes, thumbs-up.” “No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem — this is not about Russian ads. This is a global problem.”
Facebook's woes are part of a bigger picture. It's a blow to techno-optimists, but for those of us who never believed in tech's utopian powers, a more realistic discussion is welcome.
I realize we are talking about a massively successful company that is 200 times more valuable than the New York Times - even if ad click fraud allegations from companies of all sizes will keep Facebook from getting too comfortable.
The Wired piece actually ends on a somewhat optimistic note, with a reflective Zuckerberg making changes that show Facebook is at least acknowledging a middle ground between outright publisher and neutral platform. Talk of boosting the content of trusted publishers in the algorithm is one sign of that.
These problems go well beyond business into the realm of a fundamental threat to democracy, exacerbated by a sped-up emoji culture ill-equipped to sort this.
But hold up - the future of democracy is beyond the scope of enterprise thinking. Or is it? There are no corporate externalities anymore - just problems we are implicated in that we can't afford to leave off the human balance sheet. Look no further than Unilever, which is warning Facebook it may pull lucrative ad contracts. Why? Not just because of ad fraud. But because of the consequences of playing do-gooder and neutral-platform at the same time. As per The Verge:
Facebook executives have pledged to tackle fake news and online fraud after Unilever warned tech firms that it will pull lucrative advertising contracts if companies allow their platforms to “breed division."
Confronting fears of automation and communicating openly about what we are up against is foreign to most companies. From a communications and policy angle, I don't have any cookie cutter answers.
But as I review Facebook's knee scrapes, one thing is clear: fence sitting is an illusion. If you don't take a position, you will be assigned one, and it might not be flattering.
That's true on a corporate level as well. Companies committed to globalism, diversity and equal opportunity can't just say pleasant things about their values. Nor can they cozy up to policy makers who threaten those values. Taking a stand for a diverse workforce should not be avoided - and no, this is not a political issue. Fear of those who will inevitably politicize and distort is a pathetic excuse for corporate compromise.
Unilever didn't hold back. Their stern warning to Facebook reinforced their own anti-racist, anti-sexist values - while calling out Facebook on theirs. You don't win talent wars with a milquetoast version of your corporate culture.
Techno-optimism has taken a hit, but companies - in and out of the tech sector - have a chance to find their spines. Show us that the changes we are undergoing are not to be feared but to be faced head on. That will do wonders for morale and recruitment - and the blowback that comes is going to find us either way. Just ask Google - or Facebook.
End note: this post title was obviously inspired by the children's classic, Alexander and the Terrible, Horrible, No Good, Very Bad Day.