
Cook and Zuck - a divine feud with enterprise implications for data privacy best practice

Stuart Lauchlan, February 2, 2021
Apple's Tim Cook and Facebook's Mark Zuckerberg square off with increasingly sharp words in the latest tech sector catfight, but it's a tussle that has wider implications around privacy, both personal and corporate.


An inter-connected eco-system of companies and data brokers, of purveyors of Fake News and pedlars of division, of trackers and hucksters just looking to make a quick buck, is more present in our lives than it has ever been.

Who on Earth can Apple CEO Tim Cook be talking about? While never actually resorting to using the F-word directly, most of the audience listening to Cook’s impassioned address to the Computers, Privacy & Data Protection Conference last Friday would have been left in little doubt that it was a certain social media platform that was front and center as the target of his ire.

Given that Facebook CEO Mark Zuckerberg had himself taken time out to attack Apple less than 24 hours earlier, it's evident to all that the enmity between the two firms is becoming ever more vehement. Why does it matter if the heads of two tech behemoths get into a bitch-slapping contest? Because what they’re squabbling over is privacy, at both a personal and an enterprise level. The stakes here are high.

Apple bites

While Apple has worked hard in recent years to carve out a role for itself as a privacy champion, inadvertently aided by Facebook’s stumbling through crisis after crisis on the same front, tensions between the two have ramped up of late, largely due to the potential threat to the latter’s bottom line posed by new initiatives from the former.

Apple has two privacy developments in the wings that will hurt Facebook. The first is its Privacy Nutrition Label, which will mean that every app in the App Store must share information on its data collection and privacy practices in a form that every user can understand and act on. In other words, keep it simple - no elongated obfuscation behind poly-syllabic legalistic lexicons.

The second is called App Tracking Transparency, which will enable users to see which apps have asked for permission to track their data and to block them from doing so if they wish. Given that Facebook needs to be able to track basically everything that its users do online in order to prop up its ad business, this is pretty bad news for it. Apple may not be actively prohibiting tracking here, but it is handing users the power to shut themselves off from Facebook and others by requiring app makers to seek permission to gather data.

To demonstrate the scale of the problem here, Cook cited Apple research that found that everyday apps contain an average of six types of tracker:

Right now, users may not know whether the apps they use to pass the time, to check in with their friends or to find a place to eat, may in fact be passing on information about the photos they’ve taken, the people in their contact list, or location data that reflects where they eat, sleep or pray…It seems that no piece of information is too private or personal to be surveilled, monetised, and aggregated into a 360-degree view of your life. The end result of all of this is that you are no longer the customer; you are the product.

Facebook - defender of small businesses

Facebook’s response here has been to play the ‘defender of small businesses’ card, arguing that the issue isn't so much that Apple’s new rules will hurt larger firms - say, a multi-billion dollar turnover global social media platform provider, for example - as that they will put ‘Mom & Pop’ outfits around the world through the wringer. It’s also going to mean that users won’t get the benefit of personalized ads on Facebook, just any old ads that get served up. The concern level associated with that last pitch is entirely dependent on how ‘targeted’ you feel what turns up on your Facebook page currently is, of course, but in Zuckerberg’s words:

When you hear people say that we’re connecting data from lots of sources, that’s to help small businesses reach customers more efficiently. Big companies often do this themselves, but small businesses can’t a lot of times, so we do this for them. When you hear people argue that we shouldn’t be doing these things or that we should go back to the old days of un-targeted television ads, I think that what they’re really arguing for is a regression, where only the largest companies have this capacity, small businesses are severely disadvantaged and competition is diminished. 

But given the growing levels of concern around the scope of data tracking, Facebook has to be seen to do something, and the something that it has chosen involves rolling out an in-app prompt aimed at educating - a carefully chosen word - users about how their data is being handled. Now being piloted in the US, this takes the form of a screen pop-up that asks permission to use data from third-party websites and apps, while also showing the user how data is used to personalize their ad experience. It’s all about providing “additional context”, according to Facebook.

Double standards?

Meanwhile Zuckerberg’s attack line is that Apple is acting out of commercial self-interest:

We are seeing Apple’s business depend more and more on gaining share in apps and services against us and other developers, so Apple has every incentive to use their dominant platform position to interfere with how our apps and other apps work, which they regularly do to preference their own. This impacts the growth of millions of businesses around the world, including with the upcoming iOS 14 changes [which mean] many small businesses will no longer be able to reach their customers with targeted ads. Now, Apple may say that they’re doing this to help people, but the moves clearly track their competitive interests. I think that this dynamic is important for people to understand because we and others are going to be up against this for the foreseeable future.

There are double standards at play here, he insists, pointing to Apple’s messaging tech:

We have a lot of competitors who make claims about privacy that are often misleading. Apple recently released so-called Nutrition Labels, which focus largely on metadata that apps collect, rather than the privacy and security of people’s actual messages. But, iMessage stores non-end-to-end-encrypted backups of your messages by default, unless you disable iCloud, so Apple and governments have the ability to access most people’s messages. So, when it comes to what matters most, protecting people’s messages, I think that WhatsApp is clearly superior.

But then again, Facebook has of late run the gauntlet of (more) bad publicity over plans to change WhatsApp’s Ts & Cs such that users will have to agree to new privacy rules, including sharing data with other Facebook products, or else lose their access to the app. This concern has all just been (another) terrible misunderstanding, according to Zuckerberg, as the changes are really all about businesses and customer service levels for users and therefore for the greater good of all:

The more people that interact with businesses, the better tools that we’re going to need to provide for businesses to help them support their customers. Many businesses need more than a phone to manage their customer service, so we’re building tools to let businesses store and manage their WhatsApp chats using our secure hosting infrastructure, if they would like. And we’re in the process of updating WhatsApp’s privacy policy and terms of service to reflect these optional experiences.

To clarify some confusion that we’ve seen - this update does not change the privacy of anyone’s messages with friends and family. All of these messages are end-to-end encrypted, which means we can’t see or hear what you say, and we never will, unless the person that you message chooses to share it. Business messages will only be hosted on our infrastructure, if the business chooses to do so.

The deadline for users to agree to the new WhatsApp terms of service - which have not been altered despite the bad reaction - has now been pushed back to May, leaving plenty of time for further escalation of the inflammatory rhetoric between Facebook and Apple - and for both users and enterprises alike to come to their own conclusions about who they trust here.

My take

As noted above, the stakes are high and users, both personal and corporate, need to decide which side they’re on here or risk being swept along by the tide towards a status quo over which they will have no influence. From my point of view, for all that Apple obviously does have commercial influences in play, I’m far more prepared to trust Cook than Zuck with my data any day of the week. Others may disagree - please do! - but that’s where I stand.

On a much more trivial note, neither Cook nor Zuckerberg comes out of central casting as a fiery orator declaiming from the corporate pulpit. On the face of it, this is not so much a Bette Davis and Joan Crawford style of divine feud as two techies getting mildly irate in terms of presentation style - mild-mannered Cook vs dorm nerd Zuckerberg. Those of us with long-enough memories may recall with fond nostalgia the increasingly shrill claims and counter-claims that characterised the database wars of the 1990s - AKA Larry Ellison vs the Rest of the Industry, Ellison strongly favored! - as a prime example of how tech tribes go to war.

But in fact, Cook’s simmering - and increasingly personal - anger brings out the importance of this debate on a wider scale:

The path of least resistance is rarely the path of wisdom. If a business is built on misleading users, on data exploitation, on choices that are no choices at all, it does not deserve our praise; it deserves reform...We can no longer turn a blind eye to a theory of technology that says, 'All engagement is good engagement, the longer the better’.

Too many are still asking the question, 'How much can we get away with?', when they need to be asking, 'What are the consequences?’. What are the consequences of prioritizing conspiracy theories and violent incitement, simply because of the high rates of engagement? What are the consequences of not just tolerating, but rewarding content that undermines public trust in life-saving vaccinations? What are the consequences of seeing thousands of users join extremist groups and then perpetuating an algorithm that recommends even more? It is long past time to stop pretending that this approach doesn't come with a cost - of polarization, of lost trust - and yes, of violence.

A social dilemma cannot be allowed to become a social catastrophe.

Them’s fighting words!

