More data privacy incidents from Facebook, Google, YouTube spark a fight back for the high ground

By Stuart Lauchlan, September 10, 2019
Summary:
More data privacy problems for social media giants as Facebook makes a play to claim thought leadership. Yeah, good luck with that.


The US House of Representatives antitrust panel will be looking into consumer data collection practices - and abuses - by major social media platforms at a hearing on Thursday. And judging by fresh revelations about Facebook and Google on the subject of data privacy, not a lot has been learned by either.

Facebook has somehow managed to top its Cambridge Analytica revelations by presiding over one of the largest data exposures in history. While the Cambridge Analytica incident saw around 50 million user accounts illegally harvested, the firm now stands accused of leaving the phone numbers of 419 million users sitting on an unsecured server alongside their Facebook IDs.

Facebook insists that following changes last year to the way users can search on the platform, there’s no real risk:

This dataset is old and appears to have information obtained before we made changes last year to remove people’s ability to find others using their phone numbers.

Nonetheless, many critics, including European Union regulators, are likely to take the revelations as further evidence of Facebook’s lax attitude to privacy concerns.

Meanwhile Google is on a collision course with EU privacy laws after rival browser maker Brave accused the firm of using hidden web pages, assigned to individual users, to gather data that advertisers can harvest without consent. The accusations emerged as part of an ongoing investigation by the Irish data regulator into whether and how Google uses sensitive user data for advertising purposes.

Google insists there’s nothing to see here:

We do not serve personalised ads or send bid requests to bidders without user consent.

The Irish Data Protection Commission and the UK Information Commissioner’s Office are already looking into real-time bidding in order to assess its compliance with GDPR.

Separately, the firm’s YouTube arm has been hit with a $170 million fine from the US Federal Trade Commission (FTC) and the New York Attorney General for collecting ad tracking data on kids’ videos from children under the age of 13 without parental consent, contrary to the federal Children’s Online Privacy Protection Act (COPPA). According to FTC Chairman Joe Simons:

YouTube touted its popularity with children to prospective corporate clients…the company refused to acknowledge that portions of its platform were clearly directed to kids. There’s no excuse for YouTube’s violations of the law.

For its part, YouTube countered that its main service is intended for users over the age of 13, and that the YouTube Kids app and website - pitched at younger children - don’t serve up targeted ads in the same way as the main site.

Fight back?

The $136 million that accounts for the FTC’s share of the fine is the largest penalty it has aimed at Google to date, although considerably less than the $5 billion it hit Facebook with earlier this year. But that fine was widely derided as petty cash for Facebook, and that reaction applies all the more to the Google penalty - Google parent Alphabet made a $30.7 billion profit last year.

But as scrutiny of the tech giants’ practices increases, the firms themselves are making more and more efforts to be seen to be reforming. Coming on the back of Facebook CEO Mark Zuckerberg’s plea for government to regulate his firm, the company has now called for regulators to put in place a global data sharing standard so that companies - including, presumably, Facebook - can better understand what they can and can’t do.

In a white paper entitled Charting a way forward - data portability and privacy, the firm says it wants to “anchor conversations among stakeholders around the world” - the latest iteration of a PR push to assume a thought leadership role rather than the ‘in the dock’ position that has been the norm for the past couple of years:

The benefits of data portability to people and markets are clear, which is why our CEO, Mark Zuckerberg, recently called for laws that guarantee portability. But to build portability tools people can trust and use effectively, we should develop clear rules about what kinds of data should be portable and who is responsible for protecting that data as it moves to different providers.

It sets out five “hard questions about how portability can be implemented in a privacy-protective way” that it wants regulators to answer:

  1. What is “data portability”?
  2. Which data should be portable?
  3. Whose data should be portable?
  4. How should we protect privacy while enabling portability?
  5. After people’s data is transferred, who is responsible if the data is misused or otherwise improperly protected?

There’s the inevitable pitch at claiming that Facebook’s always been big on privacy and data portability:

One of our core privacy principles at Facebook is that we enable people to control the use of their information on our services. Guided by that principle, we have built tools such as the controls that allow people to select the audience for their profile information and their posts, as well as Ad Preferences, which helps people control how their information is used to show them ads…Facebook has been considering ways to improve people’s ability to transfer their Facebook data to other platforms and services for some time. For example, since 2010, we’ve offered Download Your Information (“DYI”), which is designed to help people access and share their information with other online services.

That’s just as well, given that the EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act make data portability a legal requirement, with the prospect of more legislation - in the US and elsewhere - coming down the tracks with similar demands.

But it’s all so confusing, complains Facebook:

Proponents of portability recognize that, in order to succeed, industry needs to address potential fundamental privacy questions, such as those we pose in this paper. But there has not been detailed guidance with respect to how service providers could or should balance the benefits to personal autonomy, innovation, and competition from portability against the potential risks to privacy and security…some guidance on portability seems at odds with other guidance on companies’ responsibilities for protecting against data misuse by third parties to which companies enable data transfers.

And full marks for sheer nerve when the firm taps into the Cambridge Analytica scandal to support its claims:

Particularly following the Cambridge Analytica matter, we’ve consistently heard calls from various stakeholders to limit the information that apps can receive through Facebook Login and to enhance our oversight of the apps that do receive that information. These calls suggest that some commentators may view the platform-to-app transfers of data as different from transfers made possible by “true” data portability. For example, Facebook’s 2019 Consent Order with the FTC treats portability transfers separately from other transfers. By contrast, other commentators have suggested that Cambridge Analytica happened because of data portability, implying that platforms like ours (as well as iOS, Android, Twitter, and others) were already engaging in data portability when we enabled people to share their data with apps on Platform.

Meanwhile, Google has open-sourced some of its privacy protection tooling on GitHub, in a move it says is intended to let developers build new tools that can draw aggregate insights from user data without revealing personally identifiable information.

According to a blog post from Miguel Guevara of Google’s Privacy and Data Protection Office:

Without strong privacy protections, you risk losing the trust of your citizens, customers, and users. Differentially-private data analysis is a principled approach that enables organisations to learn from the majority of their data while simultaneously ensuring that those results do not allow any individual’s data to be distinguished or re-identified.
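For readers unfamiliar with the technique Guevara describes, the core idea can be sketched in a few lines. The following is an illustrative example only - a hand-rolled Laplace mechanism for a simple counting query, not code from Google’s open-sourced library - showing how calibrated noise lets an organisation learn an aggregate statistic without any individual’s record being distinguishable in the result:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse-CDF."""
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Epsilon-differentially-private count. A counting query has
    sensitivity 1 (adding or removing one person's record changes the
    count by at most 1), so Laplace noise of scale 1/epsilon suffices."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical dataset: how many users are aged 40 or over?
ages = [34, 29, 41, 52, 38, 27, 45]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
print(f"Noisy count: {noisy:.1f}")  # true count is 3; the result varies per run
```

The published answer is close to the truth on average, but any single response could plausibly have come from a dataset with or without a given individual - that indistinguishability is the privacy guarantee, with smaller epsilon meaning more noise and stronger privacy.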

My take

The latest Facebook and Google/YouTube revelations may relate to historical events, but they make clear that the skeletons are still rattling around in the social media platform closets - and that’s before we get focused on what ‘workarounds’ are going to be put in place to appease legislators around the world. Facebook’s data portability paper smacks of another phase in the mea culpa repositioning of the firm by Apologist-in-Chief Nick Clegg. That said, some of the points it raises and the potential policy conflicts it airs are worthy of examination. After all, if a company as large as Facebook and with an army of data scientists at its disposal is confused, what must the rest of us be like? Ahem…