Sen. Mark Warner’s 15 'common sense' rules for Social Media platforms

By Jerry Bowles, August 1, 2018
An influential American senator believes a national conversation about privacy, data and social media is long overdue. Here are his proposals on how to balance technology regulation and innovation.


The average American senator is 61 years old. That doesn't necessarily mean they know diddly squat about the internet, social media, hacking, disinformation and the like, but past experience suggests that most of them don't know much. One conspicuous exception is Senator Mark Warner of Virginia, who founded his first cell phone company many years ago while still living out of his car and went on to launch Nextel. That venture gave him a front-row seat to how the mobile phone has changed the world. Not coincidentally, he is also Vice Chair of the Senate Intelligence Committee, which is looking into Russian interference in the 2016 Presidential election.

In a bylined article in USA Today on August 1, Warner points out that some of the largest and most powerful companies in the world--Google, Facebook, and Twitter, for example--build and rely on technology that didn't exist a decade or two ago, dramatically transforming society along the way. Warner wrote:

As someone who was in the tech business longer than I've been in the Senate, I'm a big believer in the power of technology to improve people's lives. But anyone who's ever told their kids to put their cell phones away at dinner knows technological advances sometimes come with unintended consequences—and social media is no exception… I believe a national conversation about these issues is long overdue. The companies, the federal government, and individuals all share in the responsibility to make sure these great American technologies continue to work for the good of our country and its citizens.

That conversation, he wrote, has to start with companies taking responsibility for their platforms and their potential for abuse, along with some common-sense rules of the road for social media. Earlier this week, Sen. Warner released a lengthy white paper spelling out some specific proposals.

I've attempted to digest and number those proposals here, but the document itself is dense and attempts to cover all possibilities, so I urge you to read the entire entry for each proposal that interests you. As Sen. Warner wrote in Potential Policy Proposals for Regulation of Social Media and Technology Firms:

The purpose of this document is to explore a suite of options Congress may consider to achieve these objectives. In many cases there may be flaws in each proposal that may undercut the goal the proposal is trying to achieve, or pose a political problem that simply can't be overcome at this time. This list does not represent every idea, and it certainly doesn't purport to answer all of the complex and challenging questions that are out there. The hope is that the ideas enclosed here stir the pot and spark a wider discussion--among policymakers, stakeholders, and civil society groups--on the appropriate trajectory of technology policy in the coming years.

Here is my digest of the document, taking the headlines used by Sen. Warner as waymarks and using verbatim copy where it makes the most sense:

  1. Clearly and conspicuously label bots – To protect consumers, and to inhibit the use of bots for amplification of both disinformation and misinformation, platforms should be under an obligation to label bots--both those they provide (like Google's Duplex) and those used on the platforms they maintain (e.g. bot-enabled accounts on Twitter). California lawmakers have proposed a bill--colloquially referred to as a 'Blade Runner law,' after the 1982 movie--to do just this.
  2. Determine origin of posts and/or accounts – Anonymity and pseudonymity on social media platforms have enabled bad actors to assume false identities (and associated locations), allowing them to participate in and influence political debate on social media platforms. Forcing the platform companies to determine and/or authenticate the origin of accounts or posts would go far in limiting the influence of bad actors outside the United States.
  3. Identify inauthentic accounts – Inauthentic accounts not only pose threats to our democratic process (with inauthentic accounts disseminating disinformation or harassing other users), but also undermine the integrity of digital markets (such as digital advertising). A law could be crafted imposing an affirmative, ongoing duty on platforms to identify and curtail inauthentic accounts, with an SEC reporting duty to disclose to the public (and advertisers) the number of identified inauthentic accounts and the percentage of the platform's user base that they represented.
  4. Make platforms liable for state-law torts (defamation, false light, public disclosure of private facts) for failure to take down deep fake or other manipulated audio/video content. Currently the onus is on victims to exhaustively search for, and report, this content to platforms who frequently take months to respond and who are under no obligation thereafter to proactively prevent the same content from being re-uploaded in the future.
  5. Pass a Public Interest Data Access Bill – Regulators, users, and relevant NGOs lack the ability to identify potential problems (public health/addiction effects, anticompetitive behavior, radicalization) and misuses (scams, targeted disinformation, user-propagated misinformation, harassment) on the platforms because access to data is zealously guarded by the platforms. We could propose legislation that guarantees that platforms above a certain size provide independent, public interest researchers with access to anonymized activity data, at scale, via a secure API. This would ensure that problems on, and misuse of, the platforms were being evaluated by researchers and academics, helping generate data and analysis that could help inform actions by regulators or Congress.
  6. Require Interagency Task Force for Countering Asymmetric Threats to Democratic Institutions - The intelligence and national security communities are not as well-positioned to detect, track, attribute, or counter malicious asymmetric threats to our political system as they should be. From information operations to cyber-attacks to illicit finance and money laundering, our democratic institutions face a wide array of new threats that don’t fit easily into our current national security authorities and responsibilities.  Standing up a congressionally-required task force would help bring about a whole-of-government approach to counter asymmetric attacks against our election infrastructure and would reduce gaps that currently exist in tracking and addressing the threat.
  7. Enact Disclosure Requirements for Online Political Advertisements – Because outdated election laws have failed to keep up with evolving technology, online political ads have had very little accountability or transparency, as compared to ads sold on TV, radio, and satellite. Improving disclosure requirements for online political advertisements and requiring online platforms to make all reasonable efforts to ensure that foreign individuals and entities are not purchasing political ads seem like a good first step in bringing more transparency online.
  9. Create a Public Initiative for Media Literacy – Addressing the challenge of misinformation and disinformation in the long term will ultimately require an informed and discerning population of citizens who are both alert to the threat and armed with the critical thinking skills necessary to protect against malicious influence. A public initiative--propelled by federal funding but led in large part by state and local education institutions--focused on building media literacy from an early age would help build long-term resilience to foreign manipulation of our democracy.
  9. Increase Deterrence Against Foreign Manipulation – We have to admit that our strategies and our resources have not shifted to aggressively address these new threats in cyberspace and on social media that target our democratic institutions. Russia spends about $70 billion a year on their military. We spend ten times that. But we’re spending it mostly on physical weapons designed to win wars that take place in the air, on land, and on sea. While we need to have these conventional capabilities, we must also expand our capabilities so that we can win on the expanded battlefields of the 21st century. Until we do that, Russia is going to continue getting a lot more bang for its buck.
  10. Establish Information fiduciaries – Yale law professor Jack Balkin has formulated a concept of “information fiduciaries”--service providers who, because of the nature of their relationship with users, assume special duties to respect and protect the information they obtain in the course of the relationships. Balkin has proposed that certain types of online service providers --including search engines, social networks, ISPs, and cloud computing providers--be deemed information fiduciaries because of the extent of user dependence on them, as well as the extent to which they are entrusted with sensitive information.
  11. Enact Comprehensive (GDPR-like) data protection legislation – The US could adopt rules mirroring GDPR, with key features like data portability, the right to be forgotten, 72-hour data breach notification, first-party consent, and other major data protections. Business processes that handle personal data would be built with data protection by design and by default, meaning personal data must be stored using pseudonymisation or full anonymization. Under a regime similar to GDPR, no personal data could be processed unless it is done under a lawful basis specified by the regulation, or if the data processor has received an unambiguous and individualized consent from the data subject.
  12. Enact Data Transparency Bill – The opacity of the platforms' collection and use of personal data serves as a major obstacle to agencies like the FTC addressing competitive (or consumer) harms. This lack of transparency is also an impediment to consumers 'voting with their wallets' and moving to competing services that either protect their privacy better or better compensate them for uses of their data.
  13. Pass Data Portability Bill – As platforms grow in size and scope, network effects and lock-in effects increase; consumers face diminished incentives to contract with new providers, particularly if they have to once again provide a full set of data to access desired functions. The goal of data portability is to reduce consumer switching costs between digital services (whose efficiency and customization depends on user data). A data portability requirement would be predicated on a legal recognition that data supplied by (or generated from) users (or user activity) is the users'--not the service provider's. In other words, users would be endowed with property rights to their data. This approach is already taken in Europe (under GDPR, service providers must provide data, free of charge, in a structured, commonly-used, machine-readable format).
  14. Open federal datasets to university researchers and qualified small businesses/startups - Structured data is increasingly the single most important economic input in information markets, allowing for more targeted and relevant advertisements, facilitating refinement of services to make them more engaging and efficient, and providing the basis for the machine learning algorithms on which all industries will increasingly rely. Large platforms have successfully built lucrative datasets by mining consumer data over significant timescales, and separately through buying smaller companies that have unique datasets. For startups and researchers, however, access to large datasets increasingly represents the largest barrier to innovation. Congress could ensure that this data be provided only to university researchers and qualified small businesses, with contractual prohibitions on sharing this data with companies above a certain size.
  15. Define Essential Facilities – Certain technologies serve as critical, enabling inputs to wider technology ecosystems, such that control over them can be leveraged by a dominant provider to extract unfair terms from, or otherwise disadvantage, third parties. Legislation could define thresholds--for instance, user base size, market share, or level of dependence of wider ecosystems--beyond which certain core functions/platforms/apps would constitute 'essential facilities', requiring a platform to provide third-party access on fair, reasonable and non-discriminatory (FRAND) terms and preventing platforms from engaging in self-dealing or preferential conduct. In other words, the law would not mandate that a dominant provider offer the service for free; rather, it would be required to offer it on reasonable and nondiscriminatory terms (including, potentially, requiring that the platform not give itself better terms than it gives third parties).
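To make the data portability idea in proposal 13 concrete, here is a minimal sketch of what a GDPR-style export might look like in practice: a service serializes everything a user supplied into a structured, machine-readable format (JSON here) that a competing service could read back in. All function and field names are illustrative assumptions, not anything specified in the white paper.

```python
import json

def export_user_data(user_record: dict) -> str:
    """Serialize a user's data into a portable JSON document.

    The field names ("profile", "posts", "contacts") are hypothetical;
    the point is that the output is structured and machine-readable,
    as GDPR's portability rule requires.
    """
    portable = {
        "format_version": "1.0",
        "profile": user_record.get("profile", {}),
        "posts": user_record.get("posts", []),
        "contacts": user_record.get("contacts", []),
    }
    return json.dumps(portable, indent=2, sort_keys=True)

def import_user_data(document: str) -> dict:
    """A competing service parses the same portable document back in."""
    return json.loads(document)
```

Because the format is plain JSON rather than a proprietary dump, any new provider can round-trip the data, which is precisely what lowers the switching costs the proposal targets.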

Last word

From Senator Warner

The size and reach of these platforms demand that we ensure proper oversight, transparency and effective management of technologies that in large measure undergird our social lives, our economy, and our politics. Numerous opportunities exist to work with these companies, other stakeholders, and policymakers to make sure that we are adopting appropriate safeguards to ensure that this ecosystem no longer exists as the ‘Wild West’--unmanaged and not accountable to users or broader society—and instead operates to the broader advantage of society, competition, and broad-based innovation.

No argument from me.