Is distrust slowing innovation? Why personal digital sovereignty could help

George Lawton, March 8, 2024
With trust in privacy controls and AI at an all-time low, it's worth asking how much this might be slowing innovation.


The recent US Executive Order on the adversarial use of Americans' sensitive data is drawing increased attention to the current state of data aggregation practices. Stronger privacy controls will add complexity to the $231 billion US data-sharing industry. 

However, distrust in the status quo may also discourage the wider sharing of information, slowing economic growth and impeding progress toward sustainability goals. This distrust tax on innovation occurs because individuals and businesses have little visibility or control over how their data may be used against their interests after sharing. 

Some evidence suggests that efforts to #acceleratetrust can have an outsized impact on the economy's growth. The £40 million invested to set up the trust framework in Open Banking is estimated to have contributed over £6 billion to the UK economy. A more ambitious effort to develop a framework for trusted data intermediaries as part of a pending UK data bill is predicted to contribute £30.5 billion annually to the UK economy. 

Gavin Starks, CEO of IcebreakerOne, who co-chaired the development of the Open Banking Standard, says:

In my view, that's still just the starting point. We're all on a journey to sort out the social contract that's been missing between citizens & consumers, the businesses that use our data, and public interest (e.g. health) as managed by government. Regulation has a massive role to play, and this is one area where the UK beats the USA (more collaboration) and the EU (we're faster).

Increasing privacy and data management controls will certainly add costs for enterprises that stand to gain from processing our data. It may also slow the large market for sharing data. But Davi Ottenheimer, VP of Trust and Digital Ethics at Inrupt, argues that focusing on the value of the data products market alone misses out on the bigger picture: 

While it may appear that any sizable business is contributing to revenue and job creation, it's essential to consider the broader context. An excessively narrow focus on these metrics might obscure the fact that the potential for even greater growth in revenue and jobs is being hindered or compromised. The absence of adequate privacy safeguards, data security measures, and consumer protections within the data brokerage industry can have significant adverse effects on the overall economy.

Incidents of data breaches and the improper use of personal information are well-documented to result in severe financial and reputational damage. For instance, Ottenheimer was once brought to a country to investigate how to safeguard citizen privacy. This arose from concerns that data leaks, exploited by unscrupulous data brokers, were impeding fair and lawful commerce. The country's GDP was believed to be stagnating or declining due to its under-regulated data brokerage market.

The trust gap

The most recent Edelman trust barometer reported widespread distrust in AI that is trained on and applied to our data, leading to a new paradox at the heart of society. CEO Richard Edelman writes:

Rapid innovation offers the promise of a new era of prosperity, but instead risks exacerbating trust issues, leading to further societal instability and political polarization.

Their annual survey found that 43% of respondents globally will reject AI, avoiding products and services incorporating it if they believe innovation is being managed poorly. There are also wide variances across income levels in the US, with 43% of high-income respondents trusting AI compared to 27% of low-income respondents. 

Perhaps surprisingly, China led the 2024 Trust Index, given its portrayal in Western media as an authoritarian state kept afloat by extensive censorship and surveillance. In China, 79% reported trusting NGOs, businesses, government, and media, down from 83% last year. The US came in at 46%, and the UK at 39%. If trust indeed fuels innovation, this difference has disturbing implications for Western competitiveness. 

Extending data sovereignty to individuals

One way to beat the distrust tax on innovation is to shift the conversation from data privacy for securing sensitive data to individual data sovereignty. Data sovereignty is usually discussed at a national level, restricting the regions in which data can be stored and processed. Individual data sovereignty takes this a step further for people and businesses, with cryptographic safeguards for controlling the fate of your data after you hand it off to a vendor, service, or business partner. 
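One hedged way to picture such a safeguard is a signed, time-limited grant that the data owner issues and a processor must verify before using the data. This is only an illustrative sketch, not any vendor's actual mechanism; the secret, recipient, and purpose names are hypothetical:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"owner-held-secret"  # hypothetical key that only the data owner controls

def issue_grant(recipient: str, purpose: str, ttl_seconds: int) -> str:
    """Owner signs a time-limited grant naming who may use the data, and why."""
    claims = {"to": recipient, "purpose": purpose, "exp": time.time() + ttl_seconds}
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode() + "." + sig

def check_grant(token: str, purpose: str) -> bool:
    """Processor verifies the signature, expiry, and declared purpose before use."""
    payload_b64, sig = token.rsplit(".", 1)
    payload = base64.b64decode(payload_b64)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sig):
        return False  # grant was tampered with or forged
    claims = json.loads(payload)
    return claims["purpose"] == purpose and time.time() < claims["exp"]

token = issue_grant("acme-analytics", "model-training", ttl_seconds=3600)
print(check_grant(token, "model-training"))  # True: matching purpose, not expired
print(check_grant(token, "ad-targeting"))    # False: purpose mismatch
```

The point of the sketch is that the owner's intent travels with the data: a processor that cannot present a valid, unexpired grant for its stated purpose has no basis to use it.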

Ameesh Divatia, founder and CEO of Baffle, says:

For companies and governments that take personal data sovereignty seriously, there is tremendous upside because it will enhance their business reputations. Much like sustainable development, responsible handling of subject data is not only ‘good business’ but ‘good for business’ as well.

Tim Berners-Lee, the inventor of the World Wide Web and co-founder of Inrupt, has been a vocal advocate for data sovereignty at the individual level. The Solid project, initiated by Berners-Lee in 2016, is a key part of this vision, and no blockchain is required. The World Wide Web Consortium (W3C) Solid effort promotes individual control over personal data, providing tools and standards for decentralized web applications that let users keep all their data in personal stores (pods) and grant individual applications and services permission to access it.
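Solid itself expresses this through web standards such as linked-data resources and access-control lists, but the core idea can be modeled in a few lines. The following is a toy illustration, not Solid's actual API; the class, paths, and app names are all hypothetical:

```python
class Pod:
    """Toy model of a Solid-style personal data store: the data stays with
    the owner, and each app gets only the access it was explicitly granted."""

    def __init__(self):
        self._resources = {}  # path -> data
        self._acl = {}        # (app, path) -> set of access modes

    def put(self, path, data):
        self._resources[path] = data

    def grant(self, app, path, modes):
        self._acl[(app, path)] = set(modes)

    def revoke(self, app, path):
        self._acl.pop((app, path), None)

    def read(self, app, path):
        if "read" not in self._acl.get((app, path), set()):
            raise PermissionError(f"{app} may not read {path}")
        return self._resources[path]

pod = Pod()
pod.put("/health/steps", [8200, 10450, 7300])
pod.grant("fitness-app", "/health/steps", {"read"})
print(pod.read("fitness-app", "/health/steps"))  # the app sees only what it was granted
pod.revoke("fitness-app", "/health/steps")       # the owner can withdraw consent at any time
```

The inversion is the interesting part: instead of each app holding its own copy of your data, apps come to the pod and ask, and the owner's grants, not the app's database, decide what they see.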

The government of Flanders in Belgium is an early adopter, making Solid pods available to all its citizens. The approach combines data governance and management, empowers citizens, and enhances trust in digital interactions. Ottenheimer explains: 

This approach empowers individuals to manage their digital identities and grant consent to data access in a more secure and transparent manner, aligning with the principles of data sovereignty. Berners-Lee's work underscores the importance of putting users at the center of the web and ensuring that they have autonomy over their personal information.

Prioritizing personal data sovereignty could also have vast potential for economic impact and progress toward sustainable development goals. Ottenheimer points to regulatory shifts like California's SB 1386 in 2003 and PCI DSS in 2006, which demonstrated how emphasizing privacy can lead to significant trust-based market expansions, such as the surge in encryption technology solutions by 2012. 

Different approaches emerging

Sophie Stalla-Bourdillon, Senior Privacy Counsel and Legal Engineer at Immuta, points to other progress on data institutions and data spaces. However, different approaches to data sovereignty are emerging, and the priority often seems to be facilitating the sharing of data, both internally and externally, for the benefit of data consumers, even if the protection of data subjects remains a key component of the equation. 

For example, the International Data Spaces Association states that data sovereignty “means that these data holders can safeguard user data like never before and ensure that it is used only in accordance with strictly defined rules.” But Stalla-Bourdillon cautions that it is still unclear whether the infrastructure requirements of specific efforts, such as the European Health Data Space, will enable the exercise of the right to consent or object at the project level. 

Stalla-Bourdillon hopes that actual implementations can take advantage of innovations in federated data ecosystems and believes the GDPR should remain the foundational layer: 

Federated data governance is emerging and we know how to build effective traffic-light systems for complex data ecosystems which can address various data protection goals, starting with security. Technology, too, is changing in a way which helps organizations to take personal data sovereignty seriously: data security platforms with fine-grained access controls allow businesses to enforce governance at both a centralized and decentralized level to empower the domain level to embrace data privacy whilst also reaping the benefits of the safety net of a wider policy framework. The primary remaining challenge, then, lies in enforcing design requirements and eliminating infrastructure rents, which demands both resources and unwavering commitment.
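The "traffic-light systems" and fine-grained access controls Stalla-Bourdillon describes can be sketched as a simple column-level policy: green data flows freely, amber is masked for ordinary consumers, and red is withheld without elevated clearance. This is a minimal illustration under assumed column names and clearance levels, not Immuta's actual product behavior:

```python
# Hypothetical traffic-light policy: green flows freely, amber is masked,
# red is withheld unless the data consumer has elevated clearance.
POLICY = {"name": "amber", "email": "red", "country": "green", "age_band": "green"}

def apply_policy(row: dict, clearance: str = "standard") -> dict:
    """Return the view of a record that a given clearance level is allowed to see."""
    out = {}
    for col, value in row.items():
        light = POLICY.get(col, "red")  # unknown columns default to red (fail closed)
        if light == "green":
            out[col] = value
        elif light == "amber":
            out[col] = "***" if clearance == "standard" else value
        elif clearance == "elevated":   # red: elevated clearance only
            out[col] = value
    return out

row = {"name": "Ada", "email": "ada@example.com", "country": "UK", "age_band": "30-39"}
print(apply_policy(row))
# {'name': '***', 'country': 'UK', 'age_band': '30-39'}
```

Defaulting unknown columns to red is the "safety net" design choice: the centralized policy stays conservative while domain teams decide which columns to relax to amber or green.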

She believes the right approach could increase individual agency and autonomy without necessarily hindering innovation, provided we agree that the ultimate goal should be promoting responsible and human-centric innovation. She says:

There is no reason why IT systems and underlying data flows cannot be made more transparent early on, enabling iterative assessment of the desirability of outputs, such as balancing efficiency wins with systemic risks and the impact upon the underlying infrastructure. 

A costly transition

Other experts are concerned that shifting the conversation to personal data sovereignty could also incur significant costs on business models and the technical infrastructure. Irina Tsukerman, president of Scarab Rising, a boutique security and privacy consultancy, argues:

Requiring personal data sovereignty as a priority would mean that most companies would need to completely rethink and restructure their entire business model, which is very unlikely. Even limited restrictions by EU towards US companies has led to profound economic losses, while on the other hand, risking the possibility of those countries losing access to social media networks and other services in those countries altogether. 

Tsukerman believes the only way a new business model could work is if companies charged their customers for services, as X is attempting to do at various tiers. Alternatively, they would have to find some other way to compensate for the loss of advertising income, which accounts for most of the returns from data sharing.

But the transition may not be too costly for every large enterprise or platform. Anthony Cammarano, VP of Security, Privacy & Strategy at Protegrity, argues that large platforms like Apple are starting to take the lead with their walled-garden approach to protecting online information: 

A tangible example is letting people choose to share their actual email or a token that represents their email with an app provider when they sign up. This starts to move the nexus of power and control to the consumers that allows us to make choices. Facebook, on the other hand, is tracking you, capturing thousands of data points about you, and selling that information even if you do not have a Facebook account. This seems wrong on many dimensions.
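The token-instead-of-email pattern Cammarano describes can be sketched as a relay that derives a stable per-app alias, so the app can reach the user without ever learning the real address. A minimal sketch, assuming a hypothetical relay domain and relay-held secret (not Apple's actual Hide My Email implementation):

```python
import hashlib
import hmac

RELAY_DOMAIN = "relay.example"    # hypothetical forwarding domain
USER_SECRET = b"per-user secret"  # held by the relay service, never by the app

def email_alias(real_email: str, app: str) -> str:
    """Derive a stable, per-app alias; the relay forwards mail sent to the
    alias and can stop forwarding whenever the user revokes it."""
    digest = hmac.new(USER_SECRET, f"{real_email}:{app}".encode(),
                      hashlib.sha256).hexdigest()[:12]
    return f"{digest}@{RELAY_DOMAIN}"

alias = email_alias("ada@example.com", "shopping-app")
assert alias != "ada@example.com"                               # the app only sees a token
assert alias == email_alias("ada@example.com", "shopping-app")  # stable for this app
assert alias != email_alias("ada@example.com", "news-app")      # unlinkable across apps
```

Because each app gets a different alias, apps cannot join their records on the email column, which is exactly the shift of control to the consumer that Cammarano describes.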

My take

Privacy gaps will continue to be a hot topic globally as businesses, hackers, and governments find creative new ways to misuse our data. As a society, we are still sorting out how regulations or new business processes add meaningful value to our lives. However, these are often not well received by enterprises or are poorly implemented. For example, the most significant immediate impact of GDPR was that sites were suddenly spamming me with new privacy alerts, asking me to quickly accept their tracking cookies or undergo a complex process to turn them off. 

Individual data sovereignty is even more compelling, but it will certainly be more complex to implement and manage. Poor implementation could result in an even worse user experience, with hundreds of strangers asking whether they can use my data in this way or that, and perhaps making it complicated to say no. 

But in narrow use cases with simple controls, individual data sovereignty could make a lot of sense to #acceleratetrust by growing the pool of useful data for training better AI for all of us. For example, I can imagine people contributing their medical data to help people like them, with the assurance that it could not be used to increase insurance rates. Businesses might be more willing to share data about their equipment to improve efficiency for everyone if they knew it would not empower competitors. 

Over 5 million people have contributed their unused computer time to the SETI@home project to search for signs of extraterrestrial intelligence. The project's success led to BOINC (Berkeley Open Infrastructure for Network Computing), which lets people contribute computer time to over 30 scientific projects studying diseases and climate change and improving medicine. Contributing data could be even easier and more impactful with the right controls and safeguards.
