With many enterprises still playing catchup on the requirements of the EU’s General Data Protection Regulation (GDPR) and gearing up for the upcoming California Consumer Privacy Act (CCPA), few topics are more top of mind for tech companies than data governance.
There aren’t many executives with more experience in the area than Barbara Lawler. Over the past two decades she has served as chief privacy officer for Intuit, Inc. and the Hewlett-Packard Company, and has led her own firm, Digital Stewardship Strategies, where she guided organizations in setting strategies for ethical approaches to privacy and digital stewardship, with a focus on machine learning and artificial intelligence applications.
In September 2018 she joined Looker, a fast-growing data analytics platform startup that was acquired by Google for $2.6 billion in June 2019, as chief privacy and data ethics officer. We spoke to Lawler about a range of data governance topics, including how the field has evolved over her career:
The tremendous acceleration of new connected technologies and the growth of social networks over the last two decades has led to a new consensus on data privacy that goes beyond check-the-box compliance and simple enforcement of the rules. We live in an ever-expanding data-first interconnected digital world. Just because something is technically feasible and legally compliant does not make it ethically and morally right, nor even effective in protecting the autonomy and privacy of people. We’re seeing a movement toward Privacy by Design, which means embedding data privacy requirements at every stage of product design and development. The idea is to build privacy in rather than trying to bolt it on later.
We asked Lawler what advice she would give companies that were not affected by GDPR but now find themselves scrambling to meet the requirements of the California Consumer Privacy Act (CCPA), which goes into effect on January 1, 2020. She told us:
The first thing I would say is that while CCPA was inspired by some of the components and principles of GDPR, it is definitely not GDPR or a de facto GDPR for the United States. For companies that have dealt with GDPR though, it does lay the groundwork for certain underlying governance principles that also apply to CCPA: What data do you have? Why do you have it? How are you using it? Where is it? Where is it going? How long are you keeping it and/or when are you deleting it?
It sounds relatively simple when you describe it that way, but it can be tricky for large companies, and even smaller companies, who don’t have a good grasp of what I call their data supply chain. That phrase invokes the idea of a chain of custody—from the point the data comes in, through all the different places it’s going to go, including any third-party providers, and then where it goes from there. Each step of the way needs to have appropriate levels of monitoring, control, and encryption if necessary.
She acknowledges there are some areas where CCPA may pose difficult challenges for companies.
One of the fundamental ideas is that an individual has the right to know what personal information the company has about them and where else the data has gone from there. It focuses heavily on whether the data is being sold. There are many companies that are incredibly responsive and ethical and accountable who say “We don’t sell our customer data. We don’t even sell it in aggregated form.” But, if you look at the definition of selling under the act it basically encompasses any kind of business relationship and transaction you can think of. So that should give companies pause. They have to parse through the relationship with service providers or third parties to make sure they know what the ultimate destination of the data is going to be and whether that process will be defined as selling.
Since Lawler’s job title covers both privacy and ethics, I asked her whether that was one job or two.
I think they’re overlapping circles in a Venn diagram. Information privacy and protection are fundamental throughout. Policy and governance apply to both. There is a need to have a tight grip on security, within the company and with our relationships with customers.
Data ethics is not about what we are required to do but what we should be doing. What is the right thing to do? And even to challenge the way Looker technology is being used or might be misused. A lot of companies shy away from that question but Looker doesn’t.
What kind of questions come up and how do you deal with them?
For example, how do we feel about certain political campaigns using our technology? How do we feel about certain organizations that may have had labor issues in the past? In order to understand what they plan to do with our tools and whether it is consistent with our values, we put together a set of about 10 evaluation questions based on what we can see in the marketplace where there have been concerns about certain industries or products.
That doesn’t mean we say no, we won’t do business with that organization, but that we have a thoughtful conversation with an internal review board that includes executives who represent diversity and inclusion as well as people who have specific experience with the issue at hand.
We make a recommendation, but the final decision is made by our CEO.
Finally, I asked where she thought the whole new environment of aggressive data regulation was going.
I hope we find a path that blends regulations, individual rights, common sense, and data ethics together for a more balanced and less heavy-handed approach. What I’m afraid of is that we will take a path that leads to more costly, complex, and restrictive procedural compliance. One thing is for certain: we’re still going to be talking about this five years from now.