Just in recent weeks, we have:
- Yahoo getting pantsed by hackers so many times it's difficult to keep track of all the breaches. Finally the U.S. Senate steps in to demand explanations.
- The unsettling knowledge that hacking isn't limited to rogue individuals, but also includes nation states. I won't touch the U.S. election allegations here, but we can point back to the Sony Pictures hack by North Korea for another disconcerting example.
- The strange "Cloudbleed" leak which has spewed sensitive data like encryption keys and passwords from millions of sites served by Cloudflare. Some of that compromised info is now indexed by search engines on the open web.
- The Wikileaks disclosures about CIA hacking activities, revealing a sophisticated array of CIA programs designed to breach individuals' phones, televisions, and computers - basically anything connected to the Internet. Companies like Apple were compelled to explain how they've responded to the exposed vulnerabilities (Apple has patched critical iPhone exploits mentioned in the Wikileaks CIA dump).
- Data from Amazon Alexa gets implicated in a murder trial, with Amazon ultimately agreeing to hand over the data - though only because the defendant gave permission to hand the data over, otherwise Amazon was prepared to legally contest police efforts. This case has led to some complicated pieces on exactly what data Amazon and other devices are recording in the home, and how that data could be breached.
- FBI Director James Comey also went on record as saying "There is no such thing as absolute privacy in America," which may have always been true, but has rarely been expressed so blatantly. Fourth Amendment buffs weren't pleased. I've already explained why Comey's war on encryption technology is ineffective and shortsighted, but his comments certainly indicate where we find ourselves now.
Though I believe privacy is an artifact, it's irresponsible to portray citizens as powerless to regain control over their data. Control is a matter of degrees, not an absolute. It's worth the vigilance of plugging as many holes as possible. Helpful guides to secure your data and devices are everywhere, such as this piece from the New York Times, which was issued after the CIA revelations. Also check The Guardian's Eight things you need to do right now to protect yourself online.
The perilous state of privacy has drawn some heavy hitters. Enter Tim Berners-Lee in I invented the web. Here are three things we need to change to save it. Berners-Lee points to three critical problems that endanger the Internet, one of them being data privacy - or, the implications of giving up so much data for "free" services. Under the heading "We've lost control of our personal data," Berners-Lee writes:
Through collaboration with – or coercion of – companies, governments are also increasingly watching our every move online and passing extreme laws that trample on our rights to privacy. In repressive regimes, it’s easy to see the harm that can be caused – bloggers can be arrested or killed, and political opponents can be monitored. But even in countries where we believe governments have citizens’ best interests at heart, watching everyone all the time is simply going too far. It creates a chilling effect on free speech and stops the web from being used as a space to explore important topics, such as sensitive health issues, sexuality or religion.
I have some twenty-something friends who are oblivious to the risks of the data they freely share across services; many are even careless with their phones when they are out. But others are conscious of the privacy tradeoffs and have made a conscious decision to live openly. That's where most of us are headed:
- It's difficult to succeed professionally without posting public info on sites like LinkedIn. Those without social profiles are often viewed suspiciously. Depriving yourself of public or semi-public networking puts you at a disadvantage. We learn in the context of our networks.
- Without smartphones and Internet usage, we lose the edge of convenience technology provides. Some of us become quite dependent on it - e.g. shopping apps that know our preferences, advance ordering and instant pay at Starbucks, inviting Alexa/Google into our homes. We're (often) willing to share personal data with these companies in exchange for the benefits (e.g. Google Maps knows where I live and the places I frequent).
Enterprises and data - earning customer trust
Enterprises are faced with their own privacy dilemmas. How much customer data to share with partners? How to store the data? What privacy assurances are offered? As we transition to "data-driven" business models, those questions are urgent. If botching privacy gets you bad press, then doing data privacy right can be a competitive advantage.
One of my local Verizon stores sent me a group text message promotion. I was on a thread with ten other people/phone numbers I don't know. No apology was issued upon complaint. I got my new phone from the other Verizon store, driving out of my way to do so. Winning trust through a transparent approach to privacy is the way forward. Protecting customer data while sharing it or using it for personalized services isn't easy. The good news: most consumers aren't privacy absolutists. They just want to opt in to what you're doing.
In How to Use & Share Customer Data without Damaging Trust, Steve Shoaff, Chief Product Officer with Ping Identity, lays out these guidelines:
Be transparent. "Set the tone with customers early and be clear about your privacy policies and practices." Make it easy to find this info on your web site and emails, and explain it in plain language.
Go beyond the regulations. "A lot of companies will have privacy policies that adhere to regulations but don’t have strict data policies that satisfy customer needs." Treating regulatory compliance as the finish line is the wrong thinking. Privacy-as-advantage means going beyond that - giving users plenty of chances to opt in or out.
Put users in control. "Collecting customers’ digital identities and affiliated data requires robust and granular data management technologies and practices. It will only work if users can easily view and change their preferences about what types of information they want a company to have and what to keep private."
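As a rough sketch of what "granular" preference management could look like in practice - the consent categories and names here are hypothetical illustrations, not anything Shoaff prescribes:

```python
from dataclasses import dataclass, field

# Hypothetical consent categories - a real product defines its own
CATEGORIES = ("marketing_email", "usage_analytics", "partner_sharing", "location")

@dataclass
class ConsentRecord:
    """Per-user, per-category opt-in flags, defaulting to opted out."""
    user_id: str
    choices: dict = field(
        default_factory=lambda: {c: False for c in CATEGORIES}
    )

    def opt_in(self, category: str) -> None:
        if category not in self.choices:
            raise ValueError(f"unknown category: {category}")
        self.choices[category] = True

    def opt_out(self, category: str) -> None:
        if category not in self.choices:
            raise ValueError(f"unknown category: {category}")
        self.choices[category] = False

    def allows(self, category: str) -> bool:
        # Anything unknown or untracked is treated as not consented
        return self.choices.get(category, False)

record = ConsentRecord(user_id="u-123")
record.opt_in("usage_analytics")
print(record.allows("usage_analytics"))   # True
print(record.allows("partner_sharing"))   # False
```

The design choice that matters is the default: every category starts opted out, and the `allows` check fails closed on anything it doesn't recognize.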
Be careful with third parties. "Companies are increasingly sharing data with third parties including advertisers, service providers or partners who provide adjunct services and products." That's problematic - some of the worst data breaches happen via third party exposure. Limit data sharing with partners to the bare minimum consumers have green-lighted. Don't go further than a consumer would expect. Example: going from scanning a badge to sharing that scanned data with a partner is a step too far. A badge scan is understood to result in an email from you, but not from your partners.
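The badge-scan principle above can be enforced mechanically: strip any record down to the fields the consumer has green-lighted before it ever leaves your systems. A minimal sketch, with hypothetical field names:

```python
# Hypothetical: the fields a badge scan is understood to authorize
CONSENTED_FIELDS = {"name", "email"}

def partner_payload(scan: dict) -> dict:
    """Reduce a badge-scan record to the consented minimum
    before handing anything to a third party."""
    return {k: v for k, v in scan.items() if k in CONSENTED_FIELDS}

scan = {"name": "Ada Lovelace", "email": "ada@example.com",
        "phone": "+1-555-0100", "employer": "Analytical Engines Ltd"}
print(partner_payload(scan))
# {'name': 'Ada Lovelace', 'email': 'ada@example.com'}
```

An allowlist, not a blocklist: new fields added to the record later are excluded from partner sharing by default rather than leaking automatically.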
Use security best practices. "Privacy and security go hand and hand; employing the strongest possible security methods is crucial. Don’t just encrypt at the endpoints, encrypt data end-to-end, where it’s stored, while it’s in transit and when it reaches its end-use point. LinkedIn learned this the hard way last year after attackers were able to steal and fairly easily decrypt data from 100 million members."
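The LinkedIn breach turned on unsalted, fast SHA-1 password hashes, which precomputed lookup tables crack at scale. A minimal sketch of the difference using only Python's standard library - the iteration count here is illustrative, so follow current guidance for production systems:

```python
import hashlib
import hmac
import os

password = b"correct horse battery staple"

# What made the LinkedIn data easy to decrypt: one fast, unsalted
# hash per password. Identical passwords yield identical hashes.
weak = hashlib.sha1(password).hexdigest()

# Better: a random per-user salt plus a deliberately slow key
# derivation function. Same password + different salt -> different hash.
salt = os.urandom(16)
strong = hashlib.pbkdf2_hmac("sha256", password, salt, iterations=600_000)

def verify(candidate: bytes, salt: bytes, stored: bytes) -> bool:
    attempt = hashlib.pbkdf2_hmac("sha256", candidate, salt,
                                  iterations=600_000)
    # Constant-time comparison avoids leaking info via timing
    return hmac.compare_digest(attempt, stored)

print(verify(password, salt, strong))        # True
print(verify(b"wrong guess", salt, strong))  # False
```

The salt defeats precomputed tables; the iteration count makes each guess expensive, which is the property the stolen LinkedIn hashes lacked.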
I'll add: don't take the stance that your data is unbreachable. Instead, tell customers/partners exactly where it is stored and link to those privacy policies as needed. People like to know where their data lives and how it is used; they can make informed choices from there. Obviously for international companies there are loads of nuances, including the implications of U.S. policy changes. We've covered this in detail on diginomica. For the latest, see Stuart's Can Privacy Shield survive another Executive Order from Trump?
Final thoughts - IoT thickens the plot
There is also the pursuit of IoT and its impact on data practices, as many new and potentially hackable devices go online. Check my piece Internet of Things security - six issues for enterprises to reckon with for more on that. One big takeaway is: design for security. Have security experts on hand early in the design process to identify the layers of end-to-end security needed for each app.
Data privacy as a competitive advantage requires experience design skills. Getting folks to click "accept" on terms of service is the easy part. Designing your apps in such a way that privacy options are easily understood and adjusted is the hard part. Security and user experience don't have to be at odds, but they will be if the log-in experience isn't well thought out. New biometric scans can help there.
Companies should consider taking public stances on data privacy - and influencing legislation/policy. Many tech companies have at least tiptoed into this fray. Yes, there is self-interest at play. That doesn't mean it's not a good stance. It's hard to earn customer trust while sitting on the sidelines.