Apple, FBI and encryption - four issues enterprises should care about

Jon Reed - February 23, 2016
Updates on the Apple versus FBI encryption fracas are proliferating at a ridiculous rate. I've boiled down four enterprisey takeaways, along with a collection of the best links to date.

Even though I expected encryption to take center stage in 2016, I did not anticipate the perfect storm that has resulted. The FBI's pursuit of phone data from one of the two deceased terrorists involved in the San Bernardino attacks has ratcheted up the stakes on all sides. A PR skirmish has ensued, with everyone from presidential hopefuls to Bill Gates weighing in.

Apple now vows to appeal a court order requiring the company to comply with a Justice Department request to bypass security functions on the phone. Poring through the chaos of articles has raised more questions than answers. To save you the rummaging, I've organized key points and enterprise takeaways.

1. The iCloud data debate underscores critical points on data security and phone-based data scraping. The FBI, San Bernardino County, and Apple are engaged in a PR battle over who screwed up the ability to access the terrorist's data on iCloud. What all parties agree upon is that the iCloud password was reset in the hours after the iPhone was obtained. When it comes to pulling data, resetting this password was a mistake of either large or epic proportions.

The FBI ultimately published a statement acknowledging responsibility for resetting the iCloud password. (If the iCloud password had not been reset, it's possible a "forced" backup to iCloud could have been initiated from a previously visited wifi location - in theory, letting Apple off the hook to crack into the phone itself.) The FBI maintains the phone itself might contain sensitive information that is not backed up to iCloud. Examples could include chats on Telegram, an encrypted messaging app. This piece from Ars Technica confirms that theory's viability.

Enterprise takeaways:

  • Cloud data security remains an issue requiring vigilance and customer transparency. Should we be comfortable that the phone's data is encrypted, but the iCloud data is, in theory, more easily accessed? Granted, phones are (usually) more insecure than cloud data centers. In this case, the cloud data is perceived by all parties as the weakest link.
  • When planning for phone data security, messaging systems outside the scope of corporate control or data wipes should be considered. Evaluate the risk of employees messaging confidential information outside the purview of monitored applications. Educating employees on which applications and password protections should be used to ensure data security is crucial.

2. The best information comes from security experts. The tech press is loaded with mediocre encryption stories. Some of the best coverage comes from off the beaten track. Example: iPhone forensics expert Jonathan Zdziarski, who penned:

On FBI’s Interference with iCloud Backups - more on the iCloud aspect of this debate, raising the possibility the FBI may plan on requesting a second action from Apple.
Apple, FBI, and the Burden of Forensic Methodology - deeper context on the forensics of data recovery, and why the FBI is not asking for a phone data dump, but an instrument that can be re-used.
On Ribbons and Ribbon Cutters - which argues that the FBI is looking for a "ribbon cutter," a re-usable tool to break into iPhones. Key quote:

If FBI were simply asking for Apple to cut the ribbon, all future AWA orders would have to go through the same legal scrutiny in the courts for justification. Getting the ribbon cutter invented for a terrorism case opens the door for such a tool to then be justified by the DA for weaker cases – such as narcotics, computer crimes, or even simply investigations where the government can’t even prove to the courts that a crime was ever committed. Once it’s a tool, just like a Stingray box or a breathalyzer, the court’s leniency in permitting its use increases dramatically.

Enterprise takeaways:

  • Expert content still requires digging to find - but it must be done if you want the real deal. Don't trust social streams to always surface the most informed security views.
  • There is no substitute for device-specific security. Understanding the security differences between different versions of iOS and Apple hardware is among the crucial details.

3. This is not about breaking Apple's encryption, but about defeating the password protection failsafe. But that failsafe is a backdoor. Tricky stuff - the FBI doesn't want Apple to decrypt the phone, simply to provide a tool that bypasses the password protection mechanism: the software protections that prevent the FBI's "brute force" software from attempting every passcode combination, including escalating delays between failed attempts and the optional setting that erases the device after ten failures. But security experts say that such a bypass is, by definition, a backdoor. And once a backdoor is established, it can be exploited by bad actors.
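As a rough sketch of why the failsafe matters more than the encryption itself: a four-digit passcode space is tiny, and once the software protections are stripped away, the only remaining floor is the roughly 80 ms per passcode check that Apple has described as hardware-enforced. The numbers below are illustrative assumptions for back-of-envelope arithmetic, not Apple's actual internals.

```python
# Back-of-envelope sketch: brute-force time for a 4-digit passcode,
# with and without a per-attempt software delay (the failsafe).
# CHECK_SECONDS is an assumption based on the widely reported ~80 ms
# hardware-enforced cost per passcode check.

PASSCODE_SPACE = 10 ** 4   # every possible 4-digit passcode
CHECK_SECONDS = 0.08       # ~80 ms per attempt, hardware floor

def worst_case_hours(software_delay: float = 0.0) -> float:
    """Worst-case hours to try every passcode, given an optional
    per-attempt software delay (the protection the FBI wants removed)."""
    per_try = CHECK_SECONDS + software_delay
    return PASSCODE_SPACE * per_try / 3600

# Without the software failsafe, exhausting the space takes minutes:
print(f"no failsafe:  {worst_case_hours():.2f} hours")    # ~0.22 hours

# With even a one-minute delay per attempt, the same search takes days -
# and the auto-erase setting would end it after ten tries anyway:
print(f"1-min delay: {worst_case_hours(60):.0f} hours")
```

This is why the dispute centers on the lockout mechanisms rather than the cryptography: the encryption key itself is effectively unguessable, but the passcode that unlocks it is not, unless something slows or stops the guessing.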

4. The FBI doesn't necessarily need Apple to get into this particular phone. Critics of the FBI argue the true agenda is to establish legal precedent and obtain a tool that can be used to unlock other iPhones. News has come out indicating that the U.S. government does want to get into other devices. Other governments and agencies might have a "zero day" program that could be modified to break into this particular phone. It might not be easy, and it might be resource-intensive, but it's been done before. As Zdziarski writes:

In spite of all of iOS 8’s security, the Chinese invented a ribbon cutter for it called the IP BOX. IP BOX was capable of brute forcing any numeric passcode in iOS 8, and even though it was junky, Chinese-made hardware with zero forensic credibility (and actually called home to servers in China), our government used it widely to break into iOS devices without Apple’s help. The government has really gone dumpster diving for forensic solutions for iOS. This ribbon cutter was used by both law enforcement and anyone with $200 to break into iOS devices, and is a great example of how such a ribbon cutter is often abused for crime.

My take

I hope it doesn't sound like I am unsympathetic to law enforcement's position on accessing data. There are times when criminal or terrorist activity justifies access to data records. But as I wrote in New study refutes encryption alarmists – but with a concerning twist, there are plenty of privacy-encroaching ways for law enforcement to monitor whoever the heck they please.

The most dangerous terrorists won't make the mistake of leaving a trail of damning clues on their smart phones. The obsession over allowing smart phones to be cracked is more likely to compromise all of us than capture the worst of us.

I am less interested, however, in taking a position than I am in encouraging readers to take the time to learn the different issues in play. We must all become security-savvy in order to not only protect our own data, but to be responsible voters and citizens.

Apple is defensive about accusations that their stance is just a marketing stunt. But it is in their commercial interest to protect consumers' data, both here and in China. Having a commercial interest does not invalidate Apple's privacy ideals, but it's a factor. I hope for Apple's sake they are indeed doing all they can to help the FBI as they claim, because a legal precedent could go against them, and deal a bigger blow to the tech industry's attempt to protect our data.

The PR war is far from over. Apple has done a good job articulating their viewpoint, but the issues they are defending are complicated, and the fear of terrorism is primal. This issue seems to be dividing the U.S. electorate and who knows, it may end up being a defining issue in the presidential election as well.

Update, 8:00am GMT 2/24: article edited for clarity, no points removed or altered. Clive has raised a good point in the comments section.
