iPhone Error 53 – a study in bungled user experience, but great security

SUMMARY:

Error 53 blew up last week as an issue on iPhones whose Touch ID home button has been replaced by non-Apple repair shops. The implications for the enterprise are huge.

Apple is in hot water over Error 53, an ambiguous but apparently fatal error that some iPhone users report when trying to upgrade to the latest system version, iOS 9. According to a report in the Guardian publicizing the phenomenon,

The issue appears to affect handsets where the home button, which has touch ID fingerprint recognition built-in, has been repaired by a ‘non-official’ company or individual. It has also reportedly affected customers whose phone has been damaged but who have been able to carry on using it without the need for a repair.

Upon installing iOS 9, these users faced a wholly nondescript message reading, “The iPhone ‘iPhone’ could not be restored. An unknown error occurred (53).” Worse yet, there’s no easy way past it: the phone is seemingly bricked, along with any unique, un-backed-up data. The situation can easily arise, as in this case, when a user needs a repair in a location with no official Apple presence. That’s hard to conceive in Silicon Valley, but highly probable in many other parts of the world.

Why Would Apple Intentionally Brick a Phone?

The details behind the story are traceable to the iPhone’s sophisticated hardware-based security. This is a case where Apple can be praised for doing the right, and perhaps only reasonable, thing, but in the worst possible way. Although Apple hasn’t confirmed causality, the error typically (always?) occurs on phones where the Touch ID home button has been replaced with an aftermarket, non-Apple-authorized facsimile. This may seem like an arbitrarily punitive response by a greedy company looking to maximize repair revenues. However, when you consider Touch ID’s security function, it is entirely logical, and a virtual requirement: Apple must assure the integrity of the hardware-based biometric security system that is the foundation of trust on which its Apple Pay mobile payment platform rests.

Understanding why requires looking at the details of Touch ID’s implementation. The home button scanner takes extremely high-resolution pictures of a fingerprint, including “minor variations in ridge direction caused by pores and edge structures”. As Apple describes,

It then creates a mathematical representation of your fingerprint and compares this to your enrolled fingerprint data to identify a match and unlock your device. Touch ID will incrementally add new sections of your fingerprint to your enrolled fingerprint data to improve matching accuracy over time.

Here is where the iPhone’s hardware security kicks in. Instead of storing this mathematical representation of your fingerprint (which to us sounds like a cryptographic hash) as a password online in iCloud, Apple keeps it in a dedicated security coprocessor, called the Secure Enclave, built into each iPhone A-series SoC.

Touch ID doesn’t store any images of your fingerprint. It stores only a mathematical representation of your fingerprint. It isn’t possible for someone to reverse engineer your actual fingerprint image from this mathematical representation. The chip in your device also includes an advanced security architecture called the Secure Enclave which was developed to protect passcode and fingerprint data. Fingerprint data is encrypted and protected with a key available only to the Secure Enclave. Fingerprint data is used only by the Secure Enclave to verify that your fingerprint matches the enrolled fingerprint data. The Secure Enclave is walled off from the rest of the chip and the rest of iOS. Therefore, iOS and other apps never access your fingerprint data, it’s never stored on Apple servers, and it’s never backed up to iCloud or anywhere else. Only Touch ID uses it, and it can’t be used to match against other fingerprint databases.

This explains why Apple effectively bans third-party fingerprint scanners on the iPhone. There’s nothing but Apple’s iOS bootloader preventing a rogue home button with embedded firmware from executing a Man-in-the-Middle (MitM) attack, creating a copy of the fingerprint representation before passing it on to the Secure Enclave. Of course, attackers would need to reverse engineer Apple’s hash function (“mathematical representation”), but with enough trial and error (remember, the Secure Enclave holds the valid copy of the hash output) it’s certainly possible. Having the digital version of one’s print would allow unlocking all kinds of things on the phone, including Apple Pay.
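To see why a factory pairing between sensor and enclave blocks this attack, consider a minimal sketch in Python. Everything here is an assumption for illustration, not Apple’s actual protocol: we model the pairing as a shared secret provisioned into both parts, with the sensor authenticating every scan it forwards via an HMAC that the enclave verifies.

```python
import hmac
import hashlib
import os

# Hypothetical pairing key, provisioned into both the sensor and the
# Secure Enclave when the parts are mated at the factory (an assumption;
# Apple's real pairing mechanism is not public).
PAIRING_KEY = os.urandom(32)

def sensor_send(scan: bytes, key: bytes) -> tuple[bytes, bytes]:
    """A sensor tags each scan with an HMAC under its pairing key."""
    tag = hmac.new(key, scan, hashlib.sha256).digest()
    return scan, tag

def enclave_accept(scan: bytes, tag: bytes) -> bool:
    """The enclave processes only scans bearing a valid pairing tag."""
    expected = hmac.new(PAIRING_KEY, scan, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

scan = b"fingerprint-scan-data"

# A genuine, paired sensor: its tag verifies and the scan is accepted.
ok_scan, ok_tag = sensor_send(scan, PAIRING_KEY)
assert enclave_accept(ok_scan, ok_tag)

# An aftermarket sensor never went through pairing, so it has no valid
# key; its scans are rejected, which is what iOS surfaces as Error 53.
rogue_key = os.urandom(32)
bad_scan, bad_tag = sensor_send(scan, rogue_key)
assert not enclave_accept(bad_scan, bad_tag)
```

The point of the sketch is the asymmetry: a rogue sensor can still capture raw scans, but without the pairing secret it cannot get the enclave to trust anything it sends, so disabling Touch ID entirely is the fail-safe response.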

Mobile Payments: A Matter of Trust

Perhaps the most compelling feature of Apple Pay is that it neither stores nor uses your actual credit or debit card numbers when making a transaction. According to Apple,

When you add your card, a unique Device Account Number is assigned, encrypted, and securely stored in the Secure Element … When you make a purchase, the Device Account Number, along with a transaction-specific dynamic security code, is used to process your payment. So your actual credit or debit card numbers are never shared by Apple with merchants or transmitted with payment. And unlike credit cards, on iPhone and iPad every payment requires Touch ID or a passcode, and Apple Watch must be unlocked — so only you can make payments from your device.
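The scheme Apple describes can be illustrated with a short sketch. The names and the construction of the dynamic code below are assumptions for illustration only (real Apple Pay follows the EMV tokenization specifications); the sketch shows the two properties that matter: the real card number is never transmitted, and each transaction carries a one-time code, so a captured message can’t be replayed.

```python
import hmac
import hashlib

# The real card number stays with the bank; only the token is on the device.
REAL_PAN = "4111111111111111"              # actual card number (never sent)
DEVICE_ACCOUNT_NUMBER = "4999000011112222" # token assigned at card enrollment
TOKEN_KEY = b"per-device secret held in the Secure Element"  # assumption

def make_payment(amount_cents: int, merchant: str, counter: int) -> dict:
    """Build a payment message: token plus a transaction-specific code."""
    msg = f"{DEVICE_ACCOUNT_NUMBER}|{amount_cents}|{merchant}|{counter}".encode()
    dynamic_code = hmac.new(TOKEN_KEY, msg, hashlib.sha256).hexdigest()[:8]
    return {"pan": DEVICE_ACCOUNT_NUMBER, "code": dynamic_code}

p1 = make_payment(499, "coffee-shop", counter=1)
p2 = make_payment(499, "coffee-shop", counter=2)

assert REAL_PAN not in p1.values()   # real card number never leaves the bank
assert p1["code"] != p2["code"]      # same purchase, different one-time code
```

Note that the whole construction hinges on `TOKEN_KEY` staying inside tamper-resistant hardware, which is exactly the trust boundary a rogue Touch ID sensor would threaten.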

Should a rogue Touch ID sensor replicate the digital fingerprint model (hash), attackers could compromise the entire reservoir of Apple Pay device account numbers and create transactions unbeknownst to the iPhone owner. Since mobile e-commerce sites and apps now integrate Apple Pay into their checkout process, it would be relatively easy to monetize compromised accounts remotely without going near an NFC PoS terminal. In this context, an Apple representative’s statement to the Guardian sounds much less capricious,

When iPhone is serviced by an authorized Apple service provider or Apple retail store for changes that affect the touch ID sensor, the pairing [between device and sensor] is re-validated. This check ensures the device and the iOS features related to touch ID remain secure. Without this unique pairing, a malicious touch ID sensor could be substituted, thereby gaining access to the secure enclave. When iOS detects that the pairing fails, touch ID, including Apple Pay, is disabled so the device remains secure.

My Take

Apple Pay, used by an estimated 10–20% of users with capable devices and accepted at millions of stores, is North America’s most successful mobile payment platform. Yet adoption has been slow compared with other Apple services, due in part to people’s unfamiliarity with, and relative distrust of, the technology. Aside from convenience, the fact that the system is far more secure than traditional payment methods is undoubtedly a key factor for many early adopters. Their trust in Apple’s security would be instantly undone if the Touch ID/Apple Pay system were compromised by rogue third-party hardware, damage that would jeopardize its rollout in China and other large markets.

And just think about the havoc an unprotected device might cause in a BYOD environment, which is increasingly common. While it’s hard to say what kind of replay attack hackers might dream up, this could potentially impact any organization. Then there’s the potential for fraud against corporate cards stored on one’s device. It is therefore wholly understandable that Apple takes a brute-force approach to the problem. But it went too far.

I applaud Apple for doing the right thing to protect its customers with robust security technology, but I must take issue with the company on two counts: its utter lack of communication about the necessity of authorized repairs for the Touch ID button assembly, and the equally opaque error message presented to users who install a third-party component. This is a classic case of nailing the product design but bungling the user experience, and it presents a teachable moment for other organizations implementing sophisticated technology: fail gracefully when users do the unexpected, and don’t leave them in the dark when the unusual invariably happens.

Image credit: Featured image via Lifehacker, main story screenshot from iTunes