Is an iPhone Backdoor Key Really More Dangerous than Other Sensitive Information?

by Michael C. Dorf

Nearly four years ago, the government sought to compel Apple to provide assistance in breaking the encryption of an iPhone. Apple resisted on legal and policy grounds. I analyzed Apple's legal argument at the time and concluded, based on a SCOTUS precedent construing the All Writs Act, that Apple would probably lose. I did not then address the merits of Apple's policy argument, which I summarized as follows:
Apple argues that orders such as this--that Apple "hack" one of its customers' phones--will, in the long run, do more harm than good. Apple and its various defenders across the tech and civil liberties world argue that a technology developed for the laudable purpose of breaking encryption on a terrorist's phone could leak into the hands of hackers and other bad actors (including other terrorists). In other words, Apple is not simply saying that privacy should prevail over security (although it is certainly saying that pretty loudly), but also that this sort of order would undermine security.
The 2016 impasse between Apple and the US government became moot when the government, with the assistance of a third-party firm, cracked the security of the iPhone in question. Since then, however, Apple has improved the iPhone's security, so the government is once again seeking Apple's aid. Apple has apparently provided iCloud backup material but once again resists creating a backdoor key for the phone itself on the ground that such a key could fall into the wrong hands.

When discussing the matter in 2016, I confessed that I did not have "a well-informed view about the merits of" the privacy policy question. I still don't, but that won't prevent me from offering a thought about the core risk here. The thought--which I'll briefly elaborate below--is that the risks posed by Apple's cooperation do not differ in kind from the risk that other sensitive information might be lost or stolen.

The idea that a backdoor key is inherently risky rests on some assumptions. To unpack them, let's begin with a simple-minded analogy. Suppose that Jim owns a house and hires the Acme Lock Company to install a strong lock on the door to his house. Acme provides Jim with a key that cannot be duplicated by a hardware store or ordinary locksmith, not simply because the key says "DO NOT DUPLICATE" on it but also because it uses some new proprietary technology. Nonetheless, the key and locking mechanism are purely mechanical, not electronic. One day, Jim loses the key. He goes to Acme and asks for a duplicate. Let's suppose that Acme's engineers could create a new key based on Jim's lock, but that doing so would be complicated. Should Acme worry that creating the new key will endanger its other customers by making Acme locks less secure? Maybe.

To be clear, the worry is not that the physical key to Jim's house itself will be duplicated. Jim will receive the only new copy, and Jim's key won't open Acme locks on other houses. Rather, the worry is that the technology Acme creates to essentially pick Jim's lock will escape from Acme's engineers and then be used by bad actors. Is that a serious worry?

Perhaps, but one would think that Acme itself can take precautions against its new key-duplicating technology escaping into the wild. It can screen its engineers very carefully and limit the access that others have to the technology. Yes, there are risks of a security breach, but that's true of all sorts of sensitive data and products created by private companies, government actors, and others. Military contractors, government agencies, and the like invest in security to prevent such breaches. Sometimes these protective measures fail. But it's hardly clear that the risk that a burglar will obtain Acme's key-duplicating technology and use it to break into its customers' houses is greater than these other sorts of risks.

So much for Acme and Jim. What about Apple and those of us who use iPhones? It is tempting to distinguish Jim's physical lock from my iPhone by pointing to the fact that Apple's backdoor key is likely to be software that is easily duplicated. And there is something to that distinction. We might imagine that Acme uses some large machine to create duplicate keys, so it's difficult for a rogue employee to smuggle the machine out; by contrast, a rogue at Apple could simply copy the relevant software to a thumb drive or the like. But that's also true of lots of other sensitive data and programs. The fact that the iPhone backdoor key is software (let's assume) makes it more vulnerable to escape or theft than is Acme's hypothetical large machine, but that fact doesn't make the backdoor key more vulnerable than other collections of ones and zeroes.

One might also worry that each time Apple shares a backdoor key with the FBI, it creates more opportunities for breaches, because more people have access. That's true, but we can think of ways around that problem. One solution would be for Apple not to give the key to the FBI but to maintain control over it and provide the FBI only with the data recovered from the iPhone. To be sure, that would create admissibility issues for the recovered evidence, but in many instances the government will not be interested in introducing such evidence directly; instead it will be interested in using the information to develop leads.

Accordingly, I reach the tentative conclusion that the direct risks to the security and privacy of iPhones from the development and use of a backdoor for US law enforcement are substantial, but not necessarily more substantial than the other sorts of security and privacy risks that companies like Apple try to manage.

Nonetheless, Apple has good reason to worry about cooperating with the US government, because Apple does business in other countries as well. Although a court order that complies with the Fourth Amendment might be sufficient to satisfy the privacy concerns implicated by Apple's creation of a backdoor key for an iPhone in the US, in other countries in which Apple does business, the authorities will be able to lawfully compel Apple to create a backdoor key without comparable regard for privacy. That risk is especially acute in authoritarian countries like China (where the information sought may be used to uncover political activism rather than crime), but it is a problem even in constitutional democracies with less robust search-and-seizure protections than the US. Apple might not want to cooperate with even a lawful order of a privacy-sensitive US court for fear that doing so will open it up to having to comply with orders from less privacy-sensitive jurisdictions in other parts of the world.

If that's the worry, however, I don't see how Apple can possibly win this battle. Apple doesn't have a choice whether to comply with a US court order, and likewise it doesn't have a choice whether to comply with court orders in the other countries in which it does business. We still live in a world of geographic sovereigns. If China or any other country tells Apple that the cost of doing business there is compliance with local law, then no matter how privacy-invasive or otherwise oppressive that local law is, Apple will simply have to decide whether it is willing to pay that price. For the most part, the answer given by US tech companies--even Google with its onetime "Don't be evil" motto--will be to capitulate to bad laws in order to make a buck.