Friday, February 19, 2016

Apple, the FBI, and the All Writs Act

by Michael Dorf

Apple's resistance to the order directing it to develop software that could circumvent the encryption* on the iPhone of deceased San Bernardino killer Syed Farook cites two main objections. First, on policy grounds, Apple argues that orders such as this--that Apple "hack" one of its customers' phones--will, in the long run, do more harm than good. Apple and its various defenders across the tech and civil liberties world argue that a technology developed for the laudable purpose of breaking encryption on a terrorist's phone could leak into the hands of hackers and other bad actors (including other terrorists). In other words, Apple is not simply saying that privacy should prevail over security (although it is certainly saying that pretty loudly), but also that this sort of order would undermine security. I don't have a well-informed view about the merits of this argument, so I will leave it to others.

Apple's second argument is more legal in nature, and so I will focus on it. Apple argues that Congress has not legislated a requirement that makers of phones, computers, and other electronic devices that use encryption build in a "back door" that allows the government to circumvent encryption. Under the circumstances, Apple contends, reliance on the All Writs Act (part of the Judiciary Act of 1789) is an overreach. Is that right?

The short answer is no. As a general matter, it is perilous to infer anything about the state of the law from congressional inaction, but here we do not need to rely on that general proposition. There is case law on point.

The leading SCOTUS precedent is United States v. New York Telephone Co., decided in 1977. There, the FBI sought the assistance of a telephone company in installing a pen register--a device that records the numbers dialed from a particular line--on the line of a suspect under investigation. The phone company provided some but not all of the assistance the FBI requested, resisting primarily on the ground that the demand for assistance violated another federal statute limiting wiretapping. The Supreme Court rejected this argument.

But where did the government even get the affirmative power to compel the assistance of the telephone company? The Court cited two sources. First, Federal Rule of Criminal Procedure 41 authorized a search warrant based on probable cause, so the government was entitled to attempt to install the pen register. (Rule 41 has been amended since 1977, but not in a way that would render it inapplicable to the government's proposed "search" of Farook's iPhone.) But what gave the government the power (after successful application for a court order) to compel a third party not itself suspected of criminal activity to provide affirmative assistance? The Court said that this second source of power was the All Writs Act:
The power conferred by the Act extends, under appropriate circumstances, to persons who, though not parties to the original action or engaged in wrongdoing, are in a position to frustrate the implementation of a court order or the proper administration of justice . . . and encompasses even those who have not taken any affirmative action to hinder justice.
Note that Apple could be said to fall within the core of that statement if one regards the encryption of iPhones as an "affirmative action" that hinders the FBI's efforts, but even if not, the language makes clear that Apple--like New York Telephone before it--could be compelled to assist the FBI. Apple's argument that Congress needs new authority to require it to assist the FBI thus appears to be wrong.

New York Telephone recognized that there could be limits to the government's authority under the All Writs Act to compel complete strangers to a case to assist the government. Presumably, if the San Bernardino killer had owned an Android phone, the FBI couldn't have compelled Apple to provide assistance hacking it simply because it thought that the Apple engineers were better than those employed by Google. But the case involves an iPhone, and so what the Court said about New York Telephone nearly forty years ago seems equally applicable to Apple: "we do not think that the Company was a third party so far removed from the underlying controversy that its assistance could not be permissibly compelled."

Magistrate Judge Pym's order provides Apple with one possible out. She gives Apple the opportunity to demonstrate that compliance would be "unreasonably burdensome," which is pretty much standard language for escaping a disclosure duty. As I understand the technological issues, Apple is being asked to create a special-purpose iOS that removes the password protection upon installation. Apple's public statement does not claim that this is an especially difficult engineering task for its programmers. Rather, Apple's concerns are, as noted above, for the privacy and security of the users of its products more generally.

Does that count as a burden? I very much doubt that the courts will say so, but maybe Apple's argument here is not quite as bad as it at first appears. Apple says that its customers depend on the built-in security features of the iPhone for their security, adding that under the order, "[t]he same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe."

One might read this objection as invoking something akin to a smartphone maker-customer privilege. Suppose that instead of seeking assistance from Apple in decrypting an iPhone, the FBI were seeking the assistance of a doctor in performing an execution or in developing some mechanism for defeating a patient's interests. Then we would say that the government should not be permitted to impose on a doctor an obligation that is fundamentally inconsistent with her professional medical duty to her patients--at least absent a demonstration by the government of a truly compelling need.

There is a key difference here, of course. State and federal law generally recognize a doctor-patient privilege, whereas there is no manufacturer-customer privilege. But maybe that oughtn't to be controlling. After all, under federal law, the courts can recognize new privileges without any new legislation (as when the SCOTUS recognized a therapist-patient privilege in Jaffee v. Redmond). And Apple wouldn't even be asking for a full privilege against turning over existing information--just the right not to be made to develop a new tool for undermining its customers' privacy and security.

To be clear, I think Apple will likely lose this fight--at least given the case law we have. A functional Congress could well decide that permitting the sort of court order at issue in this case does more harm than good, but then, a functional Congress would do a lot of other useful things too.

* Throughout the foregoing post, I refer to "encryption," but the feature at issue is a very primitive version of encryption. After ten unsuccessful attempts at entering a passcode, iOS wipes the iPhone's memory. The FBI is seeking code from Apple that will eliminate the ten-attempt limit, so that it can unlock Farook's iPhone by trying all 10,000 possible four-digit passcodes. (It's an iPhone 5c, which lacks fingerprint recognition.)
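For illustration only, the arithmetic behind that footnote can be sketched in a few lines. The check function below is a hypothetical stand-in for the device's passcode verifier, not Apple's actual code; the point is simply that a four-digit space has only 10**4 = 10,000 candidates, which is why the ten-attempt wipe, rather than the passcode itself, is the real obstacle.

```python
from itertools import product

def brute_force(check):
    """Try every four-digit passcode from 0000 to 9999 until one unlocks.

    `check` is a hypothetical verifier: it returns True for the correct
    passcode. With no attempt limit, at most 10,000 guesses are needed.
    """
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if check(guess):
            return guess
    return None  # no passcode in the four-digit space unlocked the device

# Stand-in secret for demonstration; a real device would not expose this.
secret = "7291"
found = brute_force(lambda g: g == secret)
print(found)  # recovers the passcode after at most 10,000 attempts
```

A ten-attempt limit defeats this loop after ten iterations; removing the limit (as the order contemplates) makes exhaustive guessing trivial.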


Edward Keller said...

Professor Dorf,

Wouldn't the SCOTUS case of Riley v. California control if Apple appealed?
As I'm sure you're well aware, in that case the Court held that the warrantless search of Riley's phone violated his Fourth Amendment rights, so the pictures the police found on it--incriminating evidence of gang activity--were inadmissible.

Greg said...

Working in this field, I suspect there is more "we can't do this" going on than Apple lets on. Apple tried that, and was refused. It also looks bad to publicly say that your engineers CAN'T do something.

I suspect that, right now, Apple really doesn't know how to do this. Any secure system has to, at a minimum, not allow its software to be altered without first establishing privileged access (in this case, unlocking). As such, if Apple finds a way to unlock these phones, it will be by finding some flaw in their security and exploiting it. This isn't a matter of a simple software reload.

The problem is, this exploit is a one-time thing. As soon as Apple finds the flaw, the only responsible thing to do is to immediately patch the flaw in all existing phones (if possible) or to fix it in the next iPhone.

I think there's a larger precedent that Apple is concerned about. Developing a new exploit is expensive. It might cost Apple hundreds of thousands of dollars to develop the exploit to break this one phone. That might even be considered reasonable in this case. However, that's the cost to break THIS phone. It costs another hundred thousand dollars to crack the next phone, because the way they break into this one no longer works. That adds up fast.

The result is that the only economically feasible way to respond to a barrage of requests like this is to stop building secure systems and start intentionally building exploits into their products. THAT is an unreasonable burden on Apple and the entire industry.

Apple isn't the only company building secure systems, and there are secure systems that protect data a lot more valuable than that in iPhones. The whole design of these systems is that no one, not even the company providing the hardware, knows how to break them. If a precedent is set that this data MUST be accessible to the company providing the hardware, that's a huge impact on the industry. In addition to hurting customers (including, BTW, U.S. government customers), it makes U.S. products impossible to sell in markets like China, because they would be essentially required (by the courts, not the legislature) to have U.S. government back-doors.

Kilo said...

Is it possible for the unreasonable burden to be a business concern? If, for example, customers rely on Apple specifically for its security, and it will cost the company a great deal of customer goodwill and many future sales to comply with the order, would that be a legally relevant burden? Because it would seem that, if so, customers could literally make this use of the All Writs Act illegal by claiming these things frequently on social media, which Apple could then cite in court. I would be tickled by the possibility that slacktivism could directly change what the law requires.

Richard Stern said...

Re the first comment, Riley v Calif may be beside the point. If all it holds is that the exclusionary rule against wrongful searches prevents use in court of the evidence wrongfully seized, that may not matter to the FBI. The FBI is certainly not interested in using the evidence to prosecute the late Farook (or perhaps even his email correspondents). The FBI may well be interested only in preventing future mass murder attempts that the evidence discloses or helps it to frustrate. Does Riley speak to that?

David Schwartz said...

I am not a lawyer - so forgive my ignorance here. Did the NY Telephone decision refer in any way to NY Telephone's position as a publicly regulated utility (which I think it was)? If it did not - my mistake, of course. But if it did, and the decision was based in part on that fact, then how might this apply in the Apple case?

James Longfellow said...

I don't believe that the 1977 case is relevant to Apple's situation as it is easily distinguishable on the facts. The key question is whether or not Apple is in "a position to frustrate the implementation of a court order or the proper administration of justice." The answer to this question is clearly no.

By refusing to write new software Apple is not frustrating the FBI. The FBI is frustrated by its inability to search the phone based upon software written /prior to/ the court order. The security on the iPhone existed long before the court ever issued its order and will exist long after the order is no longer valid. The 1977 case does not actually impose a lawful duty to take any affirmative action on behalf of the FBI. It only requires the third party to refrain from frustrating a lawful order. That's the word, frustrate.

That reading makes sense because it is the actual fact pattern in the 1977 case. There the telco was refusing to cooperate with the FBI in the present. There was no prior existing technology that was hindering the government. It was the telco's decision, after the lawful court order issued, that was frustrating the government.

In the 1977 case it was the telco's behavior AFTER the lawful court order that frustrated the government. With Apple, it is the company's behavior BEFORE the lawful court order that is frustrating the government.

To argue otherwise--on these facts--is to make the claim that a person can be in a position to frustrate someone who is already frustrated. That is grammatical nonsense.

Edward Keller said...

Mass murders could, and have, occurred without the technology in question.
ALL of law enforcement seeks to prevent these types of crimes from happening again.

All I'm arguing is that the courts could rule the FBI request to be unlawful.

t jones said...

I'm with JL, there's a difference between not interfering with the government doing something and being required to do it oneself.

PenguinBelly said...

@Kilo: I do not believe that justification is going to fly. It would be the same justification that would allow a "Swiss Bank" in the U.S.