We are approaching a sea change when it comes to how personal data is created, stored and repurposed. Recent cases between state, local, and federal governments and companies like Apple and Facebook are breaking new ground in the battle between legitimate law enforcement surveillance and user privacy.
While the FBI and other law enforcement agencies can seek court orders compelling companies to comply with wiretap orders, at least two issues make it harder for government agencies to get the data they’re seeking in cases that are likely to arise:
- Rapidly changing technology. Law enforcement officials say they have been left behind by rapid changes in communications technology. To intercept the content of communications sent in real time, investigators must rely on laws, such as the 1994 Communications Assistance for Law Enforcement Act, that limit their reach.
- Increasingly sophisticated encryption. Encrypted devices, such as iPhones, scramble data using a code that can be opened only with a special “key.” Often, the hardware itself serves as a unique key. This means that without possession of the specific iPhone carrying the hardwired key, neither the device maker nor the phone companies may be able to provide law enforcement with the data sent on their networks.
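The idea of a hardware-bound key can be sketched in a few lines. The following is a deliberately simplified toy, not Apple's actual implementation: the device identifier, key-derivation parameters, and XOR keystream are all illustrative assumptions (real devices use a dedicated AES engine). The point it shows is that the key is derived from an identifier fused into one specific device, so the ciphertext is unrecoverable without that device.

```python
import hashlib
import hmac

def derive_key(device_uid: bytes, passcode: bytes) -> bytes:
    # Toy key derivation: entangle a hardware-unique ID with the user's
    # passcode, so the key cannot be rebuilt without this specific device.
    # (Real devices use hardware key stores and tuned iteration counts.)
    return hashlib.pbkdf2_hmac("sha256", passcode, device_uid, 100_000)

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy XOR keystream for illustration only -- NOT real encryption.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        block = counter.to_bytes(8, "big")
        stream += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

device_uid = b"unique-id-fused-into-this-chip"  # hypothetical hardware ID
key = derive_key(device_uid, b"1234")
ciphertext = xor_stream(key, b"private message")

# The same device-bound key decrypts; a key derived on any other
# "device" does not, which is the crux of the law-enforcement problem.
assert xor_stream(key, ciphertext) == b"private message"
wrong_key = derive_key(b"some-other-device", b"1234")
assert xor_stream(wrong_key, ciphertext) != b"private message"
```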
We are getting to the point where encryption is a permanent reality. Apple’s iPhones are encrypted, as are its iMessage and FaceTime video services. The mobile messaging application WhatsApp now encrypts all messages by default. In fact, the government and law enforcement encourage the creation of strong encryption protocols precisely because these protocols help secure government facilities and government computers against intrusion.
Yet the rise of sophisticated encryption technologies on consumer-grade devices is creating an environment of “unbreakable security.” We must embrace the reality that no matter what law we pass or what court judgment we receive, it will become increasingly difficult to retrieve the content of electronic communications, for legitimate and illegitimate purposes alike. Rather than trying to slow or prevent court action, legitimate surveillance operators need to be able to do their jobs even without collecting the content of communications.
Even when communications are heavily secured and encrypted, much digital content that is not the substance of a communication remains available and can be used to create actionable material for both law enforcement and intelligence.
Data at Rest vs. Data in Motion
The encryption debate is bound up with questions of the data’s physical location. The Communications Assistance for Law Enforcement Act does not make a telecommunications provider responsible for decrypting information that is not in its possession (i.e., not “at rest” on its servers). Therefore, data in motion (i.e., traveling from one service provider to another) is less amenable to a valid Fourth Amendment search.
Searching data in motion is also practically problematic, since the data can leave the bounds of the United States altogether. U.S. laws are not applicable overseas, and an entire network of legal and regulatory frameworks binds law enforcement action in particular jurisdictions. Many European countries, for example, present particularly challenging regulatory environments for law enforcement.
Even data at rest can be transient. Programs like Snapchat and Telegram allow users to set messages, photos and videos to “self-destruct” after a specified time period. This can mean that the social media or messaging company also deletes the relevant data from its servers, often permanently.
The FBI’s Arguments against Apple
The fast pace of technological advancement is also creating a situation that courts, with their relatively slow working pace, are unequipped to handle. The recent case between Apple and the FBI was indicative of the court system’s wider inadequacy in handling these issues.
In the U.S. District Court for the Central District of California, the FBI sought to compel Apple to disclose the contents of the phone belonging to the San Bernardino attacker. While the FBI could ordinarily have obtained the data under a warrant, Apple’s new encryption technology meant that the FBI needed the attacker’s phone passcode to do so (and only a limited number of failed passcode attempts are allowed before the phone erases all of its content).
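The erase-after-failed-attempts mechanic that blocked a brute-force approach can be sketched as simple state-machine logic. This is a hypothetical illustration, not iOS’s actual behavior; the attempt limit and class names are invented for the example.

```python
MAX_ATTEMPTS = 10  # hypothetical limit; the real policy is a device setting

class LockedDevice:
    """Toy model of a device that destroys its data after too many bad guesses."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # key material destroyed; nothing to unlock
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            # Wiping the key makes the stored data permanently unrecoverable,
            # which is why investigators could not simply guess every passcode.
            self.wiped = True
        return False

device = LockedDevice("1234")
for _ in range(MAX_ATTEMPTS):
    device.try_unlock("0000")  # ten wrong guesses trigger the wipe
assert device.wiped
assert device.try_unlock("1234") is False  # even the right code now fails
```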
At issue was whether the government had the legal authority to order Apple to unlock an iPhone. The facts were clear: the phone was owned by a government entity (the attacker’s employer); there was a valid warrant for the phone’s contents; and there was overwhelming evidence to convict the perpetrator.
The FBI called upon the “All Writs Act” (28 U.S.C. Section 1651), essentially asking the court to order Apple to create a new operating system (OS) that would allow the San Bernardino attacker’s phone to be unlocked. The All Writs Act allows federal courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”
Because Congress had not passed a law specifically addressing whether the FBI or another federal law enforcement agency could ask a company to install a hacked version of its own OS that bypassed lock-code restrictions, the FBI relied on the All Writs Act as the basis of the court’s power to make Apple comply.
Apple hit back with a series of complicated and largely untenable legal arguments. The first among these was a reliance on the Communications Assistance for Law Enforcement Act (CALEA), which requires telecom service providers and equipment makers to assist law enforcement with surveillance, under proper conditions (47 U.S.C. Section 1002).
In a convoluted and unconvincing argument, Apple noted that Congress could have passed a proposed law known as CALEA II that would have required technology companies to install back doors in their products. Tech companies opposed CALEA II as it was being debated, and the Obama administration dropped it. A subsection of CALEA II said the government could not force equipment manufacturers to adopt specific designs.
Apple argued that by forcing it to write a brand new OS to hack the San Bernardino attacker’s phone, the FBI was asking for that specific design change. If Congress wanted the government to have this power, it would have passed CALEA II. Therefore, it was inappropriate for the courts to give the government this power using the All Writs Act.
Apple’s second argument was less convoluted. It cited United States v. New York Telephone Co., in which the Supreme Court held that the government could use the All Writs Act to compel New York Telephone Co. to install a device that recorded the phone numbers dialed on a pair of phones suspected of being used in criminal activity only if: 1) the phone company wasn’t “so far removed” from the case; 2) the help required by law enforcement was “meager”; 3) the phone company was a public utility; and 4) federal law enforcement had tried to obtain the information on its own but could not accomplish the surveillance without help.
Apple argued that it was “so far removed” from the case: “Apple is a private company that does not own or possess the phone at issue, has no connection to the data that may or may not exist on the phone, and is not related in any way to the events giving rise to the investigation.”
Apple also argued that it was not a public utility: “Apple is a private company that believes that encryption is crucial to protecting the security and privacy interests of citizens who use and store their most personal data on their iPhones.”
Additionally, Apple argued that the help the FBI was requesting was not “meager” under the meaning of the statute. Rather, the FBI’s request was “unprecedented and oppressive,” requiring Apple to design an entirely new OS and hire people to respond to future requests. The company also argued that the FBI didn’t need to obtain the information from Apple as it had other means to accomplish the surveillance, presumably through other government entities.
Apple’s final argument, based on the First Amendment, was perhaps its most tenuous. It went something like this:
Code is speech. Since code is speech, the First Amendment applies, and Apple is free to express itself within the bounds of First Amendment jurisprudence. The FBI wants to compel Apple to write code and sign it such that it runs on the San Bernardino attacker’s phone. Apple doesn’t want to and has routinely expressed in public how much it supports encryption and privacy on its phones. Thus, the FBI wants to compel speech from Apple that is specifically against Apple’s desires, which is clearly a government restraint on free speech, making it unconstitutional.
There are several other arguments, mostly policy based, that did not arise in Apple’s briefs. Foremost among these is the argument that unlocking this hardware puts the keys to iPhones out into the wild, so to speak, loose in the digital ecosystem to be hacked by foreign nations and U.S. adversaries. It also opens Apple up to lawsuits from vendors who have relied on Apple’s encryption to keep their companies and government agencies secure.
Courts Inadequate—Congress Needs to Step In
It is important to note that had the San Bernardino attacker’s employer not tampered with the phone, it might have been unlocked. Moreover, had the attacker not deleted the iCloud data from Apple’s servers, that data would also have been available.
Apple’s increasingly far-fetched arguments, and the FBI’s singularly simplistic legal arguments, point to the inadequacy of the law as it stands. When the law is ambiguous or outdated, courts can clarify the ambiguities, or Congress can pass a new law or amend an existing one. In this case, the latter is probably the best option: a clear and consistent law that recognizes that encryption is here to stay and provides law enforcement with reasonably limited tools to obtain electronic information.