WhatsApp is on the offensive. In May 2021, the international social media giant sued the government of India. The point? To ask the Delhi High Court to declare unconstitutional India’s IT Rules—formally named the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, and recently promulgated under India’s Information Technology Act, 2000. Those new rules, WhatsApp says, would violate users’ constitutional right to privacy.
The IT Rules require significant social media intermediaries in India to track, identify, and disclose the “first originators”—that is, the original senders—of information. Upon entry of a duly issued court order, a social media company would be required to identify the first originator and provide an electronic copy of that identifying information, but not the contents of any electronic message, any other information about the first originator, or any information about its other users. To trigger the mandate, the court order must relate to an offense concerning rape, sexually explicit or child sexual abuse material, the sovereignty, security, and integrity of India, friendly relations with foreign states, or public order, or to incitement of an offense relating to any of those.
The crux of WhatsApp’s argument is that compliance with the IT Rules would inevitably require it to disable its end-to-end encryption. That data privacy and security feature forms the core of WhatsApp’s brand identity and answers a growing market demand. Eliminating end-to-end encryption, WhatsApp argues, would severely undercut the security of users’ communications and undermine their constitutional right to privacy.
That argument will resonate with WhatsApp users. Transitioning from an end-to-end encrypted app (which permits only the sender and the recipient to read or listen to a message or call) to an app without encryption would expose communications to greater risks of interception by unauthorized parties: hackers, criminals, government agencies, even social messaging intermediaries themselves.
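For readers curious about the mechanics, the guarantee described above can be sketched with a toy Diffie–Hellman key exchange: the messaging intermediary relays only public values, yet only the two endpoints can derive the shared key used to encrypt their conversation. This is an illustrative sketch under simplified assumptions, not real messaging cryptography—WhatsApp actually uses the Signal protocol, with elliptic-curve keys and a ratcheting scheme—and all names here are hypothetical:

```python
import hashlib
import secrets

# Toy Diffie-Hellman sketch of the end-to-end encryption idea.
# NOT secure as written: the prime is far too small for real use,
# and real apps layer authentication and key rotation on top.
P = 2**127 - 1  # a Mersenne prime; fine for illustration only
G = 3

def keypair():
    """Generate a private exponent and the public value that travels
    over the network -- the only thing an intermediary ever sees."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Each side combines its own private key with the other's public key.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret  # same secret, never transmitted

# A relay holding only alice_pub and bob_pub cannot feasibly recover
# the secret; hashing it into a symmetric key completes the sketch.
key = hashlib.sha256(str(alice_secret).encode()).digest()
```

The point of the sketch is structural: the server in the middle sees `alice_pub` and `bob_pub` but never either private exponent, so it has nothing to disclose even under a valid court order.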
It is not clear that an equivalent federal statute would violate users’ constitutional right to privacy in the United States. Any U.S. counterpart to the IT Rules would, like the Indian original, require that an order to disclose the identity of the first originator be supported by a showing of probable cause. That would comport with Fourth Amendment requirements and track much of existing federal law.
The resistance to that provision of the IT Rules, then, cannot be explained by the belief that one’s electronic communications should remain private. As a society, we have accepted the sacrifices that come with preserving our national security. We understand that there are trade-offs we must make if law enforcement is to do what we have collectively entrusted it to do. We are willing to give up a part of our privacy because we know that what we gain in return is so much greater. We know the price of public health, safety, and welfare; of national security; of providing for the common defense. We pay that price day in and day out. We will continue to pay it for as long as combating terrorism, human trafficking, and mass shootings outweighs the desire to have our Snaps be seen by only two people—the sender and the recipient.
It’s not about privacy. It’s about more than that.
In December 2015, a mass shooting and attempted bombing at the Inland Regional Center in San Bernardino, California, killed 14 people and seriously injured 22 others. The FBI applied for an order requiring Apple to provide technical assistance in enabling the search of a locked iPhone used by one of the attackers. The court ordered Apple to create and install a custom operating system that would disable key security features on the iPhone, allowing access to stored data. Apple opposed the order. The FBI managed to gain access to the phone’s data without Apple’s help, mooting the issue.
In an amicus brief that may never garner the recognition it deserves, the American Civil Liberties Union identified the greater interest at stake when a government commandeers private companies to assist in criminal investigations:
The government seeks to compel an innocent third party to be an agent of the state, to conscript a private entity to develop information that is neither in its possession nor control. This is a tactic foreign to free democracies. It presents an unparalleled danger of eroding the public trust—both of government and between citizens—necessary to ordered liberty.
Public trust. Ordered liberty. Free democracies. It’s the perversion of these fundamental principles that induces an almost instinctual aversion to the unbridled power contemplated by the IT Rules. There is something deeply un-American about a government that can kick open the door, march into company headquarters, and tell Apple (or WhatsApp or Snapchat or Telegram or Signal or—for that matter—any company) what it will create today. What capabilities Apple’s new iOS system will, and won’t, have. What security features WhatsApp can and cannot offer. What services Snapchat can and cannot develop.
With the government at the helm, the barge of progress and innovation would never leave port. We would innovate only what government needs and wants us to innovate. We would supply only that which the government demands. A private company would be nothing more than an oxymoron, a cruel reminder that there would be no project, no research, no product development apart from that sanctioned by the government.
Of what significance is any of this? The IT Rules were passed in India, not in America. That’s not our legislation. That’s not our government. But the significance is monumental.
In 2014, former FBI Director James Comey testified before Congress about what has been dubbed the “going dark” dilemma facing law enforcement agencies at both the state and federal levels. The dilemma, put simply, is this: In accordance with federal law, and thus supported by probable cause, government agencies obtain a court order that they then serve on a social messaging company. The company turns them away because it simply does not possess the responsive information.
The company may offer end-to-end encryption such that only the sender and the recipient possess the unencrypted communications. It may abide by a policy that communications are deleted immediately after the recipient has opened and viewed them. In both examples, the company’s lack of internal records leaves government agencies empty-handed, frustrated at their inability to obtain the evidence they need to search, arrest, or prosecute bad actors.
The going dark dilemma isn’t going away. If anything, it is being exacerbated by the proliferation of those company practices, as well as by the development of further technologies in response to a growing market demand for data privacy. That’s why in March 2021, in the aftermath of the January 6, 2021, insurrection, FBI Director Christopher A. Wray testified before a congressional committee and raised the same concern as his predecessor. This time around, the bureau made its demand clear and explicit—providers that manage encrypted data should be in a position to decrypt it and provide it to law enforcement in response to legal process. To do so, social messaging companies must either break their own encryption or, perhaps better yet, not offer it at all.
Isn’t that an incarnation of India’s IT Rules?
In more ways than not, the United States is nearing its own legal clash between social messaging companies and the government. A bill titled the Lawful Access to Encrypted Data Act, introduced in the House and Senate in 2020, would have required telecommunications carriers to decrypt communications if the means of encryption or other encoding had been implemented or facilitated by the carrier. The bill sought to amend the Communications Assistance for Law Enforcement Act (CALEA), which does not authorize any law enforcement agency to require any specific design features or to prohibit the adoption of any. Instead, CALEA currently provides that telecommunications carriers are not responsible for decrypting—or ensuring the government’s ability to decrypt—any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt it.
With end-to-end encryption, the recipient of the communication is generally the only party who holds the decryption key needed to decrypt the sender’s message. Under current law, therefore, social messaging companies like WhatsApp are not required to decrypt communications in compliance with a court order or to provide the government with a means of doing so. If the bill is reintroduced and enacted, however, providers would be required to render technical assistance that includes decrypting, decoding, or otherwise providing in an intelligible format the requested communications, regardless of whether the companies’ internal business practices even provide a means of doing so.
In that world, WhatsApp would be compelled to break its own end-to-end encryption or remove the feature altogether. That would cripple its brand identity.
The heightened emphasis on the going dark problem suggests that Congress is more inclined to adopt a fast and easy approach akin to that taken by India than to dedicate the time, effort, and resources to formulate a truly sustainable solution that properly balances the competing interests at stake. Bulldozing over social media companies with legislation that stifles innovation and arrests the progress of science and the useful arts will create more problems than it solves.
Such an approach would be entirely antithetical to the ideologies that underpin our democracy.