One of the most fundamental goals in public-key cryptography is the establishment of an authenticated private channel between two parties over some untrusted communication medium. Typically, this is handled by two distinct technologies: a public-key encryption scheme (PKE) used to establish shared keys between two parties for private transmission of data and a digital signature scheme to authenticate that one is communicating with the correct party. Once such a channel is established between two parties, further communication over the channel typically proceeds using a highly efficient private-key encryption algorithm, such as the Advanced Encryption Standard (AES) block cipher.
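To make this pattern concrete, below is a minimal sketch (in Python, assuming the widely used third-party cryptography package, which is an assumption of this illustration and not part of any standard discussed in this article) of such a hybrid channel: a pre-quantum key exchange establishes a shared secret, a symmetric key is derived from it, and AES protects the actual message. Authentication of the public keys, which a real protocol would provide with digital signatures and certificates, is omitted for brevity.

```python
# A minimal sketch, not a production protocol: establish a shared key with a
# pre-quantum key exchange (X25519), derive an AES key, and encrypt one message.
# Assumes the third-party "cryptography" package is installed.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair and exchanges public keys. (Authenticating
# those public keys, e.g., with digital signatures, is omitted here.)
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Both sides compute the same shared secret from their own private key and the
# other party's public key.
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared

# Derive a 128-bit AES key from the shared secret.
aes_key = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
               info=b"demo channel").derive(alice_shared)

# Fast private-key encryption of the actual traffic (AES in GCM mode).
nonce = os.urandom(12)
message = b"hello over the private channel"
ciphertext = AESGCM(aes_key).encrypt(nonce, message, None)
assert AESGCM(aes_key).decrypt(nonce, ciphertext, None) == message
```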
For nearly fifty years, in order to secure global telecommunications via such private, authenticated channels, we have relied on the original idea of the Diffie-Hellman key exchange1 along with influential cryptosystems such as RSA, developed by Rivest, Shamir, and Adleman in the 1970s, and the mathematical concepts underlying them.
The Quantum Threat
The security of these traditional cryptosystems relies on the computational difficulty of factoring large integers and finding discrete logarithms. Yet in 1994, Peter Shor demonstrated a breakthrough algorithm for a hypothetical type of computer—a quantum computer—that would quickly solve these mathematical problems.2 While private-key communication, e.g., based on AES, would remain mostly unaffected, the security and utility of public-key cryptography would be entirely lost. Fortunately for those deploying commercial applications of public-key cryptography over the internet, building such a machine was beyond the engineering capabilities of the day. In recent years, however, research on quantum computers has advanced substantially.
As things currently stand, a number of small- and medium-scale quantum computers have been built by industry, and some have been reported to perform computations that no classical supercomputer could complete in many human lifetimes—a step known as achieving “quantum supremacy.” (This claim was first made, notably, by scientists at Google in an article published in 2019.)3 If quantum computers are ever built at the scale that would enable Shor’s algorithm (or other, similar quantum algorithms) to be run, these devices would gravely undermine the integrity and confidentiality of our current communications infrastructure on the internet and elsewhere.
While large-scale, cryptographically relevant quantum computers do not yet exist, the threat is already pressing. One of the worst-case scenarios involves a so-called store now, decrypt later adversary: an attacker who intercepts and stores, potentially for many years, encrypted communications sent under traditional, pre-quantum cryptography, waiting for the day that a sufficiently large quantum computer emerges so that they can break all of the communications in their gathered database of ciphertexts. The best defense against this scenario is the rapid deployment of post-quantum cryptography, carefully studied by security analysts and benchmarked by engineering researchers to ensure its suitability for the commercial internet and other communication platforms.
A Brave, New, Post-quantum World
We stand today on the brink of a massive evolutionary step in modern communications: the imminent transition from traditional cryptography to so-called post-quantum cryptography. The goal of post-quantum cryptography is to develop cryptographic systems that are secure against both quantum and classical computers and can interoperate with existing communication protocols and networks. To facilitate this transition, the U.S. National Institute of Standards and Technology (NIST) initiated a worldwide process to solicit, evaluate, and standardize one or more quantum-resistant public-key cryptographic algorithms, with candidate submissions due in late 2017. NIST received sixty-nine complete submissions of post-quantum key exchange mechanisms and digital signature schemes from around the world. In early 2019, this list of candidates was cut to twenty-six: seventeen key exchange mechanisms and nine digital signature algorithms.
A third round of evaluation then kicked off in the summer of 2020, focusing on a small list of four encryption finalists, three digital signature finalists, and a remaining pool of five encryption alternates and three digital signature alternates. This third round of NIST’s post-quantum cryptography standardization process is quickly coming to a conclusion (and—indeed—by the publication date of this article, is likely to have completed). At the end of the third round, it is expected that NIST will announce its initial selections for quantum-resistant encryption and digital signature standards.
Following this announcement, a small handful of alternate encryption schemes will proceed into a fourth round for later standardization decisions. NIST has also announced that it plans to initiate a parallel “on-ramp” procedure in the near future to solicit, evaluate, and standardize additional post-quantum digital signature algorithms. Concurrently, NIST will begin drafting the formal standards documents for the initial post-quantum algorithms, and these documents are expected to be finalized sometime in 2023 or 2024. At that point, large-scale deployment of the new standards into the underlying software libraries used across the internet and other communication systems will commence in earnest.
How the Selection of Post-quantum Algorithms Was Made
In NIST’s original call for proposals for post-quantum cryptographic algorithms (and reiterated in various talks and public statements throughout the years-long process), a list of evaluation criteria was given. The primary selection criteria—in order of priority—were (1) the cryptanalytic security of the candidate algorithms, (2) the computational performance of the candidate algorithms, (3) the algorithm and implementation characteristics of the candidates, and (4) any other relevant information outside of the first three categories.
In terms of security, NIST examined which levels of security each candidate’s portfolio offered. NIST defined five levels of security, known as Category 1 through Category 5, by analogy to the security levels offered by different variants of AES or cryptographic hash functions such as SHA2 or SHA3. For example, any scheme achieving Category 1 security should require an attacker to spend computational resources comparable to or greater than those required for a key recovery attack on the AES block cipher with a 128-bit key. At the highest level, schemes achieving Category 5 security should be roughly as secure as AES with a 256-bit key. For the audience, this means that computationally recovering the secret key of a Category 5 scheme should be expected to require about as many bitwise operations to be performed on the attacker’s computer(s) as there are atoms in the observable universe, even considering the potential advantages of future, large-scale quantum computation.
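For a rough sense of these magnitudes, the short Python calculation below (our own illustration, using the commonly cited estimate of roughly 10^80 atoms in the observable universe) compares the brute-force work implied by Category 1 and Category 5 security.

```python
# Back-of-the-envelope magnitudes behind the security categories (illustrative only).
aes128_work = 2 ** 128        # roughly 3.4e38 operations to brute-force a 128-bit key
aes256_work = 2 ** 256        # roughly 1.2e77 operations to brute-force a 256-bit key
atoms_in_universe = 10 ** 80  # commonly cited rough estimate

print(f"Category 1 target: ~{aes128_work:.2e} operations")
print(f"Category 5 target: ~{aes256_work:.2e} operations")
print(f"Atoms in the observable universe: ~{atoms_in_universe:.2e}")
```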
The way security was evaluated inherently relied on the international, scientifically open nature of the NIST standardization process. In particular, the most effective way to estimate the expected number of computational steps required to “crack” a given candidate is to consider the best known cryptanalytic algorithm for attacking the scheme (recovering the private key from the public key or recovering the hidden message from a ciphertext). And the most effective way for NIST to learn what the best known attacks actually are is to solicit them from the global research community in an adversarial, competitive environment, where multiple teams are vying to have their own work selected as the standard.
In terms of performance, NIST examined the size of the public keys, secret keys, ciphertexts, and digital signatures produced by the candidate cryptographic schemes. A matter of some concern here is that many post-quantum cryptosystems must generate or communicate cryptographic material that is larger than that of traditional pre-quantum systems like RSA. For example, an RSA ciphertext used to establish a private key of 128 bits may be around 300 bytes. However, many proposed post-quantum encryption schemes would establish a 128-bit private key by sending ciphertext material as large as 800 bytes, 1,000 bytes, or more. The case of signatures was often worse: whereas pre-quantum digital signatures would again be only a few hundred bytes, many of the promising post-quantum digital signature schemes required a signature on the order of 2,000 bytes, 20,000 bytes, or larger. An IPv6 packet (an example of the minimum unit of information sent in one consecutive chunk across the internet) is only guaranteed to travel unfragmented if it is no larger than 1280 bytes, and so these increased sizes brought into question the suitability of certain candidates in terms of their ability to seamlessly interoperate with existing communication protocols and networks as a “drop-in replacement.”
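To see why these sizes matter for interoperability, the short Python check below compares the approximate sizes quoted above (illustrative estimates from this discussion, not benchmark measurements) against the 1280-byte unfragmented IPv6 packet.

```python
# Rough "does it fit in one packet?" check using the approximate sizes quoted
# above (illustrative figures only, not benchmark measurements).
IPV6_MIN_MTU = 1280  # bytes guaranteed to traverse IPv6 without fragmentation

approx_sizes_in_bytes = {
    "RSA ciphertext (pre-quantum)": 300,
    "post-quantum ciphertext (typical proposal)": 1000,
    "pre-quantum digital signature": 400,
    "post-quantum signature (compact scheme)": 2000,
    "post-quantum signature (large scheme)": 20000,
}

for name, size in approx_sizes_in_bytes.items():
    verdict = "fits in one packet" if size <= IPV6_MIN_MTU else "must be split across packets"
    print(f"{name}: ~{size} bytes -> {verdict}")
```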
NIST further carefully benchmarked the speed—in terms of the clock cycles and memory required on typical computing devices—of the various candidate algorithms. The crucial questions here involve the suitability of the new cryptographic candidates for use in real-world applications, based on the running time of their key generation algorithms, their encryption and decryption algorithms, or their signing and signature verification algorithms. Along with the size criteria, this led to a partitioning of the candidates into two main buckets: (1) balanced and fast or (2) amortized.
As an example, consider the case of encryption algorithms. Two interesting finalists to consider here are CRYSTALS-Kyber (a “balanced and fast” lattice-based encryption finalist) and Classic McEliece (an “amortized” code-based encryption finalist). Whereas CRYSTALS-Kyber at Category 1 has a public key and ciphertexts of around 800 bytes each and key generation, encryption, and decryption algorithms running well under 1 millisecond, Classic McEliece has a public key of roughly a quarter of a megabyte (a couple of orders of magnitude larger) and much slower key generation, but much smaller ciphertexts (128 bytes) and extremely fast encryption and decryption.
Clearly, these two candidates have different real-world settings in which one will be better than the other, and vice versa. For example, CRYSTALS-Kyber might be more useful for an “ephemeral” private channel, such as when your browser establishes a secure connection to a favorite website that requires a password login. On the other hand, Classic McEliece shines in a scenario where a long-term public key can be included in the installation package of some software that requires cryptographically secure messaging, like a virtual private network (VPN) application that connects your home computer to your business’s internal network.
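For readers curious what using such a scheme looks like in code, the sketch below exercises the three steps of a key encapsulation mechanism (key generation, encapsulation, and decapsulation) and prints rough sizes and timings. It assumes the open-source liboqs-python bindings and the “Kyber512” parameter-set name; the module, its API, and the algorithm naming are assumptions of this example and may differ by library version.

```python
# A minimal sketch of the KEM workflow (key generation, encapsulation,
# decapsulation), with rough timings. Assumes the open-source liboqs-python
# bindings ("oqs" module); names and APIs may differ by version.
import time

import oqs

ALG = "Kyber512"  # a Category 1 parameter set of the CRYSTALS-Kyber finalist

with oqs.KeyEncapsulation(ALG) as receiver, oqs.KeyEncapsulation(ALG) as sender:
    t0 = time.perf_counter()
    public_key = receiver.generate_keypair()          # receiver publishes this
    t1 = time.perf_counter()
    ciphertext, sender_secret = sender.encap_secret(public_key)
    t2 = time.perf_counter()
    receiver_secret = receiver.decap_secret(ciphertext)
    t3 = time.perf_counter()

    assert sender_secret == receiver_secret           # both sides now share a key
    print(f"public key: {len(public_key)} bytes, ciphertext: {len(ciphertext)} bytes")
    print(f"keygen {1e3 * (t1 - t0):.2f} ms, encaps {1e3 * (t2 - t1):.2f} ms, "
          f"decaps {1e3 * (t3 - t2):.2f} ms")
```

Swapping the algorithm name for a Classic McEliece parameter set (if the installed library provides one) would illustrate the trade-off described above: a far larger public key and slower key generation, but a much smaller ciphertext.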
Another important criterion for NIST was to rely on the conjectured intractability of as many distinct “computational hardness assumptions” as possible. In particular, multiple categories of cryptographic systems are currently under consideration by NIST: lattice-based encryption and signatures, code-based encryption, isogeny-based encryption, multivariate-based signatures, and other signature designs based only on symmetric-key cryptography.
Finally—and perhaps most interesting to the current audience—is the evaluation criterion regarding the “algorithm and implementation characteristics” of a given candidate. Alongside straightforward criteria like the simplicity and clarity of the algorithms’ documentation, two especially interesting factors are (1) the so-called side-channel resistance of the various candidate algorithms and (2) any intellectual property or patent issues attached to the candidates. The former is discussed below in the context of the upcoming, real-world transition to deployed post-quantum cryptography, and the latter is discussed afterward to conclude.
The Post-quantum Transition
We are now embarking on a critical period in modern cryptography, often referred to as the post-quantum transition. With the upcoming (or just-recent!) announcement by NIST of initial post-quantum cryptography standards, the gears of industry and government have begun to churn in order to ensure our digital communication platforms are resilient and secure against the near-term advent of large-scale quantum computers. The “Go” button has been pushed, if you will.
While the candidate algorithms have been scientifically examined and benchmarked for their security and performance aspects over five or six years during NIST’s evaluation period, their integration into the real world—into live software running on real systems used by millions or billions of people—is the next great challenge.
A crucial goal during this transitionary period is to ensure that these new post-quantum algorithms, when deployed, provide an absolutely minimal level of disruption to the operation of the commercial internet and other commercial communication platforms. That is, the end users of Google’s search engine or Microsoft Windows, Facebook, Reddit, Zoom, DoorDash, or an iPhone or an Android cell phone should not notice the difference. In fact, all devices and services should simply receive a small patch—automatically downloaded, perhaps overnight, likely as the end users are asleep in between one workday and the next—and everything should seem to operate the next day just as it always has before.
This is not a simple task. In fact, achieving this ideal goal will require the concerted efforts of a legion of software developers, security analysts, hardware engineers, and senior IT executives across industry as well as numerous collaborations spanning the various strata of government—not only within the United States, at the local, state, and federal level, but across every region and nation state of the world.
As a first-order issue, all of the so-called reference implementations (human-readable source code that can be compiled to executable machine code and run on a system) that have passed through the NIST evaluation process are written in the C programming language. This is a good approximation and model of the live code that will soon be deployed, but it is not the entire story. The mathematical specifications of the new post-quantum standards will necessarily be reimplemented in many other, diverse programming languages for innumerable applications, translated and written by human experts. Ensuring that each of these new implementations is free of security vulnerabilities is no small problem.
Further, as highly optimized hardware designs—that is, algorithms burned into physical silicon—are being completed for the new post-quantum standards, it will become possible in particular for adversaries to explore concrete side-channel attacks against these specific hardware implementations. A side-channel adversary may go beyond the typical notion of a passive eavesdropper on (say) internet traffic, and instead actively interfere with the operation of a given cryptographic device to try to extract useful, secret information. One example, among many, would be an adversary that injects faults into the hardware during the execution of a cryptographic algorithm (using a power surge, or some other mechanism) in the hope that the device will mistakenly output information that reveals the secret key embedded in the device.
Even more simply, an adversary may gain physical access to a cryptographic device itself, e.g., some hardware security module (HSM) or even a smart card (used by an honest user to scan themselves through a front door of a business’s building, or similar). Given the ability to physically manipulate the device, the attacker may attach an oscilloscope to the pins or ports of the device and then study the electronic power consumption of the device while it honestly performs its cryptographic behavior. From this information, a sufficiently sophisticated attacker may further try to infer useful information about the secret key material contained in the device.
While many generic techniques exist to produce implementations of cryptographic algorithms that resist such “side-channel” analyses, the majority of these techniques have been designed over the decades with prior, pre-quantum algorithms in mind. Indeed, even some twenty years after NIST standardized AES for private-key communication, there remains a significant, ongoing research effort to attack and defend AES against such analyses, including timing attacks, power analysis attacks, fault injection attacks, acoustic attacks, and beyond.
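As a small, generic illustration of what a timing side channel looks like in code (our own example, not tied to any particular candidate algorithm), compare a naive check of a secret value with a constant-time one: the naive version returns as soon as it sees a mismatching byte, so its running time leaks how much of an attacker’s guess was correct.

```python
# Illustration of a classic timing side channel and one generic defense.
import hmac

def leaky_equal(secret: bytes, guess: bytes) -> bool:
    # Returns as soon as a byte differs, so the running time reveals how many
    # leading bytes of the attacker's guess were correct.
    if len(secret) != len(guess):
        return False
    for x, y in zip(secret, guess):
        if x != y:
            return False
    return True

def constant_time_equal(secret: bytes, guess: bytes) -> bool:
    # Compares every byte regardless of where the first mismatch occurs,
    # so the running time does not depend on the secret's contents.
    return hmac.compare_digest(secret, guess)
```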
One should expect that side-channel analyses, attacks, and defenses will continue for the new, post-quantum, standardized algorithms for many years and many decades to come.
Life at the Intersection of Modern Cryptography, Public Research, and International Law
To conclude the discussion here, we briefly (and very gently) discuss the state of international intellectual property and patent law with respect to the end of NIST’s post-quantum standardization process. The environment in which the candidate algorithms submitted to the NIST post-quantum standardization process in late 2017 were developed was substantially different from the environment in which the standard, pre-quantum algorithms, such as RSA, were developed in the 1970s. The issues involved are vast and still unfolding, so any brief treatment will necessarily simplify, but it should nonetheless be possible to say something helpful.
As far as we are aware, public research into modern cryptography only truly began with the International Cryptology Conference (CRYPTO) in the early 1980s. (The 1970s saw some important exceptions: e.g., the work of Diffie and Hellman; Merkle; Rivest, Shamir, and Adleman; and others.) Prior to this now-annual event in beautiful Santa Barbara, California, the overwhelming portion of cryptologic research occurred within the confines of national governments and their trusted business contractors. It wasn’t until 1996 that the ePrint Archive (https://eprint.iacr.org) was established to provide rapid access to recent research in cryptology from the public academic community.
Since then, however, many universities—their professors and graduate students—from around the world have engaged heavily in cryptologic research. Innumerable encryption schemes, digital signature schemes, and a wide variety of other cryptographic systems have since been designed and vetted through the scientific peer review process. In short, the earlier, secret art of cryptography has transitioned into a modern, public science of cryptography.
This private-to-public transition in the science of cryptology has taken time to influence the international standardization process for cryptographic algorithms, but its effect now seems to have fully arrived. In particular, every sufficiently technically significant algorithm developed since that time is no longer just the work of one person or one group but is instead the culmination of a sequence of distributed research efforts by multiple, disparate groups across multiple points in time and multiple geographic locations. The study of strong and provably secure cryptography has become a truly global matter.
The question that arises, then, is how to adopt an international standard for cryptographic algorithm deployment when the best possible choices are the product—yes—of a single team of cryptographers but also rely on individual, peer-reviewed, published, or patented insights from many other researchers over the years. Within the context of the NIST PQC process, this question has been heavily discussed on the public PQC-Forum hosted by NIST. While it seems likely, in this particular case, that all of the interested parties will come together to jointly bless a single, standardized path forward, it is not at all clear that this will necessarily occur again in the future (at least, not without additional, great care in future standards processes).
While the NIST post-quantum standardization process began by asking each submission team to declare any relevant patents that they held on the algorithms they were submitting, the process was agnostic to various other potential patent threats. A first example is whether one submission team held patents (or was aware of patents) that applied to its competitors in the standardization process. A second concern is whether external third parties privately held patents that applied to critical submissions to the NIST process. A third issue is how a cryptographic algorithm standardization process should proceed when relevant patents are known, but the patent holder declines to provide an immediate fair, reasonable, and nondiscriminatory (FRAND) license for end users of the final standard.
Often in such proceedings, a given patent clearly and indisputably does not read on the algorithms submitted to the standardization process, but the patent holder nonetheless threatens to sue end users (vendors in industry or government, nationally or internationally) who implement and deploy the final standard. Regardless of whether the top scientific experts in the field agree on the applicability of the patent to some potential standard (academics do not have legal training!), the simple threat of legal action based on intellectual property can significantly curtail industry’s adoption of the standard.
An interesting question for future NIST processes is whether—and to what degree—to pursue comprehensive patent discovery (that is, finding all potential patents reading on any candidates in future standards processes) and whether to pursue—much earlier in the standards process—declaratory judgment actions against such threats. Typically, such declaratory judgments, particularly if they are to be useful in every jurisdiction, may take many years to complete and represent a large-scale effort that is mostly legal in nature. This would be different in flavor from most NIST standardization processes, historically speaking, which have primarily focused on the most effective scientific outcomes alone. Moreover, intellectual property rights issues have burdened the adoption of good, strong cryptography well beyond the scope of NIST for many decades.
What should your organization do with regard to such legal challenges in the realm of cryptography? Likely, the best answer is “do nothing” (at least, as far as explicit legal action is concerned). However, it will be incredibly valuable to pay attention to such discussions where you can find them, and to see where things land in the end. We hope—for all of the attorneys in the audience, whether in industry or in local, state, or federal government—that this exposition has been a useful primer on the deep complexity (and high importance!) of the issues involved in the worldwide transition to post-quantum cryptography in the very near future.
Endnotes
1. Whitfield Diffie & Martin Hellman, New Directions in Cryptography, 22 IEEE Transactions on Info. Theory 644 (Nov. 1976).
2. P.W. Shor, Algorithms for Quantum Computation: Discrete Logarithms and Factoring, Proc. 35th Annual Symp. on Founds. of Computer Sci. 124 (IEEE, Comp. Soc. Press, 1994).
3. Frank Arute et al., Quantum Supremacy Using a Programmable Superconducting Processor, 574 Nature 505 (2019).