In the last article, we discussed how to figure out when to start switching to post-quantum algorithms. In the next few articles, we will discuss the available post-quantum techniques and how to transition to them. The first step is discussing cryptographic primitives. These are the low-level building blocks that form the basis for secure algorithms and protocols. Traditional cryptography uses asymmetric algorithms like RSA and ECC as the basis for key agreement, digital signatures, and authentication. As we've discussed, these cryptographic primitives are vulnerable to compromise by future quantum computers.
The process for selecting new quantum-safe cryptographic primitives is being led by the United States National Institute of Standards and Technology (NIST). Cryptographers from all over the world have collaborated in teams and have contributed various candidate algorithms for consideration. The process has been underway for about two years now and is expected to take about three more years to complete. As part of this effort, NIST recently hosted their Second PQC Standardization Conference in Santa Barbara, Calif., with over 250 cryptography experts in attendance.
The process started with 69 candidate algorithms, of which 26 remain in the competition. Some algorithms were withdrawn due to security concerns and findings, while others were determined to be inferior to similar algorithms that remain in the competition. Some teams even chose to merge and submit new candidates that combined the best aspects of their original independent submissions. NIST is still open to future mergers and intends to select multiple algorithms at the end of the process, instead of a single winner. This is because none of the candidates are drop-in replacements for RSA and ECC. The algorithms involve various trade-offs between key generation time, key sizes, signing speed, and signature sizes, and it is unlikely that one single algorithm is best for all possible use cases.
The selection process is divided up into multiple rounds. During Round I, the primary focus was whether the algorithms delivered the security properties that they claimed to deliver. That is still important in Round II, but in this round, the performance of each algorithm plays a larger role. Round II will last 12-18 months, possibly followed by a third round. The results of Round I were published in NIST IR 8240.
The Round II candidates are distributed as follows:
| Type | Signatures | Key Exchange / Encryption |
|---|---|---|
| Lattice-based | 3 | 9 |
| Code-based | 0 | 7 |
| Multi-variate | 4 | 0 |
| Symmetric-based | 2 | 0 |
| Other | 0 | 1 |
Here is a very brief summary of each category:
Lattice-based cryptography: A lattice is a mathematical term for an infinite, n-dimensional grid of points formed from all integer linear combinations of a set of basis vectors. The hard problem is typically finding the shortest non-zero vector in a given lattice. This is difficult to do without trying all possible combinations of the basis vectors.
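To make the "try all combinations" difficulty concrete, here is a toy brute-force search for the shortest non-zero vector in a two-dimensional lattice. The basis vectors are hypothetical example values; real lattice schemes work in hundreds of dimensions, where this exhaustive search becomes infeasible.

```python
from itertools import product

def shortest_vector(b1, b2, bound=50):
    """Brute-force the shortest non-zero vector c1*b1 + c2*b2
    with integer coefficients in [-bound, bound]."""
    best, best_len = None, float("inf")
    for c1, c2 in product(range(-bound, bound + 1), repeat=2):
        if c1 == 0 and c2 == 0:
            continue  # skip the zero vector
        v = (c1 * b1[0] + c2 * b2[0], c1 * b1[1] + c2 * b2[1])
        length = (v[0] ** 2 + v[1] ** 2) ** 0.5
        if length < best_len:
            best, best_len = v, length
    return best

# Hypothetical basis: both vectors are long, yet the lattice they
# generate contains a much shorter vector that is not obvious.
print(shortest_vector((201, 37), (1648, 297)))
```

Even in two dimensions, the shortest vector looks nothing like either basis vector; in high dimensions, no known classical or quantum algorithm finds it efficiently for well-chosen parameters.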
Code-based cryptography: These algorithms are based on error-correcting codes. An efficiently decodable code is hidden within a significantly larger, random-looking code, which is made public. The hard problem is recovering the efficient code hidden within the larger public code.
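As a reminder of what error correction means, here is a toy repetition code: each bit is sent three times, and the receiver takes a majority vote, so any single flipped bit per group is corrected. This is only an illustration of error-correcting codes, not of a code-based cryptosystem, which hides a far more efficient code behind a scrambled public one.

```python
def encode(bits, n=3):
    # Repetition code: transmit each bit n times.
    return [b for bit in bits for b in [bit] * n]

def decode(codeword, n=3):
    # Majority vote over each group of n repeated bits.
    return [1 if sum(codeword[i:i + n]) > n // 2 else 0
            for i in range(0, len(codeword), n)]

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[4] ^= 1                  # simulate one bit of channel noise
assert decode(sent) == msg    # the error is corrected
```

In a scheme like McEliece, the legitimate receiver can decode errors efficiently using the hidden code's structure, while an attacker holding only the public code faces a generic (hard) decoding problem.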
Multi-variate cryptography: These algorithms are based on solving systems of nonlinear polynomial equations in several variables over finite fields. These are essentially larger, nonlinear versions of the systems of equations studied in introductory algebra. Solving this problem in the general case is known to be NP-hard.
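A tiny, hypothetical example of such a system over GF(2) (arithmetic modulo 2) shows the idea. With three variables we can simply try all 2³ assignments, but the search space doubles with every added variable, and real schemes use dozens of variables.

```python
from itertools import product

# A hypothetical system of quadratic equations over GF(2):
#   x*y + z       = 1
#   x   + y*z     = 0
#   x*z + y + z   = 1
def satisfies(x, y, z):
    return ((x * y + z) % 2 == 1 and
            (x + y * z) % 2 == 0 and
            (x * z + y + z) % 2 == 1)

# Exhaustive search over all 2**3 assignments.
solutions = [v for v in product((0, 1), repeat=3) if satisfies(*v)]
print(solutions)  # → [(0, 0, 1)]
```

In a multivariate signature scheme, the signer holds a secret trapdoor structure that makes solving the public system easy, while everyone else faces the general NP-hard problem.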
Symmetric-based cryptography: These are based on ideas like zero-knowledge proofs built on top of traditional symmetric cryptographic algorithms, which are quantum-safe with large enough keys. There are security proofs showing that if the underlying symmetric algorithm is secure, then the asymmetric algorithm based on it is secure. Breaking classical symmetric algorithms is known to be a hard problem even for quantum computers.
Other: One candidate is based on supersingular isogenies, which is a very new asymmetric cryptographic technique based on abstract mathematical structures related to elliptic curves.
There is one additional category, hash-based signatures, which is not part of the competition because it is being standardized separately. The underlying hard problem is inverting a cryptographic hash function, which is believed to remain hard even for quantum computers. These schemes have advantages and disadvantages that we will cover in a future article; the main advantage is that they are well understood and can be deployed now. NIST is planning to publish a standard for their use by the end of 2019.
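The simplest hash-based scheme is a Lamport one-time signature, sketched below using SHA-256 from Python's standard library. This is a minimal illustration only: each key pair must sign exactly one message, and the schemes being standardized (stateful constructions such as XMSS and LMS) build many-time signatures out of large trees of such one-time keys.

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen(bits=256):
    # Private key: two random 32-byte preimages per message-digest bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32))
          for _ in range(bits)]
    # Public key: the hash of each preimage.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(msg: bytes, sk):
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(len(sk))]
    # Reveal one preimage per digest bit; the key must never be reused.
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(msg: bytes, sig, pk) -> bool:
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(len(pk))]
    return all(H(s) == pk[i][bit]
               for i, (s, bit) in enumerate(zip(sig, bits)))

sk, pk = keygen()
sig = sign(b"hello", sk)
assert verify(b"hello", sig, pk)
assert not verify(b"goodbye", sig, pk)
```

Note the sizes involved: the signature alone is 256 × 32 bytes (8 KB), which hints at why signature size is one of the trade-offs NIST is weighing.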
The meeting also included an industry panel that expressed widespread support for hybrid cryptography, both now and in the future. That or a similar technique is required to support the transition between algorithms and to avoid having a single point of failure that could potentially endanger cryptography again in the future. We'll discuss hybrid cryptography in the next article in the series.
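The core idea of hybrid key exchange is simple: run a classical exchange and a post-quantum exchange, then derive one session key from both secrets, so an attacker must break both. The sketch below shows one plausible way to combine the two secrets with an HKDF-style HMAC extraction; the two input secrets are stand-in placeholder values, not the output of real ECDH or KEM operations, and real protocols define their own exact combination rules.

```python
import hashlib
import hmac

def combine_secrets(classical_secret: bytes, pq_secret: bytes,
                    context: bytes = b"hybrid-kex-v1") -> bytes:
    """Derive a single session key from both shared secrets
    (HKDF-extract style). Recovering the key requires breaking
    BOTH underlying key exchanges."""
    ikm = classical_secret + pq_secret      # concatenate both secrets
    return hmac.new(context, ikm, hashlib.sha256).digest()

# Hypothetical stand-ins for the outputs of the two exchanges:
ecdh_secret = bytes(32)   # would come from a classical ECDH exchange
kem_secret = bytes(32)    # would come from a post-quantum KEM
session_key = combine_secrets(ecdh_secret, kem_secret)
print(len(session_key))   # 32-byte session key
```

Because the derivation mixes in both inputs, a future break of either primitive alone leaks nothing about the session key, which is exactly the "no single point of failure" property the panel was after.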