Post-Quantum Cryptography (PQC): a classical solution to the quantum threat
Introduction
Public key cryptography (or asymmetric cryptography) is one of the foundations on which modern secure communication protocols are built. Starting from an insecure channel, these protocols must achieve multiple security properties, such as confidentiality and integrity of information. This area involves various cryptographic techniques, including key negotiation algorithms and digital signature algorithms, which play an important role both in terms of technological maturity and of adoption in widely used protocols such as TLS, IPsec, or WireGuard.
Asymmetric cryptographic techniques base their security on the presumed intractability of certain mathematical problems: problems for which no computationally efficient solving algorithm is known, so that even the most powerful computing systems available today would require an impractical amount of time (thousands of years) to find a solution. Classic examples are the integer factorization problem and the discrete logarithm problem, the former underlying the RSA cryptosystem and the latter the Diffie-Hellman protocol. These two problems underlie essentially all standardized asymmetric algorithms, and although they have withstood decades of cryptanalysis, recent developments in the field of quantum computing could pose a real threat. In particular, if a suitable quantum computer were built, Shor’s quantum algorithm (1994) could solve both problems in a reasonable amount of time, undermining the security of all public key cryptography currently in use. Although the medium- to long-term feasibility of Shor’s algorithm remains uncertain, given the significant technological challenges posed by the development of quantum computers, in 2015 the US National Security Agency (NSA) publicly stated the need for a transition plan to quantum-resistant cryptographic algorithms, i.e., algorithms that are resistant to attacks by quantum computers. Even though it is still difficult to estimate when quantum computers could pose a concrete threat to public key cryptography, it is desirable that the transition be completed as soon as possible, in order to minimize long-term risks to information security. Indeed, already today an attacker could adopt a “store now, decrypt later” strategy, intercepting and storing encrypted traffic with the aim of using a quantum computer in the future to decrypt the data.
At the same time, the transition to new cryptographic techniques will itself take considerable time, and in this case it must be completed before a suitable quantum computer is built; otherwise, the information protected today will be left exposed.
The main quantum-resistant techniques addressed in this scenario are Quantum Key Distribution (QKD) and Post-Quantum Cryptography (PQC). The first is a physical and infrastructural response to the problem of key exchange over an insecure channel: using the principles of quantum physics, it is possible to establish a protocol with unconditional security that allows two parties to negotiate a cryptographic key with which to subsequently protect their communications. This technique is not based on the assumption that certain mathematical problems are intractable, but has its roots in the laws of physics.
Post-quantum cryptography, on the other hand, is a classical response to the advent of quantum computing. PQC deals with the design of public-key cryptographic schemes that can be implemented on classical computers and are also resistant to quantum attacks. Consequently, post-quantum cryptographic techniques base their security on mathematical problems for which no efficient algorithms are known, whether classical or quantum. For example, RSA and Diffie-Hellman, being based on problems that can be solved with Shor’s quantum algorithm, cannot be considered post-quantum.
The focus of this article will be an introduction to PQC, a topic that will be further explored in subsequent articles focusing more on certain technical aspects of the subject.
The NIST Competition
The development of post-quantum cryptography received a significant boost in 2017, when the US National Institute of Standards and Technology (NIST) launched a standardization process to define and analyze new quantum-resistant public key cryptography schemes. In particular, the two categories of cryptographic techniques subject to standardization are Key-Encapsulation Mechanisms (KEMs) and digital signatures. As far as digital signatures are concerned, the adoption of post-quantum alternatives would fit transparently into existing protocols without changing their high-level structure. The situation is slightly different for KEMs: at the current state of the art, there is no practical post-quantum alternative to the Diffie-Hellman protocol that allows for “non-interactive” key exchange. Therefore, a technique called encapsulation, linked to public key encryption, is used for key negotiation, sometimes causing changes to the logical structure underlying current protocols.
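The three operations a KEM exposes can be sketched concretely. The following toy is illustrative only: it uses classical finite-field Diffie-Hellman (a standard published group) as a stand-in, so it is emphatically not post-quantum; a real post-quantum KEM such as Kyber exposes the same keygen/encapsulate/decapsulate interface but bases it on lattice problems.

```python
# Schematic KEM interface: keygen / encaps / decaps.
# NOTE: toy illustration built on classical Diffie-Hellman, NOT post-quantum.
import os
import hashlib

# Published safe-prime group (RFC 3526, 2048-bit MODP group 14), generator 2.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD1"
    "29024E088A67CC74020BBEA63B139B22514A08798E3404DD"
    "EF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245"
    "E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3D"
    "C2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F"
    "83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA18217C32905E462E36CE3B"
    "E39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9"
    "DE2BCBF6955817183995497CEA956AE515D2261898FA0510"
    "15728E5A8AACAA68FFFFFFFFFFFFFFFF", 16)
G = 2

def keygen():
    """Receiver: generate a key pair once, publish pk."""
    sk = int.from_bytes(os.urandom(32), "big")
    return pow(G, sk, P), sk          # (pk, sk)

def encaps(pk):
    """Sender: from pk alone, derive a shared secret and a ciphertext."""
    r = int.from_bytes(os.urandom(32), "big")
    ct = pow(G, r, P)                 # the "encapsulation" sent on the wire
    ss = hashlib.sha256(str(pow(pk, r, P)).encode()).digest()
    return ct, ss

def decaps(ct, sk):
    """Receiver: recover the same shared secret from the ciphertext."""
    return hashlib.sha256(str(pow(ct, sk, P)).encode()).digest()

pk, sk = keygen()
ct, ss_sender = encaps(pk)
assert decaps(ct, sk) == ss_sender    # both sides now share a 32-byte key
```

Note how, unlike non-interactive Diffie-Hellman, the sender must wait for the receiver's public key and then ship a ciphertext back: it is exactly this extra message that can force changes to the logical structure of existing protocols.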
The NIST competition received 64 proposals in total, comprising 45 KEMs and 19 digital signature schemes. Following the first three rounds of study and screening, on July 5, 2022, NIST announced the winners of the first phase of standardization, one KEM (CRYSTALS-Kyber) and three digital signatures (CRYSTALS-Dilithium, Falcon, and SPHINCS+), together with four KEMs (BIKE, Classic McEliece, HQC, and SIKE) subject to further evaluation in a fourth round. The mathematical problems considered by the third-round candidates were of various kinds, based on structures such as algebraic lattices, linear codes, systems of multivariate equations, isogenies of supersingular elliptic curves, and symmetric primitives. These problems are a priori independent of each other, so that a critical weakness in one does not imply security problems for the others. The families of problems are summarized in the table below:
| Family of PQC Problems | Selection of KEMs | Selection of Digital Signatures |
| --- | --- | --- |
| Lattice-based | Kyber, NTRU, Saber | Dilithium, Falcon |
| Code-based | Classic McEliece, BIKE, HQC | |
| Multivariate-based | | Rainbow |
| Symmetric-based | | SPHINCS+, Picnic |
| Isogeny-based | SIKE | |
Although the competition was scheduled to close in March 2022, the final outcome was announced a few months late due to bureaucratic complications, as stated by NIST. From a technical point of view, it is also worth noting that earlier in 2022 a practical attack was published against the Rainbow digital signature scheme, one of the third-round finalists, based on multivariate cryptography. This led to greater caution on NIST’s part and reinforced its intention, already expressed because of the lack of diversity among the remaining candidates, to reopen the competition in the digital signature category. Indeed, two of the three standardized signature algorithms are lattice-based, offering little variety in the schemes available for adoption. From a security standpoint, and given how novel the proposed solutions are, it is good to have schemes resting on cryptographic assumptions that are as varied as possible.
The Challenges of Transition
The ultimate goal in developing PQC is to replace classical algorithms within secure communication protocols, minimizing the impact of the transition to quantum-resistant solutions and potentially limiting the need for intervention at the protocol, network, and system levels. On the other hand, it is important to note that the process of transitioning from the current public key cryptography infrastructure to the new standards cannot be limited to a simple replacement of algorithms, and that the challenges introduced by the quantum threat will necessarily affect the functioning of the various levels that contribute to the implementation of the security infrastructure.
Despite their common features and interfaces, PQC algorithms are extremely heterogeneous, differing in key, signature, and ciphertext sizes, as well as in computational and communication costs. For this reason, among others, NIST has stated that the competition will most likely lead to the standardization of more than one proposal, in order to cover a wide variety of requirements and use cases.
Increased computing resource requirements are a key issue in the adoption of cryptographic schemes by industry. Simply put, higher computational costs translate into higher costs in terms of time and energy consumption. These costs are directly measurable and sometimes represent an insurmountable limitation for devices with high traffic requirements or limited resources (e.g., IoT, embedded). Therefore, numerous efforts are focused on developing optimizations at the algorithm, software, and hardware implementation levels to meet the requirements of increasingly faster network connection protocols.
A seemingly less pressing issue, but one with equally significant implications in real-world scenarios, concerns the systematic increase in key and signature sizes in PQC proposals (see graphs below). Correctly assessing the impact in this case is more difficult, and we expect it to mainly affect the network level. For example, larger key sizes will likely affect the transmission of IP packets during the execution of secure communication protocols, introducing greater latency and higher bandwidth requirements. In some cases, knock-on effects may arise from intermediate network devices (e.g., routers, firewalls) that cannot handle the sizes of the new standards, leading to increased fragmentation and packet loss.


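A back-of-the-envelope calculation gives a feel for the fragmentation issue. The sizes below are approximate public parameter sizes for a few classical and post-quantum schemes, and the transport model (a 1500-byte Ethernet MTU carrying plain IPv4 + UDP, with no options or extensions) is a deliberate simplification:

```python
# Rough estimate of how many packets each cryptographic object needs,
# assuming a 1500-byte MTU and bare IPv4 + UDP headers (a simplification).
import math

MTU = 1500
OVERHEAD = 20 + 8            # IPv4 header + UDP header
PAYLOAD = MTU - OVERHEAD     # bytes of application data per packet

sizes_bytes = {
    "X25519 public key":       32,    # classical
    "Ed25519 signature":       64,    # classical
    "Kyber-768 public key":    1184,  # post-quantum KEM
    "Kyber-768 ciphertext":    1088,
    "Falcon-512 signature":    666,   # post-quantum signatures
    "Dilithium2 signature":    2420,
    "SPHINCS+-128s signature": 7856,
}

for name, size in sizes_bytes.items():
    packets = math.ceil(size / PAYLOAD)
    print(f"{name:26s} {size:6d} B -> {packets} packet(s)")
```

Classical objects fit comfortably in a single packet, while several post-quantum artifacts spill over into multiple packets, which is exactly where the latency, fragmentation, and middlebox-compatibility concerns discussed above originate.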
The transition to post-quantum algorithms is a complex but extremely topical challenge, requiring work on multiple levels and joint commitment from the worlds of research and industry. The field of security is constantly evolving, and the definition of new cryptographic algorithms is only one piece of the puzzle in the industry’s transition to quantum-resistant solutions. Due to the widespread use of cryptography, particularly for internet communications, any migration project necessarily involves a large part of the systems we use on a daily basis. In this regard, we have already seen in the past how similar processes, albeit typically less structural (e.g., transition from 3DES to AES, deprecation of MD5), have taken at least a decade to complete. In conclusion, it is essential that many of the transition processes start today, in order to anticipate the challenges and allow for rapid adoption of the new standards by industry.
This article is part of a series of contributions by the Telsy Cryptography Research Group dedicated to quantum computing and its implications for cryptography. For other articles, please refer to the index of the section.
The authors
Edoardo Signorini graduated in Mathematics with a specialization in cryptography at the University of Trento and in 2020 completed his Master’s thesis in the field of Post-Quantum Cryptography (PQC) at Telsy. He currently works as a cryptographer in the Telsy research group and is pursuing an industrial research doctorate in Pure and Applied Mathematics at the Politecnico di Torino. His work focuses on PQC research and on the analysis and development of cryptographic protocols.
Giuseppe D’Alconzo graduated in Mathematics with a specialization in Cryptography at the University of Trento and in 2019 completed an internship at Telsy, working on Multiparty Computation and Attribute-Based Encryption. He is currently a PhD student in Pure and Applied Mathematics at the Politecnico di Torino, with a “Post-Quantum Cryptography” scholarship within the UniversiTIM program, in collaboration with the Telsy Research Group.