Quantum computing

Quantum computing history

By the end of the twentieth century, experimental physics had been transformed by the technological rise of classical computers. Among many breakthroughs, the ability to simulate the behavior of subatomic particles on a computer allowed scientists to gain deeper insight into the nature of various physical phenomena.

Despite the promise of these new processing systems, it soon became evident that the computational limits of classical computers would only permit the simulation of physical systems of modest complexity. In fact, the computational cost of accurately reproducing a system grows exponentially with the number of particles involved.

The original idea of the quantum computer—historically attributed to Nobel Prize–winning physicist Richard Feynman, who was deeply engaged in the study of matter and natural phenomena—emerged precisely in this context. After several discussions with computer scientist Edward Fredkin, a pioneer of digital physics, Feynman became convinced that the solution to these computational challenges lay in the very laws of quantum mechanics. He famously remarked:

“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical.”

Richard Feynman

Classical physics is concerned with the study of natural phenomena on a macroscopic scale, such as the relationship between mass and gravitational force, whereas quantum mechanics focuses on describing and understanding microscopic processes, like the interactions between subatomic particles.

What Feynman intended to emphasize with his statement is that physics and, more broadly, the sciences cannot avoid studying nature, and therefore matter, starting from the microscopic phenomena that determine it. Consequently, a computer designed to simulate these phenomena must be able to process information according to the same logic that governs them. Indeed, the logic governing the microscopic world differs fundamentally from the one we experience in everyday life, often leading to results that defy common intuition.

A striking example is the phenomenon of superposition, which, in simplified terms, can be thought of as the coexistence of multiple possible states for a given event. Take the toss of a coin: superposition can be imagined as the coexistence of intermediate combinations between the two possible outcomes—heads and tails—corresponding to all the configurations the coin assumes while spinning in the air.


In fact, while the coin is still in the air, we know that once it lands, it will show one of the two possible outcomes—but it is not yet clear which one. Therefore, both possibilities must be taken into account.

This principle lies at the core of the qubit, the unit of information processed by quantum computers. In nature, many microscopic phenomena can be described through the principle of superposition: certain properties of subatomic particles, such as their spatial position, energy level, and spin, are examples of this behavior.
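The coin analogy can be made slightly more concrete with a minimal numerical sketch (not part of the original article; the names `alpha`, `beta`, and the `measure` helper are illustrative). A qubit assigns an amplitude to each of its two basis states, and the squared magnitude of each amplitude gives the probability of observing that outcome once the superposition is measured, much like the spinning coin finally landing on one face:

```python
import math
import random

# A single qubit in an equal superposition of |0> and |1>: each basis state
# carries an amplitude, and the squared magnitude of an amplitude gives the
# probability of observing that outcome (the Born rule).
alpha = 1 / math.sqrt(2)  # amplitude of |0>
beta = 1 / math.sqrt(2)   # amplitude of |1>

def measure(alpha, beta, rng=random.random):
    """Collapse the superposition: return 0 or 1 with the Born-rule probabilities."""
    p0 = abs(alpha) ** 2
    return 0 if rng() < p0 else 1

# A valid quantum state's outcome probabilities must sum to 1.
assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1) < 1e-12
```

Repeated calls to `measure` with these amplitudes yield 0 and 1 roughly half the time each, which is the qubit analogue of a fair coin toss.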

To mention a medical implication, the physical process behind modern magnetic resonance imaging (MRI) relies on placing the spins of hydrogen protons within the human body into a state of superposition.

Since Feynman’s 1982 paper “Simulating Physics with Computers”, which introduced the quantum computer concept, the idea of harnessing quantum resources for computation has gradually extended to other fields of application—although it has largely remained at a theoretical level.

Thanks to their ability to encompass multiple possible configurations simultaneously, quantum phenomena have proven remarkably effective in modeling and solving combinatorial problems, in which the goal is to identify the optimal solution within a set of candidates. A well-known example in this context is the Travelling Salesman Problem (TSP).

Given a set of cities and a list of the distances between each pair of them, the TSP is defined as the task of finding the shortest possible route that visits each city exactly once and returns to the starting point. From a classical perspective, the computational cost of solving this problem grows exponentially with the number of cities involved, making the search for an optimal solution intractable on a large scale. By contrast, the combinatorial power of a quantum computer could significantly ease this challenge.
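The exponential blow-up described above is easy to see in code. The sketch below (an illustration added here, not taken from the article; the `tsp_brute_force` name and the toy distance matrix are hypothetical) exhaustively checks every tour, of which there are (n−1)! once the starting city is fixed, so adding even a few cities multiplies the work enormously:

```python
import itertools
import math

def tsp_brute_force(dist):
    """Exhaustively search all tours over n cities; the number of tours
    grows as (n-1)!, which is why this approach fails at scale."""
    n = len(dist)
    best_len, best_tour = math.inf, None
    # Fix city 0 as the start so rotations of the same tour are not recounted.
    for perm in itertools.permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

# Toy symmetric distance matrix for 4 cities (hypothetical values).
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
```

With 4 cities only 6 tours need checking; with 20 cities the same loop would face more than 10^17 of them, which is the intractability the paragraph above refers to.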

 

Telsy and quantum computing

In light of recent technological advances, Telsy’s research team is developing a special series dedicated to exploring the implications of quantum computing in the field of cryptography—an area that has long been a cornerstone of the company’s expertise.

The goal of this series is to provide readers with a foundational understanding of the quantum ecosystem, and this article aims to serve as its starting point. What follows is a brief introduction to the main topics that will be further explored in future installments.

Consequently, the terminology and some of the concepts presented here are not covered exhaustively and will be discussed in greater depth in upcoming articles.

 

The quantum threat

In the field of cryptography—and more specifically, cryptanalysis—the new computational model has found remarkably significant applications. In 1995, with his paper “Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer”, theoretical computer scientist Peter Shor introduced to the scientific community a quantum algorithm capable of solving the problems of integer factorization and discrete logarithms exponentially faster than any known classical technique.

The impact of Shor’s work stems from the threat it poses to modern asymmetric (or public-key) cryptography. Indeed, the RSA protocol for encryption and digital signatures derives its security from the assumption that the integer factorization problem is computationally infeasible to solve in a reasonable time using a classical computer. The same consideration applies to the Diffie–Hellman key exchange protocol, whose security relies on the classical intractability of the discrete logarithm problem. Even symmetric (or private-key) cryptography has not been immune to the rise of quantum algorithms.
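To make the factoring assumption tangible, here is a toy illustration (added for this article; the `trial_factor` helper is hypothetical). Naive trial division recovers the prime factors of a small modulus instantly, but its running time grows with the square root of the modulus, which is why it, and every known classical algorithm, becomes hopeless at the ~2048-bit sizes RSA uses in practice, while Shor’s algorithm would not:

```python
def trial_factor(n):
    """Naive classical factoring by trial division: instant for toy numbers,
    utterly infeasible for the ~2048-bit moduli used by RSA in practice."""
    if n % 2 == 0:
        return 2, n // 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    return None  # no factor found: n is prime

# A toy "RSA modulus": the product of the two small primes 53 and 61.
assert trial_factor(3233) == (53, 61)
```

Recovering the factors of a real RSA modulus this way would take longer than the age of the universe; recovering them with Shor’s algorithm on a sufficiently large quantum computer would take polynomial time.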

In 1996, computer scientist Lov Grover proposed a quantum algorithm for searching an unstructured set quadratically faster than any classical method. Applied to cryptanalysis, it allows an attacker to recover the secret key shared by communicating parties quadratically faster than a classical brute-force search.

The impact on symmetric cryptography is not considered destructive, as protection against Grover’s algorithm can be achieved simply by doubling the length of the encryption key. The situation is quite different for asymmetric cryptography. If the quantum computer envisioned by Feynman were ever to become a reality, the RSA and Diffie–Hellman protocols would no longer provide adequate security guarantees.
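The key-doubling remedy mentioned above follows from simple arithmetic, sketched here as an illustration (the `grover_effective_bits` name is ours, not standard terminology). Grover’s quadratic speedup turns a search over 2^n keys into roughly 2^(n/2) quantum steps, so the effective security level of an n-bit key is halved, and doubling the key length restores it:

```python
def grover_effective_bits(key_bits):
    """Grover's quadratic speedup: a brute-force search over 2**n keys takes
    roughly 2**(n/2) quantum steps, halving the effective security level."""
    return key_bits // 2

# A 128-bit key offers ~64 bits of security against a Grover-equipped
# adversary; doubling the key length to 256 bits restores the 128-bit level.
assert grover_effective_bits(128) == 64
assert grover_effective_bits(256) == 128
```

This is why, for instance, AES-256 is generally considered an adequate symmetric replacement for AES-128 in a post-quantum setting, whereas no comparable parameter tweak rescues RSA or Diffie–Hellman from Shor’s algorithm.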

An IBM quantum computer

This threat remained largely theoretical until the first decade of the new millennium, when major technology companies such as IBM, Google, and D-Wave began investing in the development of a quantum computer.

The efforts of these companies led to the creation of several small-scale prototypes, limited in both computational power and memory capacity. Although current prototypes have virtually no impact on real-world applications, they represent an encouraging starting point for future developments.

They also allow leading companies to outline more concrete technological pathways toward building a quantum computer with capabilities comparable to the one envisioned by Feynman. It remains unclear if and when this goal will be achieved, though the most optimistic estimates suggest it may take at least another decade. With this in mind, the cryptographic community cannot afford to be unprepared for the possibility that, in the medium to long term, Shor’s algorithm could be successfully implemented to break today’s public-key encryption schemes.

Moreover, these implications pose a critical challenge that cannot be ignored even when considering the protection of data today, particularly when such information must remain confidential for long periods of time. More concretely, an adversary seeking access to currently secure information could simply intercept and store encrypted data, with the intention of decrypting it in the future once quantum technology has advanced, a strategy often described as “store now, decrypt later.”

 

The cryptographic response

The cryptographic response to this threat is evolving along two main directions:

  • Post-Quantum Cryptography (PQC)
  • Quantum Key Distribution (QKD)

PQC represents the classical cryptographic community’s response to the advent of quantum computing. Its goal is to identify asymmetric cryptographic schemes whose security relies on mathematical problems believed to be resistant to quantum algorithms, and thus distinct from integer factorization and discrete logarithms. In this regard, the National Institute of Standards and Technology (NIST) is currently leading a process to standardize a new generation of cryptographic algorithms.

QKD, on the other hand, is a technology that leverages the principles of quantum mechanics to enable key exchange with unconditional security—based not on computational assumptions, but on the fundamental laws of physics. Although many aspects of this technology are still the subject of academic and industrial research, several QKD solutions have already reached the market with medium-to-high Technology Readiness Levels (TRLs).

To continue ensuring the highest security standards for its clients in the so-called Quantum Era, Telsy is pursuing both directions: on one hand, through its long-standing expertise in classical cryptography, and on the other, through its recent investment in QTI, a spin-off of the Italian National Research Council (CNR) in Florence specialized in quantum key distribution.

 


 


This article is part of a series of contributions by the Telsy Cryptography Research Group dedicated to quantum computing and its implications for cryptography. For other articles, please refer to the index of the section.


 

The author

Francesco Stocco holds a master’s degree in Mathematics from the University of Padua and the Université de Bordeaux, where he attended the “Algebra, Geometry And Number Theory” (ALGANT) program. He joined the Telsy cryptography research group at the end of 2020, focusing in particular on issues related to quantum technologies.