At the end of the last century, the field of experimental physics was revolutionized by the technological development of classical computers. Among the many innovations, the ability to simulate the behavior of subatomic particles on a computer deepened scientists' understanding of the nature of certain physical phenomena.
Despite the promising impact of these new processing systems, it soon became clear that their computing limitations would allow only the simulation of physical systems of limited complexity. In fact, the computational cost of faithfully reproducing a system grows exponentially with the number of particles involved.
It is in this context that the original theorization of quantum computing took place, historically attributed to the Nobel laureate in Physics Richard Feynman, who was engaged in the study of matter and natural phenomena.
This article was written by Francesco Stocco, who holds a master's degree in Mathematics from the University of Padua and the Université de Bordeaux, obtained through the "Algebra Geometry And Number Theory" (ALGANT) program. He joined Telsy's cryptography research group at the end of 2020, focusing in particular on issues related to quantum technologies.
Read the full article on this page.
For more articles on the subject, visit the Quantum section of Telsy’s blog.