Background

Modern technologies — computing technologies in particular — are approaching the limits of miniaturization imposed by the size of atoms. Classical models are inadequate for describing processes at the atomic scale, and this inadequacy led to a revolutionary new understanding of the laws of nature: quantum physics. In turn, advances in quantum physics throughout the 20th century made it possible to develop the concept of quantum computing, first proposed in the early 1980s by Yu. Manin, R. Feynman, and P. Benioff.
Unlike a classical computer, which operates on elementary bits — each of which can take only one of two values, 0 or 1 — a quantum computer operates on elementary quantum states known as qubits. A qubit has two basis states, conventionally written |0⟩ and |1⟩, and can exist in a superposition: a linear combination of those basis states with complex coefficients whose squared magnitudes sum to one.
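As a minimal sketch of this, a qubit state can be represented as a normalized two-component complex vector (the amplitude values below are illustrative, not from the text):

```python
import numpy as np

# Basis states |0> and |1> as column vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# A general qubit state a|0> + b|1>: complex amplitudes a, b
# that must satisfy |a|^2 + |b|^2 = 1.
a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = a * ket0 + b * ket1

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]
```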
A simplified picture of quantum computation works as follows: an initial state is encoded into a system of qubits. The state of the system is then transformed through a sequence of unitary operations, each performing a specific logical operation, as the quantum program executes. Finally, the state of the system is measured — and this measurement constitutes the output of the quantum program.
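The three steps above — encode, transform by unitaries, measure — can be illustrated for a single qubit; the choice of a Hadamard gate here is just an example:

```python
import numpy as np

# Step 1: encode the input — start in the basis state |0>.
state = np.array([1, 0], dtype=complex)

# Step 2: apply unitary operations; here a single Hadamard gate,
# which maps |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

# Step 3: measure — sample an outcome with probability |amplitude|^2.
probs = np.abs(state) ** 2
outcome = np.random.choice([0, 1], p=probs)
print(outcome)  # 0 or 1, each with probability 1/2
```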
Using elementary quantum operations, it is possible to simulate the behavior of the classical logic gates from which conventional computers are built. As a result, a quantum computer is in principle capable of solving any problem that a classical computer can — including problems in cryptanalysis.
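One standard example of simulating a classical gate quantumly is the Toffoli (CCNOT) gate: with its target qubit initialized to 0, it writes AND(a, b) into the target, giving a reversible classical AND. The sketch below (helper names are mine, not from the text) checks this on all basis inputs:

```python
import numpy as np

def toffoli():
    # Unitary that flips the target bit iff both controls are 1:
    # the identity with the |110> and |111> rows swapped.
    U = np.eye(8, dtype=complex)
    U[[6, 7]] = U[[7, 6]]
    return U

def basis(a, b, c):
    # Computational basis state |a b c> as an 8-component vector.
    v = np.zeros(8, dtype=complex)
    v[(a << 2) | (b << 1) | c] = 1
    return v

U = toffoli()
for a in (0, 1):
    for b in (0, 1):
        out = U @ basis(a, b, 0)          # target initialized to 0
        target = int(np.argmax(np.abs(out))) & 1
        print(a, b, "->", target)         # target equals a AND b
```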
Conversely, the behavior of a quantum computer can be emulated using classical hardware — for example, using graphics processing units (GPUs) — but such emulation is feasible only for systems with a small number of qubits, due to the exponential growth in required computational resources.
In certain cases, the use of quantum algorithms can yield dramatic gains in computational efficiency. The most critical application of quantum computing from an information security perspective is Shor's algorithm for integer factorization and discrete logarithm computation, which would enable an adversary to efficiently break most of the public-key cryptosystems currently in use.
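To illustrate why order finding breaks factoring, here is a sketch of the classical post-processing step of Shor's algorithm on a toy modulus. The quantum subroutine finds the order r of a modulo N efficiently; for this tiny example we find r by brute force instead (the values N = 15, a = 7 are illustrative):

```python
from math import gcd

def order(a, N):
    # Brute-force stand-in for the quantum order-finding subroutine:
    # smallest r > 0 with a**r = 1 (mod N).
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7              # toy example; requires gcd(a, N) == 1
r = order(a, N)           # r = 4 here
assert r % 2 == 0         # post-processing needs an even order
factors = gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)
print(factors)  # (3, 5) — the nontrivial factors of 15
```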