More on the topic
Modern technologies, primarily computing, are reaching the limit of miniaturization set by the size of atoms. Classical models are not suitable for describing processes at the atomic scale, and this led to a revolutionary development in our understanding of the laws of nature: the emergence of quantum physics. In turn, the development of quantum physics in the 20th century made it possible to realize the concept of quantum computing, first proposed in the early 1980s (Yu. Manin, R. Feynman, P. Benioff).
Unlike a classical computer, which operates on elementary bits that can each take only two values, 0 or 1, a quantum computer works with elementary two-level quantum systems called qubits. A qubit has two basis states and can be in a superposition: a linear combination of the basis states with complex coefficients.
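As a minimal sketch of this idea, a qubit state can be represented by two complex amplitudes whose squared magnitudes sum to one; the concrete amplitude values below are illustrative, chosen to give an equal superposition.

```python
import math

# A qubit state a|0> + b|1> stored as two complex amplitudes.
# Illustrative values: an equal superposition of the basis states.
a = complex(1 / math.sqrt(2))
b = complex(1 / math.sqrt(2))

# The state must be normalized: |a|^2 + |b|^2 = 1.
norm = abs(a) ** 2 + abs(b) ** 2

# Born rule: probabilities of observing 0 or 1 upon measurement.
p0 = abs(a) ** 2
p1 = abs(b) ** 2
print(norm, p0, p1)  # norm is 1.0; each outcome has probability 0.5
```

The complex coefficients carry more information than a classical bit, but a measurement still yields only one of the two basis outcomes.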
A simplified scheme of computation on a quantum computer looks like this: an initial state is prepared on the qubit system; then, as the quantum program executes, the state of the system is changed by unitary transformations that perform particular logical operations; finally, the state of the system is measured, and the measurement outcome is the result of the quantum program.
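The three steps above (prepare, transform unitarily, measure) can be sketched for a single qubit in plain Python; the Hadamard gate is used here purely as an example of a unitary transformation.

```python
import math
import random

# Step 1: prepare the initial state |0> = (1, 0).
state = [complex(1), complex(0)]

# Step 2: apply a unitary transformation, here the Hadamard gate,
# which maps |0> into an equal superposition of |0> and |1>.
h = 1 / math.sqrt(2)
H = [[h, h], [h, -h]]
state = [
    H[0][0] * state[0] + H[0][1] * state[1],
    H[1][0] * state[0] + H[1][1] * state[1],
]

# Step 3: measure. The outcome is 0 or 1 with Born-rule probabilities.
p0 = abs(state[0]) ** 2
outcome = 0 if random.random() < p0 else 1
print(outcome)  # 0 or 1, each with probability 1/2
```

Repeating the run gives 0 about half the time and 1 the other half, which is exactly the statistical character of quantum measurement described in the text.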
Basic quantum operations can simulate the ordinary logic elements from which classical computers are built; hence a quantum computer will, in principle, be able to solve any problem solvable on a classical computer, including cryptanalysis problems.
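To illustrate how quantum gates reproduce classical logic: restricted to the basis states (classical bit values), the CNOT gate computes XOR into its target bit and the Toffoli gate computes AND. The sketch below acts on bit tuples rather than amplitude vectors, which is enough to show the correspondence.

```python
# Quantum gates acting on classical (basis) inputs, as bit permutations.

def cnot(control, target):
    # Reversible XOR: the target flips exactly when the control is 1.
    return control, target ^ control

def toffoli(c1, c2, target):
    # Reversible AND: the target flips exactly when both controls are 1.
    return c1, c2, target ^ (c1 & c2)

# Check against the classical truth tables on all inputs.
xor_ok = all(cnot(x, y)[1] == x ^ y for x in (0, 1) for y in (0, 1))
and_ok = all(toffoli(x, y, 0)[2] == x & y for x in (0, 1) for y in (0, 1))
print(xor_ok, and_ok)  # True True
```

Because NOT, AND, and XOR together suffice to build any classical circuit, this is the sense in which a quantum computer can run any classical computation.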
In turn, the operation of a quantum computer can be emulated on a classical computing system, for example using graphics processors, but such emulation is feasible only for systems with a small number of qubits, because the memory and computation required grow exponentially with the number of qubits.
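The exponential cost is easy to quantify: a full state vector of an n-qubit system holds 2**n complex amplitudes. Assuming double-precision complex numbers (16 bytes each, a common but not universal choice), the memory requirement quickly becomes impractical.

```python
# Memory needed to store the full state vector of an n-qubit system,
# assuming 16 bytes per complex amplitude (double precision).
BYTES_PER_AMPLITUDE = 16

def statevector_bytes(n_qubits):
    # 2**n amplitudes, each taking BYTES_PER_AMPLITUDE bytes.
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (10, 30, 50):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.6f} GiB")
```

Around 30 qubits the state vector already needs 16 GiB, and 50 qubits would need millions of GiB, which is why classical emulation stops at modest qubit counts.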
In some cases, using a quantum algorithm can give a significant gain in computational efficiency. The most critical application of quantum computing to information security is Shor's algorithm for integer factorization and discrete logarithms, which would give an attacker the ability to efficiently break most of the public-key cryptosystems currently in use.
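Shor's algorithm rests on a classical reduction: factoring N reduces to finding the multiplicative order r of a randomly chosen a modulo N. The quantum part finds r efficiently; the sketch below finds it by brute force for the toy values N = 15, a = 7 (illustrative numbers only) and then applies the classical gcd step.

```python
from math import gcd

# Classical skeleton of Shor's reduction: factor N by finding the
# order r of a modulo N, i.e. the smallest r with a**r = 1 (mod N).
N, a = 15, 7  # toy values; the quantum part would handle large N

r = 1
while pow(a, r, N) != 1:
    r += 1

# Here r = 4 is even and a**(r//2) is not -1 mod N, so the gcds
# below yield nontrivial factors of N.
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(r, p, q)  # 4 3 5
```

For large N this brute-force order search is hopeless classically, while the quantum Fourier transform finds r in polynomial time; that asymmetry is what threatens RSA and related public-key schemes.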