How do quantum computers affect cryptography?

Cryptography after the quantum computer

Quantum computers would render large parts of today's cryptographic systems unusable. Alternatives exist, but they are still a long way from practical deployment.

With their research, the two physics Nobel laureates Serge Haroche and David Wineland laid the foundations for future quantum computers. That is how the Nobel Prize Committee justified this year's award, and in doing so it drew attention to a topic that gives many cryptographers a headache: quantum computers would render many of the encryption and signature schemes used today unusable. Scientists are therefore already investigating which algorithms would resist attacks by quantum computers.

An important component of secure Internet connections (in practice, for example, when calling up a website via https or sending an encrypted e-mail) are so-called public key algorithms. Their defining property: data is encrypted with a public key and decrypted with a private key, and only the private key must be kept secret. There are also public key schemes for digital signatures, where it works the other way around: a signature is created with the private key and can be verified with the public key.
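This division of labor can be illustrated with textbook RSA numbers (a toy sketch with tiny, insecure parameters; real keys are thousands of bits long):

```python
# Toy RSA key pair built from the textbook primes p = 61, q = 53.
n = 61 * 53               # public modulus
e = 17                    # public exponent; (n, e) is the public key
d = pow(e, -1, 60 * 52)   # private exponent: inverse of e mod (p-1)(q-1)

m = 65                    # the message, encoded as a number
c = pow(m, e, n)          # anyone can encrypt with the public key
assert pow(c, d, n) == m  # only the holder of d can decrypt

s = pow(m, d, n)          # signing works the other way around
assert pow(s, e, n) == m  # anyone can verify with the public key
```

The same key pair serves both directions: encryption applies the public exponent first, signing applies the private one first.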

Public key cryptography is an indispensable part of today's Internet, because the alternative, symmetric encryption methods such as AES, in which sender and recipient use the same key, would cause major problems in practice: before such a method can be used, the key must be exchanged securely, for example at a personal meeting.
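A minimal sketch of why key distribution is the sticking point: with a symmetric scheme, both parties must already hold the identical secret before the first message can flow. (This uses a toy XOR stream cipher derived from SHA-256 purely for illustration, not AES.)

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Stretch the shared key into a pseudo-random byte stream (toy construction).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the identical call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

shared_key = b"exchanged in advance, e.g. at a personal meeting"
ciphertext = xor_crypt(shared_key, b"hello")
assert xor_crypt(shared_key, ciphertext) == b"hello"  # same key on both sides
```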

The best-known and by far the most frequently used public key method is RSA, which can be used for both signatures and encryption. It was developed in 1977 and is named after its inventors Ron Rivest, Adi Shamir and Leonard Adleman. RSA is based on the problem of factoring large numbers. Factoring means representing a number as a product of primes; a simple example: the factorization of 15 is 3 * 5. Factoring very large numbers requires enormous effort, and the security of RSA rests on exactly this. Even with the best supercomputers available today, breaking a long RSA key (2048 bits are currently considered secure) would not be feasible within a human lifetime.
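The asymmetry RSA relies on is easy to see in miniature: multiplying two primes is instant, while recovering them by trial division scales hopelessly with the size of the modulus. A sketch with toy numbers:

```python
def factor(n: int) -> tuple[int, int]:
    # Trial division: fine for tiny n, hopeless for 2048-bit RSA moduli.
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1

assert factor(15) == (3, 5)

# Whoever factors an RSA modulus can recompute the private exponent:
p, q = factor(3233)                 # 3233 = 61 * 53, a toy RSA modulus
d = pow(17, -1, (p - 1) * (q - 1))  # private key for public exponent e = 17
```

For a 2048-bit modulus this naive loop would need on the order of 2^1024 steps; even the best known classical factoring algorithms remain far out of reach, which is the whole point.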

Theory and practice

In 1994, the mathematician Peter Shor showed that a quantum computer would fundamentally change this: an algorithm he developed could factor large numbers in a short time. At that point, however, nobody could run the algorithm in practice, since quantum computers existed only as a theoretical concept.
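What Shor's algorithm actually computes can be simulated classically for tiny numbers: it finds the period r of the sequence a^x mod n, and two gcd computations then reveal the factors. In the sketch below the period-finding step is brute force; a quantum computer's entire advantage is performing exactly this step exponentially faster. (Assumes a lucky choice of a, as the real algorithm retries otherwise.)

```python
from math import gcd

def shor_classical(n: int, a: int) -> tuple[int, int]:
    # Brute-force the period r with a^r ≡ 1 (mod n); this is the step
    # a quantum computer would accelerate exponentially.
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    if r % 2 != 0 or pow(a, r // 2, n) == n - 1:
        raise ValueError("unlucky choice of a, pick another")
    half = pow(a, r // 2, n)
    # a^(r/2) - 1 and a^(r/2) + 1 share nontrivial factors with n.
    return gcd(half - 1, n), gcd(half + 1, n)

print(shor_classical(15, 7))  # → (3, 5)
```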

Quantum computers can calculate with superposed states. Roughly simplified, this means they can carry out an arithmetic operation not just for one specific number, but for all possible numbers of a certain length at the same time, radically increasing speed. This is, however, a very simplified picture: in practice, only very specific problems can be solved faster with a quantum computer. In no way would quantum computers, as is often wrongly claimed, herald the end of all encryption.

As fascinating as quantum computers are in theory, in practice they face almost insurmountable problems. The individual memory units, the so-called qubits, must be kept in an entangled state. In principle, any physical particles can serve as qubits, but they must be isolated from the influence of other particles, otherwise the entanglement collapses immediately. This is why research like Wineland's and Haroche's is so crucial: they developed methods for observing individual particles under controlled conditions. A quantum computer capable of breaking today's encryption methods would require several thousand particles to be kept entangled.

Many physicists doubt that quantum computers with significant computing power will ever be built. In 2001, in what was hailed as a breakthrough, IBM succeeded in using a quantum computer with 7 qubits to factor the number 15. From a theoretical point of view this was certainly exciting, because Shor's algorithm was run in practice for the first time. But obviously, such a quantum computer is completely useless for practical attacks.

An announcement by the Canadian company D-Wave that it had developed a practical quantum computer with 128 qubits was met with skepticism by the scientific community. For cryptography, however, the D-Wave machine is largely uninteresting even if it works: it is not a universal quantum computer and cannot execute Shor's algorithm.

Shor's algorithm breaks all methods used today

Despite these bleak prospects, nobody today can completely rule out that a large quantum computer will eventually be built. For some years, cryptographers have therefore been asking what prospects cryptography would have after the quantum computer. The pioneer of this so-called post-quantum cryptography is the American professor Daniel J. Bernstein. Bernstein, too, says it is unclear whether large quantum computers will ever become reality, but argues that it is nevertheless necessary to prepare for the worst case and to develop cryptographic systems for a possible future with quantum computers.

Quantum computers would pose only a minor threat to symmetric methods. Shor's algorithm does not apply here, but Grover's quantum algorithm would reduce the effort of a brute-force attack to its square root. In practice this means a key with a length of 128 bits would offer only 64 bits of security, while about 80 bits are considered safe. The widely used AES standard supports key lengths of 128 or 256 bits, so anyone worried about quantum computers should avoid the shorter variant. In general, for symmetric methods and for the hash functions that often play an important role in cryptography, the quantum computer problem can be solved simply by adding more bits.
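The bookkeeping behind this is simple enough to write down directly (a sketch using the ~80-bit comfort threshold mentioned in the text):

```python
def effective_bits(key_bits: int, quantum: bool) -> int:
    # Grover's algorithm reduces brute force from 2^k to about 2^(k/2) steps,
    # halving the effective security level; classically it stays at k.
    return key_bits // 2 if quantum else key_bits

SAFE_BITS = 80  # rough threshold cited in the text

assert effective_bits(128, quantum=False) >= SAFE_BITS  # AES-128 today: fine
assert effective_bits(128, quantum=True) == 64          # vs. Grover: too weak
assert effective_bits(256, quantum=True) == 128         # AES-256: comfortable
```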

The situation is much worse for public key methods. Practically all methods used today rest on two mathematical problems: the factorization problem mentioned above (RSA) and the computation of so-called discrete logarithms (DSA signatures and ElGamal encryption). Both have a similar mathematical structure, and both can be broken in a very short time with Shor's algorithm. The elliptic-curve algorithms (e.g. ECDSA) that have come into wider use recently rely on a variant of the discrete logarithm problem and likewise offer no protection against quantum computers.
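The discrete logarithm problem is the same kind of one-way street as factoring: computing g^x mod p is fast, inverting it is not. A brute-force sketch over a toy group (real systems use primes of 2048 bits or more, where this loop is hopeless but Shor's algorithm is not):

```python
def discrete_log(g: int, h: int, p: int) -> int:
    # Exhaustive search for x with g^x ≡ h (mod p).
    x, val = 0, 1
    while val != h:
        val = (val * g) % p
        x += 1
        if x > p:
            raise ValueError("h is not a power of g modulo p")
    return x

x = discrete_log(5, 8, 23)  # toy parameters
assert pow(5, x, 23) == 8   # the "easy" direction confirms the answer
```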

Many algorithms failed

There is a reason why the number of algorithms in public key cryptography is so small: in the past there were quite a few proposals for new schemes that later turned out to be insecure, including some that were considered promising candidates for post-quantum cryptography.

The EU research project NESSIE published a list of recommended cryptographic algorithms in 2003, including three methods for digital signatures, among them the SFLASH algorithm. Not a good recommendation: in 2007, a team led by the cryptographer Adi Shamir (the "S" in RSA) showed that SFLASH can be broken completely, with little effort and without a quantum computer.

The story of SFLASH is by no means unique; many proposals for public key schemes turn out to be insecure on closer examination. For a long time, for example, the relatively old McEliece method (presented in 1978, one year after RSA) was seen as a way of defying quantum computers. Although it was considered impractical because its keys are several hundred kilobytes in size, some exotic applications used it, including the (now discontinued) Entropy privacy network. Daniel Bernstein put an end to the hopes for McEliece in 2008, presenting an attack that likewise requires only classical computing.

So-called lattice-based algorithms are currently considered the most promising candidates for post-quantum cryptography. The most mature among them is the public key scheme NTRU, which can be used for both encryption and signatures. A practical disadvantage: NTRU is patented, which makes its use in the context of https currently hard to imagine. However, the patents expire in 2016.

When considering future public key algorithms, one has to understand how trust in cryptographic schemes arises. With today's means it is hardly possible to prove the security of a scheme mathematically; the few exceptions, such as the one-time pad, are unsuitable in practice. A scheme is therefore considered particularly trustworthy if it has been examined intensively by experts over a long period. This is the strength of RSA: cryptographers have been studying it for 35 years. Attack methods have improved, and the short RSA keys used in the past are now considered insecure, but so far all attempts to attack RSA with long keys have failed.

This also illustrates the problem with alternatives such as NTRU. So far it has withstood all attempts to shake its security, but since NTRU has only been used in niche applications, it receives far less scrutiny than RSA. An attack on RSA or another widely used algorithm would be a sensation, giving scientists a strong incentive to work on one; an attack on NTRU would go down in the history of science as a footnote at most. (Hanno Böck)
