The introduction of large-scale quantum computers would render almost all currently used key-exchange protocols useless. For optical fibre networks, quantum key distribution (QKD) seems to be the natural answer to this challenge, but it too has its limitations. On the other hand, there are alternatives that could potentially provide a relatively seamless replacement for current key-exchange algorithms.
Public-key cryptography is probably the single most important building block for securing electronic communication in our connected world. Its strength rests on how difficult it is to determine a properly generated private key from its corresponding public key. Mathematical functions that are relatively easy to compute but very difficult to invert (also known as trapdoor functions) form the basis of public-key cryptographic schemes. The most famous and widespread algorithm of this type is RSA, named after its inventors Rivest, Shamir and Adleman. Its security relies on the difficulty of factorising large integers into their prime factors, while calculating the product of two primes is easy.
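To make the trapdoor idea concrete, here is a minimal Python sketch with deliberately tiny, illustrative primes: building the key pair and encrypting is cheap, while the attacker's task of recovering a prime factor is done by brute-force trial division, which only succeeds because the numbers are absurdly small.

    # Toy illustration of the RSA trapdoor with unrealistically small numbers:
    # multiplying two primes is trivial; recovering them from their product is
    # the hard direction (brute-forced here, feasible only for toy key sizes).
    def toy_rsa_keypair(p: int, q: int, e: int = 65537):
        """Build a toy RSA key pair from two (small) primes."""
        n = p * q                        # public modulus: easy to compute
        phi = (p - 1) * (q - 1)          # Euler's totient, requires knowing p and q
        d = pow(e, -1, phi)              # private exponent (Python 3.8+ modular inverse)
        return (n, e), d

    def factor_by_trial_division(n: int) -> int:
        """The attacker's view: recover a prime factor of n the hard way."""
        f = 3
        while f * f <= n:
            if n % f == 0:
                return f
            f += 2
        raise ValueError("no odd factor found")

    public, private = toy_rsa_keypair(10007, 10009)    # two small primes
    n, e = public
    message = 42
    ciphertext = pow(message, e, n)                    # encrypt with the public key
    assert pow(ciphertext, private, n) == message      # decrypt with the private key

    # 'Breaking' the toy key: instant here, hopeless for a 2048-bit modulus.
    p = factor_by_trial_division(n)
    print(p, n // p)                                   # 10007 10009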
Although, for efficiency reasons, data is generally encrypted against unauthorised reading with symmetric algorithms such as AES-256 (the advanced encryption standard with a 256-bit key), public-key encryption is essential for exchanging the symmetric key. A message encrypted with a recipient's public key can be decrypted only by the legitimate recipient holding the corresponding private key. Calculating the private key from the public one (i.e. cracking the cryptosystem) requires solving the difficult mathematical problem. In this way, security is ensured by the complexity of the problem and the finite computing resources of an attacker.
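This pattern is usually called hybrid encryption. As a rough sketch of how it can look in practice (assuming the open-source pyca/cryptography package; the variable names and parameter choices are illustrative, not a prescribed interface), the bulk data is protected with AES-256-GCM and only the short session key is encrypted with the recipient's RSA public key:

    # Hybrid encryption sketch (assumes the pyca/cryptography package).
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Recipient's long-term key pair (normally generated once and published).
    recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    recipient_public = recipient_private.public_key()

    # Sender: encrypt the payload symmetrically, then wrap the session key.
    session_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, b"bulk traffic ...", None)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_key = recipient_public.encrypt(session_key, oaep)

    # Recipient: unwrap the session key with the private key, then decrypt.
    recovered_key = recipient_private.decrypt(wrapped_key, oaep)
    assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"bulk traffic ..."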
In the case of RSA, no classical algorithm is known that can factorise large integers within a relevant timescale. It is important to understand that the near-exponential growth in complexity with increasing RSA key size is what makes this problem suitable for the purpose. And, if computing power catches up, doubling the key size increases the complexity of the codebreaking task by many orders of magnitude.
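The scale of that margin can be illustrated with a back-of-the-envelope estimate based on the heuristic run time of the general number field sieve, the best known classical factoring algorithm. The estimate below ignores constant factors and the o(1) term, so only the ratios are meaningful:

    # Back-of-the-envelope: how the classical factoring effort grows with key
    # size, using the heuristic run time of the general number field sieve,
    #   L(n) = exp( (64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3) ).
    import math

    def gnfs_work(bits: int) -> float:
        ln_n = bits * math.log(2)          # ln(n) for an n of 'bits' bits
        c = (64 / 9) ** (1 / 3)
        return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

    for bits in (1024, 2048, 4096):
        print(f"RSA-{bits}: ~2^{math.log2(gnfs_work(bits)):.0f} operations (heuristic)")

    # Doubling the modulus from 2048 to 4096 bits multiplies the estimated work by
    print(f"factor of ~2^{math.log2(gnfs_work(4096) / gnfs_work(2048)):.0f}")

Under these assumptions, going from a 2048-bit to a 4096-bit modulus raises the estimated effort by a factor of roughly 2^40, i.e. about twelve orders of magnitude.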
Quantum computing threat emerges
In 1994, Peter Shor published two algorithms that exploit quantum properties to perform prime factorisation and the calculation of discrete logarithms in polynomial time (i.e. the complexity grows only moderately with key size). Although building a quantum computer of sufficient size to break modern public-key systems is a tremendous effort, the polynomial complexity of the quantum algorithms will eventually render useless all protocols based on these problems. This is true for all commonly used algorithms, including RSA, ElGamal, the digital signature algorithm (DSA), Diffie-Hellman and the elliptic-curve variants of DSA and Diffie-Hellman. Increasing the key size would only lead to a polynomial increase in the effort needed to break these algorithms, providing just a small security margin that is likely to be lost within a short timeframe.
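Shor's factoring algorithm uses the quantum computer for a single step: finding the period r of f(x) = a^x mod N. The classical processing around that step is straightforward, as the following sketch shows. Here the quantum subroutine is replaced by a brute-force period search, which is exactly the part that is exponentially costly without a quantum computer (toy moduli only, purely for illustration):

    # Shor's algorithm: classical reduction from period finding to factoring.
    import math
    import random

    def find_period_classically(a: int, N: int) -> int:
        """Stand-in for the quantum subroutine: smallest r > 0 with a^r = 1 mod N."""
        x, r = a % N, 1
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def shor_factor(N: int) -> int:
        """Return a non-trivial factor of an odd composite N (not a prime power)."""
        while True:
            a = random.randrange(2, N)
            g = math.gcd(a, N)
            if g > 1:
                return g                  # lucky guess already shares a factor
            r = find_period_classically(a, N)
            if r % 2:
                continue                  # need an even period
            y = pow(a, r // 2, N)
            if y == N - 1:
                continue                  # trivial square root of 1, try another a
            return math.gcd(y - 1, N)     # guaranteed non-trivial divisor of N

    print(shor_factor(15))                # 3 or 5
    print(shor_factor(8051))              # 8051 = 83 * 97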
These days speculation is flourishing that a useful quantum computer might not be many years away. By the end of the year, Google aims to build a chip with 49 qubits ('quantum bits'). This number is already close to what scientists suspect is the threshold of 'quantum supremacy' – where a quantum computer can solve a problem faster than a classical one. Breaking public-key cryptography will require many more qubits, but there is a strong belief that scaling beyond this threshold is merely a question of engineering and does not require major scientific breakthroughs.
You may or may not share this optimistic view of the progress in quantum computing, but the topic has certainly been hot since the US National Security Agency's recommendation in August 2015 to prepare for upcoming quantum-resistant algorithms (and to no longer invest in elliptic curve cryptography). Consequently, the US National Institute of Standards and Technology has initiated a Post-Quantum Crypto Project, calling for proposals for quantum-resistant public-key algorithms. The aim is to standardise quantum-resistant cryptographic protocols within the next six to eight years, so it might take up to ten years before new quantum-resistant solutions are developed and deployed. Even if a large-scale quantum computer is not available before then, all communication until the upgrade to quantum-resistant encryption would be vulnerable to being recorded now and broken later (also known as a harvesting attack). For sensitive data, depending on how long it needs to be kept secret, it might therefore already be effectively too late.
Beating quantum computing with quantum cryptography
The main advantage of quantum cryptography, or more specifically QKD, is a property known as unconditional security. QKD is secure, in principle, without any restrictions on the computational resources or manipulation techniques of a potential attacker. It is important to understand that unconditional security is not absolute security – the security might still be compromised by implementation flaws – but it is not vulnerable to progress in computing power or mathematical algorithms, thereby enabling so-called long-term security.
On the flip side, even though optical fibre networks are ideally placed to use QKD, since a fibre is already available to serve as the medium for the quantum channel, QKD is severely restricted by attenuation, limiting the practical reach to well below 100 km (see 'A quantum of security'). Moreover, when the quantum channel shares the same fibre with co-propagating data channels, the reach is further reduced by nonlinear crosstalk. Nevertheless, QKD can be deployed for many practical applications such as data centre interconnects, as demonstrated by ADVA in a joint field trial with Toshiba Research over a BT fibre link [1]. ADVA Optical Networking is also a partner in the UK Quantum Communications Hub, which aims at better integration of QKD into commercial telecommunication networks.
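A rough calculation shows why attenuation is so limiting. With standard single-mode fibre losing about 0.2 dB/km at 1550 nm, the channel transmittance, and with it the raw detection rate, falls exponentially with distance. The transmitter and detector figures below are illustrative assumptions, not the parameters of any particular QKD system:

    # Rough feel for why fibre attenuation limits QKD reach.
    ALPHA_DB_PER_KM = 0.2      # typical loss of standard single-mode fibre at 1550 nm
    PULSE_RATE_HZ = 1e9        # assumed clock rate of the QKD transmitter
    MEAN_PHOTONS = 0.5         # assumed mean photon number per weak coherent pulse
    DETECTOR_EFF = 0.2         # assumed single-photon detector efficiency

    def raw_click_rate(distance_km: float) -> float:
        transmittance = 10 ** (-ALPHA_DB_PER_KM * distance_km / 10)
        return PULSE_RATE_HZ * MEAN_PHOTONS * transmittance * DETECTOR_EFF

    for d in (25, 50, 100, 200):
        print(f"{d:>3} km: transmittance {10 ** (-ALPHA_DB_PER_KM * d / 10):.1e}, "
              f"raw clicks ~{raw_click_rate(d):,.0f} /s")

Under these assumptions the raw detection rate drops by two orders of magnitude between 25 km and 125 km; well before 200 km the few surviving events are swamped by detector dark counts, and the secure key rate collapses once error correction and privacy amplification are taken into account.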
Although it seems quite natural to use QKD to counteract the code-breaking capabilities of quantum computers, it is not the only option. There is a class of protocols based on computational security that rely on problems for which no known quantum algorithm runs in polynomial time. These algorithms, known as post-quantum cryptographic (PQC) systems, are considered quantum-resistant, although they remain vulnerable to further progress in classical or quantum computation. A lot of research is currently under way in this area, proposing and testing efficient PQC systems that could replace RSA and its relatives.
One of the best-known proposals for long-term protection against attack by a quantum computer is the McEliece cryptosystem based on Goppa codes (or its Niederreiter variant), which nobody has been able to break since its publication almost 40 years ago. The clear drawback of this scheme is its key size, which is much larger than the keys in use today. However, given the high data rates of modern optical networks, the key size is still feasible for this application, as ADVA recently demonstrated in a commercial optical communication system [2].
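To give a feeling for the numbers, the public key is essentially a k x (n-k) binary matrix. The parameter sets below are the original 1978 proposal and a more conservative modern choice similar to those discussed in the PQC community; the sizes are rough estimates from the matrix dimensions alone:

    # Why McEliece keys are big: estimate the public-key size from the
    # dimensions of a systematic binary generator matrix.
    def mceliece_public_key_bytes(n: int, k: int) -> int:
        return k * (n - k) // 8          # k x (n-k) bits, converted to bytes

    for label, n, k in (("McEliece 1978 (n=1024, k=524)", 1024, 524),
                        ("modern parameters (n=6960, k=5413)", 6960, 5413)):
        print(f"{label}: ~{mceliece_public_key_bytes(n, k) / 1024:.0f} KiB public key")
    # Compare with a 2048-bit RSA public key of only a few hundred bytes.

Even a public key of around a megabyte is small compared with the amount of data a single 100 Gbit/s wavelength carries every second, which is why the key size remains manageable in this context.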
Greater than the sum of its parts
With two technologies competing to provide quantum resistance, the good news is that it is not necessary to choose between PQC and QKD: both can be used in parallel and the individual keys combined into a single key that is at least as difficult to crack as either of them. What is more, because the two approaches rest on very different mechanisms, combining them should also improve overall security from a practical point of view (think of implementation flaws). Just keep in mind that practically all public-key cryptography in use today will have to be replaced once a working quantum computer becomes available, because all current methods are based on similar mathematical problems.
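A sketch of such a key combiner is shown below. It is illustrative only; the function names and the choice of HMAC-SHA256 as the key-derivation step are assumptions for this example, not a description of any specific product:

    # Combine a QKD key and a PQC-negotiated key into one session key.
    import hashlib
    import hmac

    def combine_keys(qkd_key: bytes, pqc_key: bytes,
                     context: bytes = b"hybrid-key-v1") -> bytes:
        """Derive a 256-bit session key from two independently established keys."""
        # HMAC-SHA256 used as a simple extract-style KDF over the concatenation;
        # if either input key is secret and the KDF behaves as a pseudorandom
        # function, the combined key remains secret.
        return hmac.new(context, qkd_key + pqc_key, hashlib.sha256).digest()

    # Alternatively, a plain XOR of equal-length, independent keys already
    # guarantees a result at least as hard to guess as the stronger input:
    def xor_combine(key_a: bytes, key_b: bytes) -> bytes:
        assert len(key_a) == len(key_b)
        return bytes(a ^ b for a, b in zip(key_a, key_b))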
About the author: Helmut Griesser, Ph.D., is Director of Advanced Technology at ADVA Optical Networking. His focus is on digital signal processing and security in optical communication systems.
References
[1] Choi, I.; Zhou, Y. R.; Dynes, J. F.; Yuan, Z.; Klar, A.; Sharpe, A.; Plews, A.; Lucamarini, M.; Radig, C.; Neubert, J. et al.: Field trial of a quantum secured 10 Gb/s DWDM transmission system over a single installed fibre. Optics Express, 2014, 22, 23121-23128
[2] Cho, J. Y.; Griesser, H. & Rafique, D.: A McEliece-Based Key Exchange Protocol for Optical Communication Systems. Proceedings of the 2nd Workshop on Communication Security: Cryptography and Physical Layer Security, 2017, 447, 109