Editorial 

The Guardian view on quantum computing: the new space race

The main use of quantum technology might not be to hack existing systems but to create unhackable communication networks of the future
  
  

‘Cryptography remains a game of cat and mouse between codemakers and codebreakers. As fast as one group creates codes, another tries to break them.’ Photograph: D3Damon/Getty Images/iStockphoto

The Washington Post’s columnist David Ignatius moonlights as a pacey novelist. His latest book, The Quantum Spy, is a thriller that touches on the space race of our times: the great-power contest to develop a quantum computer, a machine able to work so fast that it can crack today’s uncrackable codes. Such a machine would be revolutionary. Modern e-commerce depends on encryption to protect confidential information, to authenticate our identities and to ensure the integrity of our data. To be able to break such codes would expose us all. Ignatius’s fiction is grounded in fact: the US National Institute of Standards and Technology thinks that within 15 years the first quantum computer able to defeat the most prevalent forms of encryption will emerge.

However, cryptography remains a game of cat and mouse between codemakers and codebreakers. As fast as one group creates codes, another tries to break them. Supposedly unbreakable ciphers sometimes fall short. “Post-quantum cryptography” already exists, even before quantum computers do. Earlier this year academics suggested, controversially, that they had solved the maths needed to make the main cryptography used on the internet “quantum-resistant”. The main use of quantum technology might not be to hack existing systems but to create unbreakable protection for the communication networks of the future. China claims to have launched such a network this year.

The main takeaway from Ignatius’s book is that China’s advances in quantum technologies might have the potential to undermine vital aspects of US power. In that he is probably not wrong.

 
