The limits of Quantum Computing: an approach via Post-Quantum Cryptography
Lead Research Organisation:
Royal Holloway University of London
Department Name: Information Security
Abstract
Quantum computing (QC) is emerging as a critical technology for the future of computing. QC has been shown to provide significant - sometimes even exponential - speedups on various problems, and to enable protocols that would be impossible using classical computers. On the other hand, recent results on "dequantized" algorithms show that it is not always straightforward to quantify the quantum advantage for some problems. As a result, the strengths and limitations of quantum computing remain open questions. One of the best benchmarks for evaluating the theoretical and practical limits of computing is cryptography. Indeed, cryptography is, by definition, the science of basing security on the limits of computation. Arguably, the maturity of (classical) cryptography reflects our deep understanding of classical computation. In contrast, post-quantum cryptography - building cryptography based on the limits of quantum computing - is very much an emerging field, owing to our limited understanding of quantum computing.
The emergence of post-quantum cryptography presents a tantalizing opportunity to study the theoretical and practical limits of computing. In the near term, it can serve as a strong benchmark for noisy intermediate-scale quantum (NISQ) computing, providing concrete answers to questions such as: can a quantum algorithm running on a NISQ device of 1,000 qubits beat the best classical algorithm on a useful problem? In the long term, more fundamental questions about the limits of quantum computers need to be answered. Beyond the known exponential and quadratic speedups that quantum algorithms can offer, one of their most promising aspects is that they can achieve comparable running times with much lower memory usage. Memory is arguably one of the most limiting resources of classical computers. The exponential memory blow-up incurred when simulating quantum systems classically, for example, suggests that understanding the limits of quantum memories is essential. Post-quantum cryptography provides ample problems for studying this aspect of quantum computing and answering questions such as: can quantum computing provide exponential memory improvements for some real-life problems?
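To make the memory blow-up concrete, here is a minimal, illustrative Python sketch (not part of the proposal) that counts the classical memory needed to store the full state vector of an n-qubit system, assuming one 16-byte complex amplitude per basis state:

```python
# Illustrative sketch: classical memory required to store a full n-qubit
# state vector, assuming one complex128 amplitude (16 bytes) per basis state.

def statevector_bytes(n_qubits: int) -> int:
    """Memory in bytes for 2**n_qubits complex128 amplitudes."""
    return (2 ** n_qubits) * 16

for n in (20, 30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.2f} GiB")
# 20 qubits fit comfortably in RAM, while 50 qubits already require roughly
# 16 million GiB: the cost grows exponentially in the number of qubits.
```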
I posit that lattices and codes, two fundamental mathematical objects, will play a major role in answering the questions I have put forward. Lattices have emerged as a central object for both quantum computing and cryptography. Lattices and codes play a crucial role in post-quantum cryptography, with three problems standing out as particularly relevant: the shortest vector problem (SVP), the Learning with Errors problem (LWE) and the syndrome decoding problem. These problems are fundamentally about the limits of quantum computing, and they suggest that lattices and codes are hard enough to resist quantum attacks yet structured enough to support nontrivial cryptographic primitives. SVP and LWE play a role not only in cryptography but also in quantum computing. Important search problems, such as the dihedral hidden subgroup problem, involve both of them. A recent breakthrough in the classical verification of quantum computations relies on LWE, and LWE even enables classical parties to participate in secure quantum computation and communication protocols. Therefore, improvements in our understanding of SVP and LWE will benefit both the quantum computing and the cryptography communities. Furthermore, recent improvements in lattice algorithms that originate from codes show the benefit of studying lattices and codes together rather than separately.
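As a small, hedged illustration (toy parameters chosen here for exposition, not taken from the proposal) of how LWE hides a secret behind noise, the following Python sketch generates an instance b = A s + e mod q; recovering s from (A, b) is the search-LWE problem, and without the error term it would reduce to ordinary linear algebra over Z_q:

```python
# Illustrative toy LWE instance b = A*s + e (mod q). Parameters are
# hypothetical and far too small for any security; real schemes choose
# n, q and the error distribution much more carefully.
import numpy as np

rng = np.random.default_rng(0)
n, m, q = 8, 16, 97                 # toy dimension, number of samples, modulus
s = rng.integers(0, q, n)           # secret vector in Z_q^n
A = rng.integers(0, q, (m, n))      # public uniform matrix in Z_q^{m x n}
e = rng.integers(-2, 3, m)          # small error, uniform in {-2, ..., 2}
b = (A @ s + e) % q                 # noisy inner products

print(A.shape, b.shape)             # (16, 8) (16,)
# Recovering s from (A, b) is the search-LWE problem; the small error e is
# what makes the problem (conjecturally) hard, even for quantum algorithms.
```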
Publications
Albrecht M (2023) Variational quantum solutions to the Shortest Vector Problem, in Quantum
Ambainis A (2023) Quantum bounds for 2D-grid and Dyck language, in Quantum Information Processing
Albrecht MR (2022) Quantum Augmented Dual Attack
Description | Collaboration with Petros Wallden at the University of Edinburgh |
Organisation | University of Edinburgh |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | Professor Martin Albrecht and I provided a theoretical estimate of the number of qubits required by the variational quantum algorithm that we proposed for solving the shortest vector problem. |
Collaborator Contribution | Dr. Petros Wallden and his student Milos Prokop provided a classical simulation of our quantum algorithm on small instances. |
Impact | Variational quantum solutions to the Shortest Vector Problem, published in Quantum (2023). |
Start Year | 2023 |
Description | Interview for news website Tortoise Media |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Media (as a channel to the public) |
Results and Impact | I was a panellist at the Responsible Quantum Summit organised by Tortoise Media. I spoke in the session "What are the implications of advancements in quantum technology for security?" together with Duncan Jones, Head of Cybersecurity at Quantinuum, and Richard Murray, Co-founder & CEO of ORCA Computing. More than 50 journalists and industry partners attended the event. The purpose was to raise awareness of advances in quantum technology and their implications for security. |
Year(s) Of Engagement Activity | 2022 |
URL | https://www.youtube.com/watch?v=qh00I4Qp3s8 |