Distributed Quantum Computing and Applications

Lead Research Organisation: Imperial College London
Department Name: Electrical and Electronic Engineering

Abstract

Quantum mechanics is known for the counterintuitive behaviour it predicts: it allows several distinct states to exist simultaneously (quantum superposition) and extremely strong correlations (quantum entanglement) between particles. Quantum computing (QC) exploits this behaviour to offer faster and more accurate data processing, and more secure data protection, than any conventional computer can provide. The five use cases for QC identified by the UK National Quantum Computing Centre are optimisation, quantum chemistry, fluid dynamics, machine learning and small-molecule simulation.

The UK government has long recognised the future potential of QC, having established a national network of Quantum Technology Hubs in 2014. QC now appears as one of three "foundational technologies" for the digital sector in the UK National Plan for Growth. International development in quantum technology is proceeding rapidly, with a recent early academic demonstration of digital quantum advantage by the US technology giant Google. Besides established companies, QC development has spawned many start-ups worldwide, with UK representatives including Cambridge Quantum Computing, ORCA Quantum Computing, Oxford Ionics, Oxford Quantum Computing, Phasecraft, Quantum Motion, Rahko and Riverlane.

The increasing scale of QC raises several key technological challenges: (1) isolated control and inter-system crosstalk; (2) efficient classical monitoring and feedback; and (3) efficient quantum access to large amounts of relevant problem data. Researchers in information and communication technology (ICT), facing similar problems in conventional computing, have long worked on distributed computing, including cloud computing and the optimal processing of distributed data. By bringing ICT and QC researchers together, this multidisciplinary team will tackle the timely challenge of designing and efficiently using networked clusters of quantum devices for distributed quantum information processing (DQIP). While complementing ongoing efforts on scalable quantum computing, this project aims to develop a clear and feasible roadmap to practical DQIP and to introduce lynchpin design principles that enable cohesive effort across each of the complex and strongly inter-related aspects of DQIP development. The project will thereby also help demonstrate significant quantum advantages as quantum systems grow toward industrial scale, increasing certainty in the timeline and enabling practical industrial evaluation of QC, and laying a foundation for increased investment and growth in this area for the UK economy.

Specifically, this project will explore four key aspects of the design problem: (1) at the application layer, we will set concrete structures and requirements for the algorithms and architecture; (2) at the algorithm layer, we will define the communication requirements of a hybrid environment of quantum and conventional processing nodes; (3) at the network layer, the required quantum processes will be optimised for maximum connectivity; and (4) at the optical-interconnect layer, the encoding and efficient transmission of quantum information in photonic systems will be studied. Our goal is to bridge the gap between QC and the established tools and methods of ICT, and to focus on the strong network of inter-related constraints between these aspects of the design problem, enabling the development of practical DQIP. To achieve this goal, the project brings together an investigative team with strong track records in diverse and complementary fields, including computational finance and fluid dynamics, optimised networked systems and distributed computing, and quantum information and optics.
