Complexity and Decidability in Unconventional Computational Models

Lead Research Organisation: University of Oxford
Department Name: Computer Science

Abstract

While today's computing has been an amazing success story, there is a growing realization that it covers only a small region of the landscape of computational potential offered to us by nature. The British computer science community strongly acknowledges this fact: the UK Computing Research Committee, in its recent Grand Challenges Exercise, put forward Journeys in Non-Classical Computation as one of the key milestones for advances in knowledge and technology. However, once we enter the domain of unconventional computational paradigms, we find ourselves in largely unexplored territory, where the well-established methods and standards of Turing computation may become inapplicable or insufficient.

An example that nicely illustrates this is Blakey's factorizing algorithm, which has constant space and time complexity and is to be implemented on an unconventional physical computing device. At first sight, this constant complexity seems to challenge the standard cryptographic schemes that are based on the hardness of factorization, a problem for which no polynomial-time classical algorithm is known. The whole endeavour of quantum computing would also be at stake, since it is strongly motivated by the fact that the run-time of Shor's quantum factorizing algorithm, to be implemented on a quantum computer, grows only polynomially. Of course there is a catch: the 'precision' with which Blakey's system must be initialized and measured increases as an exponential function of the size of the number being factorized. That Blakey's system enjoys time and space complexities that are constant in the size of the input serves not to highlight the power of the method, but to expose the fact that traditional complexity theory does not account for all relevant resources.

Motivated by this example, among others, we aim to address the following questions. For a given model of computation, what are the relevant and required complexity measures to quantify the hardness of problems and the cost (with respect to various resources) of solution methods? How can the complexity of computing devices, possibly from differing computational models and possibly with respect to differing complexity measures, be meaningfully and consistently compared? Is there a general, abstract, compositional theory for adjoining additional resources that affect the complexity of performing certain tasks in various computational models?

The distinct computational models we wish to consider each have their own solution methods for given problems. To compare their respective efficiency in solving a particular problem (factorization, say), we need a general, model-independent scheme for assigning measures of complexity; this scheme should be sufficiently flexible to account for the variety of resources that have an impact on complexity. Steps in this direction have already been taken by Blakey, in particular with regard to precision. There is also a variety of related work, usually addressing specific problems and computational models, which will be of use.

We want in particular to test and apply this framework in areas where we have great expertise, including real-time and probabilistic automata, unconventional quantum computational architectures, and ground-state computing devices. From a foundational viewpoint, an ultimate aim is to shed light on whether complexity is inherent in problems, as opposed to solution methods, regardless of whether the latter are physical or algorithmic.
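To make the role of precision concrete, the following toy sketch (our own illustration, not Blakey's actual construction) models an idealized analogue factorizer that tests a candidate factor x of N by "measuring" whether N/x lands exactly on an integer. The helper min_resolvable_gap is hypothetical; it computes the smallest deviation such a device would have to resolve. For a b-bit number that gap shrinks roughly like 2^(-b/2), so the required precision grows exponentially in the input size even though time and space usage appear constant.

# Toy illustration only (assumed model, not Blakey's actual device): an
# idealized analogue factorizer decides whether a candidate x divides N
# by checking how close N/x lies to an integer. The smallest such gap
# bounds the measurement precision the device needs.
from math import isqrt

def min_resolvable_gap(N: int) -> float:
    """Smallest nonzero distance from N/x to the nearest integer,
    over candidate factors 2 <= x <= sqrt(N)."""
    gaps = []
    for x in range(2, isqrt(N) + 1):
        r = N % x
        if r:  # x is not a factor; how close did it come to looking like one?
            gaps.append(min(r, x - r) / x)
    return min(gaps)

for bits in (8, 16, 24, 32):
    N = (1 << bits) - 1  # an arbitrary b-bit test number
    print(f"{bits:2d}-bit N: device must resolve gaps of ~{min_resolvable_gap(N):.1e}")

Running this shows the resolvable gap shrinking as the bit-length grows: constant "time" and "space" on such a device simply shift the cost into precision, which is exactly the kind of resource the proposed framework is meant to account for.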

Publications

Al-Mehairi Y (2017) Quantum Interaction
Biamonte J (2011) Adiabatic quantum simulators in AIP Advances
Biamonte J (2010) Adiabatic Quantum Simulators
Blakey E (2013) Information Security as a Resource in Information and Computation
Blakey E (2014) Ray tracing - computing the incomputable? in Electronic Proceedings in Theoretical Computer Science
Blakey E (2009) Factorizing RSA Keys, An Improved Analogue Solution in New Generation Computing
Blakey E (2011) Computational Complexity in Non-Turing Models of Computation in Electronic Notes in Theoretical Computer Science
Blakey E (2013) Complexity-style resources in cryptography in Information and Computation

 
Description We have developed complexity measures for unconventional computational resources.
Exploitation Route The findings have been disseminated through the research community.
Sectors Aerospace, Defence and Marine; Digital/Communication/Information Technologies (including Software)