Hierarchies, Circuit Lower Bounds and Pseudorandomness

Lead Research Organisation: University of Edinburgh
Department Name: Sch of Informatics

Abstract

Computational complexity theory studies the possibilities and limits of efficient computation. The central question in this area is the P vs NP question, which asks whether every computational problem solvable in non-deterministic polynomial time (NP) is also solvable in deterministic polynomial time (P). Solvability in P typically indicates feasibility in the real world.

The P vs NP question is critical because several important problems, such as Boolean Satisfiability, Integer Linear Programming and Clique, are known to be in NP but not known to be efficiently solvable. The Clay Mathematics Institute has listed the P vs NP question among its seven "Millennium Problems" and offered 1,000,000 dollars for its solution - an indication of its centrality not just in computer science but also in mathematics. The question is also relevant to several other fields, such as economics, biology and physics, because these fields too have important problems which are known to be in NP but not known to be in P.

It is widely believed that NP does not equal P, but it seems dauntingly hard to give a formal proof of this. A large part of the reason is the fundamental distinction between upper bounds and lower bounds on computational complexity. Proving that a problem is in (deterministic) polynomial time can be done simply by exhibiting a polynomial-time algorithm that solves the problem. Proving that a problem is not in polynomial time, however, requires showing that every polynomial-time algorithm fails to solve it. This is considerably harder, because polynomial time is a relatively rich model of computation - it seems hard to isolate structural features common to all problems solvable in polynomial time. The P vs NP question seems out of reach of current techniques, but there are other lower bound questions which are more accessible, such as the hierarchies question and the fixed polynomial circuit lower bounds question.
These have natural motivations of their own: the hierarchies question asks whether more problems can be solved given more of a given resource, say probabilistic time, while the circuit lower bounds question asks whether a problem can be solved efficiently in hardware. These questions are closely connected to each other, as well as to the question of derandomizing probabilistic algorithms. Many algorithms used in practice assume access to an independent and unbiased source of random bits. However, it is unclear whether true randomness exists in the real world. It is therefore of interest to minimize the randomness requirements of algorithms. The theory of pseudo-randomness addresses precisely this question: to what extent can the randomness requirements of algorithms be minimized? This is particularly relevant to cryptography and security, since any cryptographic protocol must use randomness to be secure, and using imperfect randomness may make a protocol insecure.

We propose to study hierarchies, circuit lower bounds and derandomization questions in this project, as well as the connections between them. Our goal is to prove new lower bounds, and in the process to formulate new lower bound techniques which could have other applications, perhaps ultimately even to the P vs NP question. We have also formulated two new directions for research on these fundamental problems, which connect to finite model theory and proof complexity. The hope is that perspectives and insights from these other areas might complement established ideas to achieve progress on these important and difficult problems.
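As a toy illustration of the derandomization question (this is a textbook observation, not one of the project's techniques): any randomized algorithm that uses r random bits can be simulated deterministically by running it on all 2^r bit strings and taking a majority vote, at a cost exponential in r. Pseudo-random generators aim to shrink r so that this brute-force enumeration becomes affordable. The sketch below, in Python, uses a deliberately simplistic Fermat-style compositeness test as the randomized algorithm; the function names and the choice of test are purely illustrative.

```python
from itertools import product

def fermat_test(n, seed_bits):
    """Toy randomized compositeness test: picks a base from the random bits
    and checks Fermat's little theorem. (Illustrative only: the Fermat test
    is known to have false negatives, e.g. on Carmichael numbers.)"""
    a = 2 + int("".join(map(str, seed_bits)), 2) % (n - 3)
    return pow(a, n - 1, n) != 1  # True means "definitely composite"

def derandomized(n, r=8):
    """Brute-force derandomization: run the algorithm on every possible
    r-bit seed and take a majority vote. The cost is 2^r runs of the
    algorithm, which is why pseudo-random generators, which stretch a
    short seed into a long string that looks random to efficient
    algorithms, are the key to efficient derandomization."""
    votes = sum(fermat_test(n, bits) for bits in product((0, 1), repeat=r))
    return 2 * votes > 2 ** r

print(derandomized(15))  # True: 15 is composite
print(derandomized(13))  # False: 13 is prime
```

The exponential blow-up in the seed length is exactly what a secure pseudo-random generator would eliminate: with a generator stretching O(log n) seed bits, the same enumeration runs in polynomial time.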

Planned Impact

This research is likely to have impact in a variety of ways: through the relevance of the research area of the proposal to industry and society, by raising the profile of the U.K. as a site for state-of-the-art research, and more broadly through the far-reaching technological consequences of fundamental scientific research.

This proposal falls within the area of computational complexity theory, focussing on the sub-areas of complexity lower bounds and pseudo-randomness. A complexity lower bound for a computational problem in a given computational model demonstrates a fundamental limitation of any algorithm within that model in solving that problem. It can thus save the effort that might otherwise go into trying to find an efficient algorithm where none exists, and help to refocus attention on different computational models which give satisfactory performance in a practical context and for which the lower bound does not hold.

The theory of pseudo-randomness is an essential part of cryptography, and hence very relevant to the modern world, in which our connectedness makes security an issue of ever-increasing importance. Many cryptographic protocols used in the real world are only heuristically secure, and are frequently broken with enough effort. It is therefore important to have provably secure protocols to encrypt sensitive information. The existence of provably secure cryptographic protocols for various tasks is closely linked to the existence of secure pseudo-random generators and to complexity lower bounds. Moreover, pseudo-random generators are useful not just in cryptography but in any application where there is a need to save on the number of random bits used.

The proposal will also have impact by raising the profile of the U.K. as a site for state-of-the-art research in complexity theory. Some of the world's leading technology companies, such as Google, Microsoft and Yahoo, have groups with expertise in algorithms and complexity.
Moreover, these companies often have research labs that are closely associated with universities with research strengths across the board, including in algorithms and complexity. This strongly suggests that a vibrant algorithms and complexity community would help the U.K. to compete in the global technology market. It would be a positive factor not only in inducing the top technology companies to set up research labs in the U.K., but also in fostering home-grown talent by raising awareness and inspiring and encouraging breakthrough ideas in algorithms and cryptography.

Finally, the proposal is likely to have an impact by virtue of being fundamental scientific research, although this component of impact is hard to quantify at present. The history of science teaches us that, regardless of the context in which they were developed, solutions to fundamental questions often find important applications eventually. Fundamental research works by changing our way of thinking and introducing new conceptual models and architectures, and much of the impact of these changes can be long-term.

We will aim to maximize impact by making the results of our research available quickly and in an easily accessible manner. Apart from publication in traditional research venues such as conference proceedings and journals, we will make our research available on the websites of the PI and the postdoctoral researcher. Furthermore, we will aim to promote our research and, more broadly, complexity theory by giving talks at universities across the U.K. and at nationwide gatherings such as BCTCS (the British Colloquium for Theoretical Computer Science). The visits of the visiting researcher we have named, Prof. Toniann Pitassi, will also help considerably in raising the profile of complexity theory in the U.K., since she is world-renowned in her field. We plan for her to give talks designed for a broad audience during her visits, to maximize impact.

Publications

 
Description The main goal of this project was to make progress on important open questions in computational complexity theory regarding complexity lower bounds for computational problems, as well as to explore new techniques to attack these questions. Together with Maurice Jansen, who was the main researcher on the project, I pursued these objectives in the context of arithmetic complexity, which studies the minimum number of arithmetic operations required to compute specified polynomials. In a series of papers published in top conferences in the area (ICALP 2011, ITCS 2012, STACS 2012), we generalized known lower bounds for computing the Permanent polynomial, and strengthened the connections between arithmetic complexity lower bounds and efficient solvability of the arithmetic circuit identity testing (ACIT) problem, which asks whether a given arithmetic circuit computes the zero polynomial.
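To illustrate the flavour of the ACIT problem: the classical randomized approach (the Schwartz-Zippel test, a standard tool rather than a result of this project) evaluates the circuit at a random point over a large finite field, since a nonzero low-degree polynomial rarely vanishes at a random point. A minimal sketch in Python, using a hypothetical expression-tree encoding of circuits:

```python
import random

# Hypothetical circuit encoding: each node is ("+", l, r), ("*", l, r),
# ("var", i) for the i-th variable, or ("const", c).

def evaluate(node, point, p):
    """Evaluate the circuit at a point over the field Z_p."""
    op = node[0]
    if op == "const":
        return node[1] % p
    if op == "var":
        return point[node[1]] % p
    left = evaluate(node[1], point, p)
    right = evaluate(node[2], point, p)
    return (left + right) % p if op == "+" else (left * right) % p

def probably_zero(circuit, n_vars, degree_bound, p, trials=20):
    """Randomized identity test (Schwartz-Zippel): if the circuit computes
    a nonzero polynomial of degree <= degree_bound < p, each random point
    exposes it with probability >= 1 - degree_bound/p."""
    for _ in range(trials):
        point = [random.randrange(p) for _ in range(n_vars)]
        if evaluate(circuit, point, p) != 0:
            return False  # nonzero evaluation: not the zero polynomial
    return True  # very likely the zero polynomial

# (x + y) * (x - y) - (x*x - y*y) is identically zero.
x, y = ("var", 0), ("var", 1)
c = ("+",
     ("*", ("+", x, y), ("+", x, ("*", ("const", -1), y))),
     ("*", ("const", -1),
      ("+", ("*", x, x), ("*", ("const", -1), ("*", y, y)))))
print(probably_zero(c, 2, 2, 1_000_003))  # True
```

The open question driving the work described above is whether this use of randomness can be eliminated: a deterministic polynomial-time algorithm for ACIT is known to imply circuit lower bounds.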

Our results assume added significance in light of recent developments such as the Boolean circuit lower bound of Williams (CCC 2011), which take advantage of connections between lower bounds and solvability of the satisfiability problem, the Boolean analogue of the ACIT problem. By extending such connections to the arithmetic complexity setting, one might hope to solve long-standing open problems such as showing a super-polynomial arithmetic circuit lower bound for the Permanent. In a joint paper with Lance Fortnow (ICALP 2011), I introduced a new notion of lower bound for computational problems, and showed that Williams' lower bound could be strengthened to a lower bound under the new notion. More recently, in joint work with Williams (ECCC 2012), I showed new lower bounds for uniform circuits, improving a 30-year-old result of Kannan, and extended Williams' connection between the satisfiability problem and lower bounds to the validity problem for quantified Boolean formulas.
Exploitation Route The findings are likely to lead to progress in arithmetic complexity and circuit complexity. They are mainly of interest to complexity theory, but potentially also to algorithm design and cryptography.
Sectors Education, Other

 
Description The relevant papers have been cited a total of 28 times already.
 
Description EPSRC
Amount £13,750 (GBP)
Funding ID EPSRC Unspent Balance
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 12/2011 
End 03/2012