Random structures, spin glasses and efficient algorithms

Lead Research Organisation: University of Warwick
Department Name: Mathematics


An algorithm is a systematic procedure for solving a computational problem that can be implemented on a computer. An example is the Gaussian elimination method for solving a system of linear equations. The running time of an algorithm is the number of elementary steps (e.g., addition, modification of a symbol in a string, etc.) that the algorithm performs. Of course, the running time depends on the size of the input. For example, in the case of Gaussian elimination the size of the input is the number of symbols needed to write down (or enter) the linear system of equations. Denote this quantity by m. Then the running time of Gaussian elimination is bounded by m^3. Generally, an algorithm is considered efficient if for every possible input its running time is bounded by a polynomial in the size of that input. (Hence, Gaussian elimination is efficient.)

In spite of intensive research since the early days of computing, there is a broad class of computational problems for which no efficient algorithms are known. In terms of complexity theory, most of these problems can be classified as NP-hard. One example is the Boolean satisfiability problem (SAT). In this problem the input is a Boolean formula, and the objective is to find an assignment to the Boolean variables that satisfies the entire formula (if such a satisfying assignment exists). Although the SAT problem is NP-hard, it occurs as a sub-problem in countless real-world applications. In fact, SAT is of similarly eminent importance in Computer Science as solving polynomial equations is in Algebra. Therefore, an immense amount of research deals with heuristic algorithms for SAT.
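To make the problem concrete, here is a minimal sketch (our own illustration, not part of the project) of a brute-force SAT solver. It tries all 2^n assignments, which is exactly the exponential behaviour that efficient algorithms must avoid; the clause encoding follows the common DIMACS convention.

```python
from itertools import product

def brute_force_sat(num_vars, clauses):
    """Exhaustively search for a satisfying assignment.

    A formula is a list of clauses; each clause is a list of
    non-zero integers, where i stands for variable i and -i for
    its negation (DIMACS convention).  Trying all 2^n assignments
    takes exponential time in the number of variables -- precisely
    why SAT is not known to admit an efficient algorithm.
    """
    for bits in product([False, True], repeat=num_vars):
        assignment = {i + 1: b for i, b in enumerate(bits)}
        # a clause is satisfied if at least one of its literals is true
        if all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses):
            return assignment
    return None  # no assignment works: the formula is unsatisfiable

# (x1 or x2) and (not x1 or x2): satisfied, e.g., by setting x2 to True
print(brute_force_sat(2, [[1, 2], [-1, 2]]))
```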
The goal of this line of research is to devise algorithms that can efficiently solve as general a class of SAT inputs as possible (although none of these methods solves all possible inputs efficiently). Despite this bulk of work, it remains extremely simple to generate empirically hard problem instances that elude all of the known heuristic algorithms. The easiest way to do so is by drawing a SAT formula at random (from a suitable but very simple probability distribution). Indeed, random input instances were considered prime examples of hard inputs to such an extent that it was proposed to exploit their hardness in cryptographic applications. Random SAT formulas also occur prominently in the seminal work on Algorithms and Complexity from the 1970s, where their empirical hardness was reckoned most vexing. However, it remained unknown why these types of instances eluded all known algorithms (let alone how else to cope with these inputs).

Therefore, it came as a surprise when statistical physicists reported that a new algorithm called Survey Propagation (SP) experimentally solves these hard SAT inputs efficiently. Indeed, a naive implementation of SP solves within seconds sample instances with a million variables, while even the most advanced previous SAT solvers struggle to solve inputs with a few hundred variables. SP comes with a sophisticated but mathematically non-rigorous analysis based on ideas from spin glass theory. This analysis suggests why all prior algorithms perform so badly. Its key feature is that it links the difficulty of solving a SAT input to geometric properties of the set of solutions. Though the physics methods have inspired the SP algorithm, they do not provide a satisfactory explanation for the success (or the limitations) of SP. Therefore, the goal of this project is to study these new ideas from spin glass theory from a Computer Science perspective via mathematically rigorous methods.
On the one hand, we are going to provide a rigorous analysis of SP to classify what types of inputs it can solve. On the other hand, we intend to study the behaviour of algorithms from the point of view of the solution space geometry; this perspective has not been studied systematically in Algorithms and Complexity before.
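The random formulas discussed above come from a standard and very simple model: m clauses over n variables, each clause built from k distinct variables negated with probability 1/2. A short sketch (our own, for illustration) of this generator:

```python
import random

def random_ksat(n, m, k, rng=random):
    """Draw a random k-SAT formula: m clauses over n variables.

    Each clause picks k distinct variables uniformly at random and
    negates each independently with probability 1/2.  Literals use
    the DIMACS convention (i = variable i, -i = its negation).  The
    empirically hard regime is a clause density m/n near the
    satisfiability threshold (about 4.27 for k = 3).
    """
    formula = []
    for _ in range(m):
        variables = rng.sample(range(1, n + 1), k)
        formula.append([v if rng.random() < 0.5 else -v for v in variables])
    return formula

# a random 3-SAT instance with 100 variables at clause density 4.2
formula = random_ksat(100, 420, 3)
```

Instances drawn this way at the right density are exactly the kind that defeat classical heuristics yet are handled experimentally by Survey Propagation.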


Coja-Oghlan A (2017) Belief Propagation Guided Decimation Fails on Random Formulas in Journal of the ACM

Coja-Oghlan A (2015) On independent sets in random graphs in Random Structures & Algorithms

Coja-Oghlan A (2010) A Better Algorithm for Random k-SAT in SIAM Journal on Computing

Coja-Oghlan A (2012) The Decimation Process in Random k-SAT in SIAM Journal on Discrete Mathematics

Related Projects

Project Reference | Relationship | Related To   | Start      | End        | Award Value
EP/G039070/1      |              |              | 01/04/2009 | 31/12/2009 | £355,965
EP/G039070/2      | Transfer     | EP/G039070/1 | 01/01/2010 | 31/08/2012 | £295,184
Description The goal was to study the impact of ideas from statistical mechanics on the structure and algorithmic solvability of random constraint satisfaction problems. The project has enabled us to make very significant progress towards this research goal, not only through the concrete results that have been obtained but also through the development of new analytical techniques. This progress has paved the way for the PI to secure follow-up funding from the European Research Council ("Phase transitions and computational complexity", grant id 278857).

This project has led to several publications in top journals and conference proceedings (e.g., the SIAM Journal on Computing or the ACM-SIAM Symposium on Discrete Algorithms, SODA). The discussion of the concrete results follows the steps laid out in the original case for support.
1. Analysing the solution space
Perhaps the most important open problem regarding the geometry of the solution space has been the question of whether a "condensation phase transition" exists, as predicted by physicists on the basis of non-rigorous arguments (PNAS 2007). A major success of this project has been a rigorous proof that this is indeed the case [6]. In fact, this work led to a very illuminating combinatorial interpretation of the condensation transition. We expect that this will have an impact on both computer science and statistical mechanics, where condensation has been a key topic since the experimental work of Kauzmann on supercooled liquids in the 1940s.
2. Greedy algorithms, local search algorithms, and random walks
One of the strongest contributions of this project has been a new local search algorithm for the random k-SAT problem [1]. The new algorithm succeeds asymptotically up to the "dynamic phase transition" predicted by physicists and thus outperforms previously analysed algorithms substantially for general values of k. The conference version of [1] received the "best paper award" at ICALP 2009 (track A), the top theory venue in Europe.

A second key contribution was the sampling algorithm for colourings of random graphs [7]. This paper established a new paradigm for sampling colourings, a benchmark problem in this area, and led to a vast improvement over previous sampling techniques.

Further contributions deal with the Walksat algorithm [4] and the Metropolis process [3], two important random-walk type algorithms.
3. Message passing algorithms
By developing new techniques we accomplished the first rigorous analysis of Belief Propagation guided decimation for the k-SAT problem in a non-trivial setting [2]. This marks a significant first step towards a rigorous understanding of message passing algorithms on random constraint satisfaction problems. The sobering result of [2] is that BP decimation does not outperform the local search algorithms from [1] in this setup. The paper [2] establishes a link between BP and a certain concept of quasi-randomness. In subsequent work [5] we established a connection between the analysis from [2] and the phase transitions predicted on the basis of physics methods.
[1] A. Coja-Oghlan: A better algorithm for random k-SAT. SIAM Journal on Computing 39 (2010) 2823-2864.
[2] A. Coja-Oghlan: On belief propagation guided decimation for random k-SAT. Proc. 22nd SODA (2011) 957-966.
[3] A. Coja-Oghlan, C. Efthymiou: On independent sets in random graphs. Proc. 22nd SODA (2011) 136-144.
[4] A. Coja-Oghlan, A. Frieze: Analyzing Walksat on random formulas. Proc. 9th ANALCO (2012) 48-55.
[5] A. Coja-Oghlan, A. Pachon-Pinzon: The decimation process in random k-SAT. SIAM Journal on Discrete Mathematics, in press.
[6] A. Coja-Oghlan, L. Zdeborova: The condensation transition in random hypergraph 2-coloring. Proc. 23rd SODA (2012)
[7] C. Efthymiou: A simple algorithm for random colouring G(n, d/n) using (2 + epsilon)d colours. Proc. 23rd SODA (2012)
Exploitation Route This progress has paved the way for the PI to secure follow-up funding from the European Research Council ("Phase transitions and computational complexity", grant id 278857).
Sectors Other