Polytope methods in parameterized complexity

Lead Research Organisation: Royal Holloway University of London
Department Name: Computer Science

Abstract

Linear Programming is a mathematical problem-solving tool that has proven
immensely useful in industrial planning, operational research, and in
mathematical optimisation more generally. Over the decades since its
inception, a deep and rich mathematical theory has developed around it,
which has become a central part of the field of theoretical computer
science. The field has also spawned multiple commercial companies,
including ILOG (now owned by IBM), which developed the CPLEX optimisation
suite; IBM credits CPLEX with business-efficiency improvements that in
multiple cases have yielded savings of hundreds of millions of dollars.

However, there is a disconnect between the theoretical and practical
strands of this research. In theoretical computer science, the focus is on
methods with absolute performance guarantees, i.e., guarantees (in terms
of the efficiency of the algorithm and the quality of the produced
solution) that hold for every possible input to the algorithm in
those for which such "universal worst-case" guarantees are possible. On
the other hand, methods employed in practice, such as branch-and-bound and
branch-and-cut, are known to have great success with many "real-world"
instances, despite being highly inefficient in the rare worst case. In
other words, the coarse-grained problem view of theoretical computer
science leads to unnecessarily pessimistic conclusions.

We propose a study of combinatorial optimisation, and in particular of the
power of linear programming tools and branch-and-bound-type methods, from
the perspective of parameterized complexity. In parameterized complexity,
the coarse-grained view described above is replaced by a more
fine-grained, multivariate view of problem complexity, where the
tractability of "easy" problem instances can be explained by some parameter
of these instances being bounded, i.e., a structural parameter is used to
capture and quantify the relative difficulty of instances.

This perspective has recently seen some success: branch-and-bound
algorithms have been shown to have provably good performance for certain
problems, under the assumption that the so-called "integrality gap" of the
instances is bounded (a condition that is also known to be relevant in
practice). These results build upon very particular structural properties
of the linear programming formulation of the problem, referred to as
persistence and half-integrality -- properties that have not previously
been fully investigated by the theory community, possibly because their
value is not apparent from a strictly coarse-grained worst-case
perspective.

This project will investigate the conditions for such structural
properties in several ways, thereby laying the foundations for a theory of
structured problem relaxations, and will use these tools to develop new and
useful algorithms for a range of important problems, both branch-and-bound
based and more traditional combinatorial ones.

Planned Impact

This research has the potential for significant long-term impact in the general areas of operational research, combinatorial optimisation, and discrete mathematics.

First and foremost, Mixed-Integer Linear Programming (MILP) methods are an essential tool for solving practical optimisation problems, both in an industry setting and more broadly. The research of this project points in several ways towards future investigations which could both improve our understanding of the efficiency of these methods, and lead to the development and deployment of improved methods for solving these problems in practice.

Regarding the former of these developments, it is well known that MILP-solving methods are frequently efficient in practice, despite having no non-trivial theoretical performance guarantees in general. That is, there is a significant gap between theory and practice, where the (worst-case based) predictions from theory are excessively pessimistic when compared to empirical real-world performance.

An analogous situation appears for SAT solvers, where modern CDCL (conflict-driven, clause-learning) SAT solvers are commonly praised as a near-universal solution method for several types of problems (including large-scale circuit verification problems), while on the other hand no non-trivial general performance guarantees are believed to be possible. In this case, a well-established theory of so-called backdoor sets has been developed to explain the efficiency of these methods.
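
As a rough illustration of the backdoor-set idea (not taken from this project), the sketch below brute-forces the assignments to a small candidate backdoor set and lets a polynomial-time subsolver, here plain unit propagation, decide each branch. The formula, the chosen backdoor set, and the subsolver are all illustrative assumptions.

```python
# Minimal sketch of exploiting a "strong backdoor set": enumerate assignments
# to the backdoor variables and let a polynomial-time subsolver (here, unit
# propagation) decide the rest.  Formula, backdoor and subsolver are
# illustrative assumptions, not taken from the project.
from itertools import product

def unit_propagate(clauses, assignment):
    """Assign forced literals until a fixpoint; return 'SAT', 'CONFLICT' or
    'UNDECIDED' together with the (partial) assignment reached."""
    assignment = dict(assignment)
    while True:
        forced, open_clauses = None, 0
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue                       # clause already satisfied
            free = [l for l in clause if abs(l) not in assignment]
            if not free:
                return 'CONFLICT', assignment  # clause falsified
            if len(free) == 1:
                forced = free[0]               # unit clause: literal is forced
                break
            open_clauses += 1
        if forced is not None:
            assignment[abs(forced)] = forced > 0
            continue
        return ('SAT' if open_clauses == 0 else 'UNDECIDED'), assignment

def solve_with_backdoor(clauses, backdoor_vars):
    """Brute-force the backdoor variables; if backdoor_vars is a strong
    backdoor w.r.t. unit propagation, every branch is decided by the subsolver."""
    for values in product([False, True], repeat=len(backdoor_vars)):
        status, model = unit_propagate(clauses, dict(zip(backdoor_vars, values)))
        if status == 'SAT':
            return model                       # satisfying (partial) assignment
        if status == 'UNDECIDED':
            raise ValueError("not a strong backdoor for this subsolver")
    return None                                # all branches conflict: UNSAT

# Tiny example: clauses are lists of signed variable numbers; {3} happens to
# be a strong backdoor for unit propagation on this formula.
clauses = [[1, 2], [-1, 3], [-2, 3], [-3, 4], [-3, -4, 2]]
print(solve_with_backdoor(clauses, backdoor_vars=[3]))
```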

One possible outcome of a future extension of this project in such a direction is a parameterized study of the efficiency of MILP solvers, analogous to the theory of backdoor sets for SAT. Preliminary research in this project points in this direction, including the study of the feasibility of using gap parameters in FPT algorithms.

There are also connections to be made regarding the identification of particularly effective cutting plane methods and well-solvable problem categories, extending the well-known class of max-flow problems.

Less broadly, the present project also suggests new algorithmic approaches for solving several problems which are of practical importance, in particular the classes of 0-Extension and Metric Labelling problems, which are natural problem formulations, motivated by practical applications. These approaches include fixed-parameter tractable cutting plane approaches (also known as branch-and-cut algorithms) for both natural parameters and gap parameters.

In addition, further long-term impact is possible via generalisations of matroid theory and related topics in discrete mathematics, subject to successful outcomes of the corresponding parts of the project.

That said, all of the above impact cases would necessarily extend significantly beyond the end of the proposed lifetime of this project. The chief contribution of this project towards those impact stories would be to initiate these areas as fruitful avenues of research investigation, and to function as a "seed example" for research to be built upon in later stages.

In turn, the best way to achieve such long-term impact will be to solidly ground the proposed research as relevant to the theoretical computer science community, and to engage actively with other researchers in related, relevant fields.

On a shorter time scale, although still reaching beyond the end of the planned project, algorithms developed in the scope of this project may also be of direct practical relevance. Ways to ensure this include the development of reference (open source) software implementations, either as independent software libraries or within the scope of existing projects such as COIN-OR, and to actively seek out and engage with practitioners where connections can be made.

Publications


Gutin G (2019) Path-contractions, edge deletions and connectivity preservation in Journal of Computer and System Sciences

Gutin G (2018) k-distinct in- and out-branchings in digraphs in Journal of Computer and System Sciences

Kratsch S (2018) Multi-budgeted directed cuts

Kratsch S (2019) Multi-budgeted Directed Cuts in Algorithmica

Kratsch S. (2019) Multi-budgeted directed cuts in Leibniz International Proceedings in Informatics, LIPIcs

 
Description We note the two most significant findings; more exist, but do not fit the space limit.

1. A combinatorial condition for LP-based branch-and-bound FPT algorithms (SODA 2017).

Branch-and-bound algorithms over half-integral LP-relaxations lie behind some of the most powerful FPT results in combinatorial optimization, in particular providing efficient running times for seemingly difficult problems. However, constructing such relaxations remains a difficult challenge, and it is not clear, for a given problem, how to systematically test whether one exists.
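
As a deliberately simple illustration of the kind of object discussed here (and not one of the relaxations constructed in the paper), the sketch below runs LP-based branch-and-bound for Vertex Cover, whose textbook relaxation is half-integral: basic optima take only the values 0, 1/2 and 1, so branching only ever needs to resolve half-valued variables. The toy instance and the use of scipy's LP solver are assumptions made for the example.

```python
# Hedged sketch: LP-based branch-and-bound over the half-integral Vertex Cover
# relaxation (a textbook example, not the relaxations from the paper).
import numpy as np
from scipy.optimize import linprog

def vc_lp(edges, n, fixed):
    """Vertex Cover LP: min sum x_v  s.t.  x_u + x_v >= 1 per edge, x in [0,1],
    with some variables fixed to 0 or 1 by earlier branching decisions."""
    A = np.zeros((len(edges), n))
    for i, (u, v) in enumerate(edges):
        A[i, u] = A[i, v] = -1.0                  # -(x_u + x_v) <= -1
    bounds = [(fixed.get(v, 0.0), fixed.get(v, 1.0)) for v in range(n)]
    return linprog(c=np.ones(n), A_ub=A, b_ub=-np.ones(len(edges)),
                   bounds=bounds, method="highs")

def branch_and_bound(edges, n):
    best = {"value": float(n), "x": np.ones(n)}   # trivial cover: all vertices

    def recurse(fixed):
        res = vc_lp(edges, n, fixed)
        if not res.success or res.fun >= best["value"] - 1e-9:
            return                                # infeasible, or pruned by LP bound
        halves = [v for v in range(n) if 1e-6 < res.x[v] < 1 - 1e-6]
        if not halves:                            # integral optimum: new incumbent
            best["value"], best["x"] = res.fun, np.round(res.x)
            return
        v = halves[0]                             # half-integral variable: branch
        for value in (1.0, 0.0):
            recurse({**fixed, v: value})

    recurse({})
    return best["value"], best["x"]

# Toy instance: a 5-cycle; the LP optimum is all-1/2 (value 2.5) while the
# best integral cover has size 3, so branching on half-valued variables
# quickly closes the gap.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(branch_and_bound(edges, n=5))
```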

The main approach to constructing such objects involves the formulation of a problem as an optimization problem with purely local constraints (known as a Valued CSP or a VCSP), followed by the derivation of abstract mathematical properties of said VCSP. This approach is powerful when applicable, but testing applicability is highly non-trivial, especially since many optimization problems are more naturally defined in the language of graph theory, and in terms that do not in an obvious way reduce to purely local constraints.

In this work, we provide an alternative approach, which captures and extends most of the more powerful results mentioned above while also providing a purely combinatorial condition for applicability to new problems. The condition draws on a notion from matroid theory known as biased graphs, but it is formulated entirely in the language of graph theory; the connection to matroid theory is purely inspirational.

In addition to the main result, another contribution of this work is a method for reducing the solution of a global optimization problem to the solution of a sequence of "rooted" (local) optimization problems that are computationally much better behaved.


2. FPT results for 0-Extension and Metric Labelling problems (ICALP 2018).

Metric Labelling is a natural problem with multiple applications and a rich history of study in terms of approximation algorithms (along with its restricted variant, known as 0-Extension). The problem is loosely expressible as the task of extending a partially given labelling of a data set to a labelling of the entire data set, while minimizing a particular "inconsistency score" for the resulting labelling. We undertake the first study of these problems from the perspective of parameterized complexity.
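
To make the objective concrete, the hedged sketch below spells out the standard Metric Labelling cost (per-point assignment costs plus, for each weighted edge, the edge weight times the metric distance between its endpoints' labels) and extends a partial labelling by brute force on a toy instance; all instance data are illustrative, and the exhaustive search is exactly what real algorithms avoid.

```python
# Hedged sketch of the standard Metric Labelling objective, solved by brute
# force on a toy instance (illustrative data only).
from itertools import product

def labelling_cost(labels, edges, assign_cost, metric):
    """Assignment costs per data point plus, for every weighted edge, the edge
    weight times the metric distance between the two endpoint labels."""
    cost = sum(assign_cost[v][labels[v]] for v in range(len(labels)))
    cost += sum(w * metric[labels[u]][labels[v]] for u, v, w in edges)
    return cost

def brute_force_labelling(n, k, edges, assign_cost, metric, partial):
    """Extend the partial labelling `partial` to all n points, minimizing cost."""
    best, best_lab = float("inf"), None
    for lab in product(range(k), repeat=n):
        if any(lab[v] != l for v, l in partial.items()):
            continue                   # must respect the given partial labelling
        c = labelling_cost(lab, edges, assign_cost, metric)
        if c < best:
            best, best_lab = c, lab
    return best, best_lab

# Toy instance: 4 points on a path, 2 labels, uniform metric d(a,b) = [a != b];
# with the endpoints pre-labelled this is simply a minimum s-t cut.
edges = [(0, 1, 2.0), (1, 2, 1.0), (2, 3, 2.0)]      # (u, v, weight)
assign_cost = [[0.0, 0.0] for _ in range(4)]          # no label preferences
metric = [[0.0, 1.0], [1.0, 0.0]]
print(brute_force_labelling(4, 2, edges, assign_cost, metric, partial={0: 0, 3: 1}))
```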

We show tractability for these problems in a wide range of settings, using a range of algorithmic approaches. We single out two results here. First, we note a powerful preprocessing algorithm (a.k.a. kernelization), which simplifies and reduces an instance without sacrificing solution quality; second, using the VCSP approach we show that if the metric embeds into a tree metric (which is the approach taken in most approximation work), then Metric Labelling has an algorithm whose running time is parameterized purely by the relaxation gap.
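
The kernel obtained in the paper is specific to these problems and is not reproduced here; purely as a generic illustration of what a kernelization does (shrink the instance in polynomial time without changing the answer for the given parameter), the sketch below gives the classic Buss kernel for Vertex Cover parameterized by the cover size k.

```python
# Hedged sketch: the classic Buss kernelization for Vertex Cover, shown only as
# a generic example of preprocessing with a quality guarantee; it is not the
# kernel developed for Metric Labelling / 0-Extension.
def buss_kernel(edges, k):
    """Return (reduced_edges, remaining_budget, forced_vertices), or None when
    the instance provably has no vertex cover of size at most k."""
    edges = {frozenset(e) for e in edges}
    forced = set()
    while True:
        budget = k - len(forced)
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        high = [v for v, d in degree.items() if d > budget]
        if not high:
            break
        # A vertex of degree > budget must be in every small enough cover.
        v = high[0]
        forced.add(v)
        edges = {e for e in edges if v not in e}
    budget = k - len(forced)
    if len(forced) > k or len(edges) > budget * budget:
        return None   # each of <= budget cover vertices covers <= budget edges
    return edges, budget, forced

# Example: a star with 5 leaves plus one extra edge, parameter k = 2; the
# centre is forced into the cover and a single-edge kernel remains.
print(buss_kernel([(0, i) for i in range(1, 6)] + [(1, 2)], k=2))
```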

The latter algorithm uses a generalization of the branch-and-bound technique mentioned above, which imposes weaker restrictions on the relaxation than previous work and should point the way towards further applications of the approach.
Exploitation Route The research has long-term relevance to the fields of computational optimization and operational research, which are part of the digital sector but also of general importance to many sectors of the economy, as optimization problems are encountered in many areas.

The work on a combinatorial condition for half-integral LP-relaxations (via biased graphs) has attracted significant interest in the community and inspired follow-up work, in particular the development of highly efficient (linear-time) algorithms for several of the problems, obtained by replacing the otherwise costly LP-solving step with a combinatorial path-packing algorithm.

The "weaker restriction" on the required properties of an LP-relaxation (technically, the Domain Consistency Property) holds the promise of having broader applicability than the restrictive "persistence" property and could similarly be picked up by researchers in these areas.

The algorithms for 0-Extension could also be of broader interest, especially as they are relatively simple and come with strong correctness guarantees.
Sectors Digital/Communication/Information Technologies (including Software)

 
Description PD 18 
Organisation Paris Dauphine University
Country France 
Sector Academic/University 
PI Contribution Direct research collaboration, i.e., expertise and direct input. Working on a joint research project.
Collaborator Contribution Hosted me as a guest. Provided travel and lodging. Also the hosts provided direct research collaboration, i.e., input of expertise to a joint research project.
Impact Publication: Eun Jung Kim, Stefan Kratsch, Marcin Pilipczuk, Magnus Wahlström: Solving hard cut problems via flow-augmentation. SODA 2021: 149-168
Start Year 2018