Solving Parity Games in Theory and Practice

Lead Research Organisation: University of Liverpool
Department Name: Computer Science

Abstract

Parity games are an intriguing problem class, because they are simple to state and have proven to be resistant to countless attempts to classify their complexity. At the same time, algorithms for solving parity games play a paramount role in model checking, satisfiability checking, and synthesis.

In this project, we will pursue the following objectives.

1. Determine or narrow down the complexity of solving parity and payoff games.

The fundamental open question is whether solving parity games is in P. We will work in both directions, trying to improve the known upper and lower bounds and studying the relation between different types of parity and payoff games.

1a. Upper bounds

Assuming that solving parity games is tractable, the gold-standard solution would be to find a polynomial-time algorithm for solving parity games. A second-best upper bound would be to establish a fixed-parameter tractable algorithm with the number of priorities as the parameter. Further interesting questions are improving the dependency on the number of priorities, which has previously been improved from n^p through n^(p/2) to n^(p/3) for parity games with n states and p priorities, and improving the known sub-exponential bounds, which are currently of the order n^(√n).

Another branch of research under this item is to estimate the complexity of known algorithms with unknown complexity.
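For concreteness, a classical algorithm whose complexity analysis drives such questions is Zielonka's recursive algorithm, which is exponential in the worst case but often fast in practice. The sketch below is our own minimal illustration (the representation by vertex sets, edge lists, and owner/priority maps is chosen for brevity, not taken from any tool; it assumes every vertex has at least one successor):

```python
def attractor(vertices, edges, owner, target, player):
    """Vertices from which `player` can force the play into `target`."""
    attr = set(target)
    changed = True
    while changed:
        changed = False
        for v in vertices:
            if v in attr:
                continue
            succs = [w for (u, w) in edges if u == v and w in vertices]
            if owner[v] == player and any(w in attr for w in succs):
                attr.add(v); changed = True
            elif owner[v] != player and succs and all(w in attr for w in succs):
                attr.add(v); changed = True
    return attr

def zielonka(vertices, edges, owner, priority):
    """Return (win_even, win_odd) for a parity game with total edge relation."""
    if not vertices:
        return set(), set()
    d = max(priority[v] for v in vertices)
    player = d % 2                       # the player who benefits from priority d
    top = {v for v in vertices if priority[v] == d}
    a = attractor(vertices, edges, owner, top, player)
    rest = vertices - a
    sub = [(u, w) for (u, w) in edges if u in rest and w in rest]
    w0, w1 = zielonka(rest, sub, owner, priority)
    win_opp = w1 if player == 0 else w0
    if not win_opp:
        # the opponent wins nothing in the subgame: `player` wins everywhere
        return (vertices, set()) if player == 0 else (set(), vertices)
    b = attractor(vertices, edges, owner, win_opp, 1 - player)
    rest2 = vertices - b
    sub2 = [(u, w) for (u, w) in edges if u in rest2 and w in rest2]
    w0b, w1b = zielonka(rest2, sub2, owner, priority)
    return (w0b, w1b | b) if player == 0 else (w0b | b, w1b)
```

The recursion removes the attractor of the highest priority, solves the remaining subgame, and only re-solves when the opponent wins part of it; the number of recursive calls is the source of the exponential worst case.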

1b. Lower bounds

Assuming that parity games are not tractable, finding a super-polynomial lower bound would be the gold-standard solution. However, no non-trivial lower bounds are currently available, and conditional lower bounds, based, e.g., on the strong exponential time hypothesis, could provide a starting point for an intractability proof.

1c. Connecting 2-player parity games with mean payoff, discounted payoff, and simple stochastic games.

There is a simple polynomial-time reduction from parity games through mean payoff and discounted payoff games to simple stochastic games. The latter are polynomial-time equivalent to the 2.5-player (two antagonistic players and a random player) versions of all of these games. We will research reductions in the opposite direction.
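The first step of this chain, from parity to mean payoff games, rests on a standard weight assignment: a vertex of priority p in a game with n vertices receives the weight (-1)^p * n^p, so that the even player wins the parity game from a vertex if and only if the mean-payoff value there (with the even player maximising) is non-negative. A minimal sketch of that assignment (the dictionary representation is our own illustration):

```python
def parity_to_mean_payoff(priority, n):
    """Turn parity priorities into mean-payoff weights.

    A vertex of priority p gets weight (-1)^p * n^p, where n is the
    number of vertices: even priorities become large positive weights,
    odd priorities large negative ones, and higher priorities dominate
    all lower ones along any cycle.
    """
    return {v: (-1) ** p * n ** p for v, p in priority.items()}
```

The game graph and the partition of vertices between the two players are carried over unchanged; only the winning condition is translated.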

2. To describe classes of parity and payoff games that can be solved efficiently.

Many different cases where parity games can be solved in polynomial time are known. This prominently includes games with a bounded number of priorities, but also games with a bounded number of nodes of either player (especially one player games) and games with arenas that have a simple structure, such as bounded treewidth, DAG width, clique width, Kelly width, or entanglement.
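As an illustration of the one-player case: if the even player owns every vertex, she wins from exactly those vertices from which she can reach a cycle whose maximum priority is even, and this is decidable in polynomial time by plain reachability checks. The sketch below works under that assumption (the representation is our own illustration):

```python
def one_player_even_wins(vertices, edges, priority):
    """Winning set for Even in a one-player parity game (Even owns all
    vertices).  Even wins from v iff v can reach a cycle with even maximum
    priority, i.e. some vertex u of even priority d that lies on a cycle
    using only priorities <= d."""
    def reachable(src, allowed):
        # vertices reachable from src via non-empty paths through `allowed`
        seen, stack = set(), [src]
        while stack:
            x = stack.pop()
            for (a, b) in edges:
                if a == x and b in allowed and b not in seen:
                    seen.add(b); stack.append(b)
        return seen

    good = set()                        # anchors of favourable cycles
    for u in vertices:
        d = priority[u]
        if d % 2 == 0:
            allowed = {v for v in vertices if priority[v] <= d}
            if u in reachable(u, allowed):
                good.add(u)
    # Even wins from every vertex that can reach a good anchor
    return {v for v in vertices if good & (reachable(v, vertices) | {v})}
```

Each of the O(n) anchor checks is a linear-time graph search, so the whole procedure is polynomial; the two-player case is exactly what makes the general problem hard.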

We will further the research on restricted classes of graphs with weaker restrictions, such as locally bounded treewidth, excluded minors, and nowhere dense graphs, and extend this analysis to payoff games. We will also advance current research on combining partial solvers, i.e. polynomial-time algorithms that solve only sub-classes of parity games, with the aim of defining increasingly larger classes of games that can be solved in polynomial time.

3. To develop algorithms for solving parity and payoff games with good performance.

To approach this goal, we will use the insights from the first two work packages to develop further algorithms with good performance, especially strategy improvement algorithms. In addition, we will study distributed algorithms for solving parity games, harnessing the power of GPUs for solving parity and payoff games.

4. To develop fast algorithms for solving parity and payoff games.

Finally, we will implement the most promising algorithms in model checking and synthesis tools.

Planned Impact

The results of this project will lead to faster solvers for model checking and the broader field of verification. Verification has been identified as a growth area by the EPSRC due to the increasing importance of provably reliable software, for instance in avionic systems and healthcare. Being at the forefront of this algorithmic and technological advancement will give the United Kingdom an edge in the emerging key field of provably reliable systems.

This technological advantage will translate into economic benefits, with the expertise for this high-end market concentrated in the United Kingdom. The ability to verify the functional correctness of software and hardware will also lead to safer products and thus to a rise in quality standards that will benefit customers of high-end products where reliability is key. Examples of such products are medical devices and automotive systems.

Once introduced into the production chain, we expect these techniques to develop progressively from niche techniques for high-end products into techniques commonly used in increasingly wide areas. This will strengthen the economic impact of the quality and reliability gains obtained through functional verification and make the advantage of high reliability available to an increasing share of the growing market for software and hardware products.

We have put measures in place to facilitate these benefits, including plans for follow-up research and PhD projects. We plan to strengthen our communication links with industry to discuss the potential of more applied follow-up projects, including KTP and CASE projects, European H2020 FET projects, and industry-funded projects.

In the shorter term, we have put a public engagement policy in place to keep the interested public informed of our progress and findings.

Publications


Aceto L (2019) Adventures in monitorability: from branching to linear time and back again in Proceedings of the ACM on Programming Languages

Apt K (2018) Verification of Distributed Epistemic Gossip Protocols in Journal of Artificial Intelligence Research

Apt K (2019) Open Problems in a Logic of Gossips in Electronic Proceedings in Theoretical Computer Science

Apt K (2017) Common Knowledge in a Logic of Gossips in Electronic Proceedings in Theoretical Computer Science

Apt K (2022) Coordination Games on Weighted Directed Graphs in Mathematics of Operations Research

Bloem R (2017) CTL* synthesis via LTL synthesis in Electronic Proceedings in Theoretical Computer Science

 
Description We have developed two of the new class of quasi-polynomial algorithms for solving parity games (one counting-based and one recursive).
Exploitation Route Improve the understanding of parity and related games (such as payoff games) further. Possibly establishing a class of relevant quasi-polynomial time problems.
Sectors Digital/Communication/Information Technologies (including Software), Other

 
Description (FOCETA) - FOUNDATIONS FOR CONTINUOUS ENGINEERING OF TRUSTWORTHY AUTONOMY
Amount € 4,985,540 (EUR)
Funding ID 956123 
Organisation European Commission 
Sector Public
Country European Union (EU)
Start 10/2020 
End 09/2023
 
Description (FouCo) - Foundations of Composition
Amount € 212,934 (EUR)
Funding ID 892704 
Organisation European Commission 
Sector Public
Country European Union (EU)
Start 04/2020 
End 03/2022
 
Description (SyGaST) - Synthesising Game Solving Techniques
Amount € 212,934 (EUR)
Funding ID 101032464 
Organisation European Commission 
Sector Public
Country European Union (EU)
Start 07/2021 
End 06/2023