Efficient Algorithms for Mechanism Design Without Monetary Transfer

Lead Research Organisation: University of Liverpool
Department Name: Computer Science

Abstract

Matching problems involve assigning agents to resources, such as junior doctors to hospitals, or kidney patients to donors, based on preference lists expressed by the agents. They have important, large-scale practical applications in the centralised matching schemes that perform allocations of these kinds in a number of countries.

When designing mechanisms for these matching problems, two issues present research challenges going forward: (i) optimality (based on the social welfare of the agents involved) and (ii) truthfulness.

These two desirable properties may be conflicting. This led Procaccia and Tennenholtz to introduce the idea of approximate mechanism design without money: the design of truthful mechanisms that may lose some optimality in the constructed solutions, but whose loss can be theoretically bounded.

In the above practical applications, monetary transfer is not appropriate. Two further settings that motivate optimal truthful mechanisms without monetary transfer are combinatorial auctions and facility location problems.

This proposal lies in the area of Algorithmic Mechanism Design (AMD), part of the wider field of Computational Social Choice Theory. We are interested in particular in mechanism design without money.

The famous Gibbard-Satterthwaite Theorem roughly states that, in environments without money, the only truthful mechanisms are (trivial) dictatorships. However, it does not apply when the domain of individuals' preferences over the possible outcomes is restricted. This is the case in most real-world applications, where more interesting mechanisms do exist.

We plan to advance the area of AMD without payments by tackling combinatorial auctions, junior doctor placement and reviewer assignment, kidney exchange and facility location problems. We will develop new truthful mechanisms whilst measuring their degradation in performance compared to previous (non-truthful) mechanisms. In particular, we will investigate the trade-off between truthfulness and optimality in approximate mechanism design.

New software will arise from this work and we have routes for exploiting it through existing University of Glasgow collaborations with the NHS (concerning the assignment of junior doctors to hospitals as part of the Scottish Foundation Allocation Scheme, and the assignment of kidney donors to patients via the National Living Donor Kidney Sharing Schemes).

Planned Impact

Any organisation currently making use of a centralised matching scheme to allocate individuals to resources, or operating in any context where such a scheme might be appropriate, is a potential beneficiary of the project. Moreover, the individuals who participate in such matching schemes, and whose careers and lives can be significantly affected by the outcomes, are potential major beneficiaries.

We outline three public-sector areas where organisations and individuals stand to benefit from the impact of the research described in this proposal.

* Junior doctor placement. The Scottish Foundation Allocation Scheme (SFAS), administered by NHS Education for Scotland (NES), is the centralised mechanism for allocating graduating medical students to two-year foundation training posts in Scottish hospitals. Applicants rank a small number of posts in order of preference, and are themselves ranked according to a single scoring mechanism.

Particular new challenges that this project aims to address arise from the facts that (i) pairs of applicants who wish to be matched to hospitals that are geographically close to one another can make a so-called linked application in the SFAS context, and (ii) applicants may misrepresent their true preferences in order to obtain an improved outcome.

New algorithms arising from Workpackages 1 and 2.1 in the proposal will benefit (i) the scheme's administrators through labour savings (ultimately translating into financial savings for NES) resulting from a fully automated matching process that produces stable matchings (thus reducing complaints from participants), and (ii) participants through increased social welfare and quality of life.

* Kidney exchange. NHS Blood and Transplant maintains a database of patients who require a kidney transplant, and who have a willing but incompatible donor that they could potentially "swap" with that of another patient in a similar position. An algorithm is used to identify these so-called "kidney exchanges". Altruistic (non-directed) donors are also included in the database.

A new challenge emerges from the possibility that large hospitals may be incentivised to withhold their easiest-to-match patients from the national scheme in order to improve their own surgical outcomes, thus diminishing the national pool and potentially denying other patients a transplant. Truthful mechanisms arising from Workpackage 1 and 3 in the proposal will aim to tackle this problem.

Clearly, transplants resulting from such exchanges have an enormous impact on the quality of life of the recipient patients. Not only are long-term survival rates much higher for transplant patients than for those on dialysis; there are also substantial economic implications: the cost benefit to the NHS of each kidney transplant compared to dialysis over a ten-year period (the median transplant survival time) is £241,000.

* Student placement. Students in the School of Medicine, University of Glasgow, are allocated to elective courses using a centralised matching scheme, taking into account student preference lists over courses. A particular challenge arises from the fact that students may deliberately shorten their preference lists in order to try to obtain their most-preferred course. New algorithms arising from Workpackages 1 and 2.2 will incentivise truth-telling by the students, whilst maintaining the quality of the outcomes.

Truthful mechanisms benefit the students because they are globally fair and increase the overall social welfare of the participants. They also benefit administrators as they lead to labour savings, and thus financial savings for the University. The model that underpins the assignment of students to courses is equally valid in the problem of assigning reviewers to conference papers. Thus new truthful algorithms can also benefit conference management software systems that produce reviewer assignments, such as EasyChair.

Publications

Anastasiadis E (2018) Network Pollution Games in Algorithmica

Anastasiadis E (2016) Computing and Combinatorics

Anastasiadis E (2016) Network pollution games

Auletta V (2015) Mechanisms for Scheduling with Single-Bit Private Values in Theory of Computing Systems

Cechlárová K (2016) Pareto Optimal Matchings in Many-to-Many Markets with Ties in Theory of Computing Systems

Chen N (2014) On revenue maximization with sharp multi-unit demands in Journal of Combinatorial Optimization

Christodoulou G (2015) Price of Stability in Polynomial Congestion Games in ACM Transactions on Economics and Computation

Christodoulou G (2018) On the Efficiency of All-Pay Mechanisms in Algorithmica

Christodoulou G (2013) Automata, Languages, and Programming

Christodoulou G (2016) Bayesian Combinatorial Auctions in Journal of the ACM

 
Description We have designed mechanisms without monetary transfer for the problem of allocating resources to agents who submit preferences over these resources. An example is allocating students to dormitories, also known as the house allocation problem. Our starting point is one of the most widely studied mechanisms for the house allocation problem in the economics literature -- serial dictatorship. We generalise this mechanism by allowing agents' preferences to include ties. Our mechanism is the first known to provide provable guarantees on the quality of the computed allocation (the weight of the computed matching). This opened a completely new research direction, that of approximate truthful mechanisms for matching problems under preferences (ACM EC 2014 and Algorithmica 2019). This direction has since been taken up by economists, with a related paper by Bogomolnaia and Moulin published in Games and Economic Behavior (2015). Our truthful mechanism provides a constant approximation guarantee on its performance, assuming that each agent gives every object the same weight. We were also able to prove a strong non-constant lower bound on the performance of such truthful mechanisms if each agent is allowed to give different weights to different objects (AAMAS 2016). This shows that our constant bound cannot be improved in this more general scenario. Moreover, the existing theoretical economics literature implies that even our constant approximation guarantee cannot be improved further within a broad class of truthful non-bossy mechanisms. Our more recent results include a strong generalisation of these results from matching under preferences to many-to-many settings (SAGT 2015 and Theory of Computing Systems 2016), and to matching problems with additional matroid or knapsack constraints. In this latter setting we strongly generalise our ACM EC 2014 results to the case where each object has its own matroid constraint: a feasible solution may assign an object to many agents, so long as they form an independent set of the respective matroid. The agents still have preferences with ties over the set of objects, together with weights. We prove the same approximation guarantees in this very general setting as in our ACM EC 2014 paper, albeit by generalising our previous mechanisms and using a completely different analysis. We obtain similar results when objects have knapsack-type constraints. These results were published in ICALP 2016.
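
To make the starting point concrete, the following is a minimal Python sketch of the classical serial dictatorship for house allocation with strict preferences. It is purely illustrative: it does not capture the tie-handling or the weighted objective of the mechanism developed in the project, and all names and the toy instance are hypothetical.

```python
# Minimal sketch of the classical serial dictatorship for house allocation
# (strict preferences, one object per agent). The tie-handling and weighted
# variants studied in the project require a more careful treatment; this only
# illustrates the basic mechanism and why it is truthful.

def serial_dictatorship(agents, objects, prefs):
    """agents: list of agent ids, processed in a fixed priority order.
    objects: set of available object ids.
    prefs: dict mapping agent id -> list of object ids, most preferred first.
    Returns a dict mapping agent id -> allocated object (or None)."""
    remaining = set(objects)
    allocation = {}
    for agent in agents:                      # fixed priority order
        choice = next((o for o in prefs[agent] if o in remaining), None)
        allocation[agent] = choice            # agent takes best remaining object
        if choice is not None:
            remaining.discard(choice)
    return allocation

# Hypothetical toy instance: three agents, three dormitory rooms.
agents = ["a1", "a2", "a3"]
objects = {"r1", "r2", "r3"}
prefs = {"a1": ["r2", "r1", "r3"],
         "a2": ["r2", "r3", "r1"],
         "a3": ["r1", "r2", "r3"]}
print(serial_dictatorship(agents, objects, prefs))
# {'a1': 'r2', 'a2': 'r3', 'a3': 'r1'} -- no agent can gain by misreporting,
# since each simply receives the best object left when their turn comes.
```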

Continuing the theme of matching under preferences, we have also investigated problems that come directly from practical applications, such as the allocation of students to projects under capacity constraints on projects and lecturers (IWOCA 2014); in a follow-up paper we also considered the case where projects may have lower quotas, indicating that they require a certain number of students to be viable (ISAAC 2015 and Algorithmica 2017). Another example involves the allocation of trainee teachers to schools in Slovakia, where specific constraints must be satisfied (CEJOR 2015 and Theoretical Computer Science 2016). We also considered the problem of assigning students to elective courses in the presence of prerequisite and corequisite constraints (COMSOC 2016). In a series of papers we developed integer programming models to solve NP-hard optimisation problems that again feature in real-world applications, such as the assignment of junior doctors to hospitals where preferences can include ties (OR 2013) or where couples can submit joint preferences (OR 2013, SEA 2014, Constraints 2016). We used a similar technique to solve the kidney exchange problem subject to the (hierarchical) optimality criteria that are specific to the UK setting (ACM EC 2016, ACM JEA 2014). Finally, we considered different models of preferences, in which (i) agents may have to be acquainted with one another in order to disrupt a given matching (WADS 2013); (ii) agents may be unaware of their full preferences but can obtain information about them by carrying out interviews (AAMAS 2016); (iii) certain partnerships may be forced or forbidden by a matching administrator (SAGT 2015 and Discrete Optimization 2016); and (iv) the lengths of preference lists are bounded (SAGT 2016).
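
As a purely illustrative aside, the following Python sketch conveys the combinatorial core of the kidney exchange problem mentioned above: selecting vertex-disjoint 2- and 3-way exchanges in a directed compatibility graph. The deployed algorithms use integer programming with the UK's hierarchical optimality criteria; this brute-force toy, with hypothetical data, is only meant to illustrate the problem structure.

```python
# Illustrative brute-force sketch of kidney exchange: pick a maximum number of
# vertex-disjoint 2- and 3-cycles in a directed compatibility graph. The
# production algorithms use integer programming with hierarchical optimality
# criteria; this toy version only conveys the combinatorial core.
from itertools import permutations

def exchange_cycles(compat, max_len=3):
    """compat: dict pair_id -> set of pair_ids whose patient its donor can give to.
    Returns all directed cycles of length 2..max_len as tuples of pair ids."""
    cycles = set()
    nodes = list(compat)
    for k in range(2, max_len + 1):
        for cyc in permutations(nodes, k):
            if all(cyc[(i + 1) % k] in compat[cyc[i]] for i in range(k)):
                r = min(range(k), key=lambda i: cyc[i])   # canonical rotation
                cycles.add(cyc[r:] + cyc[:r])
    return list(cycles)

def best_exchange(compat):
    """Exhaustive search: choose a set of disjoint cycles covering the most
    patient-donor pairs (exponential time, fine for tiny instances only)."""
    cycles = exchange_cycles(compat)
    best = []
    def search(i, used, chosen):
        nonlocal best
        if sum(map(len, chosen)) > sum(map(len, best)):
            best = list(chosen)
        for j in range(i, len(cycles)):
            c = cycles[j]
            if used.isdisjoint(c):
                search(j + 1, used | set(c), chosen + [c])
    search(0, set(), [])
    return best

# Hypothetical compatibility graph over four patient-donor pairs.
compat = {"p1": {"p2"}, "p2": {"p3"}, "p3": {"p1", "p4"}, "p4": {"p3"}}
print(best_exchange(compat))
# [('p1', 'p2', 'p3')] -- the 3-way exchange covers more pairs than ('p3', 'p4').
```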

Our second research strand concerns strong theoretical results on approximate truthful mechanisms for combinatorial auctions without monetary transfers. These mechanisms use a notion of verification, which can be an attractive alternative to monetary payments in real-world auction systems. Combinatorial auctions collectively form one of the paradigmatic problems in algorithmic mechanism design. In the setting of combinatorial auctions without money but with verification, we characterise the class of all truthful mechanisms and give a host of upper and lower bounds on the approximation guarantees obtained by either deterministic or randomised truthful mechanisms. Our results provide an almost complete picture of truthfully approximating combinatorial auctions in this general setting with multi-dimensional bidders. These results are published in AAMAS 2014 and in Algorithmica 2017. We continued the study of verification in the context of greedy mechanisms for combinatorial auctions. Greedy algorithms are known to provide near-optimal approximation guarantees for combinatorial auctions with multi-dimensional bidders, ignoring truthfulness issues. We provide a deeper understanding of greedy mechanism design and investigate under which general assumptions we can obtain efficient and truthful greedy mechanisms for combinatorial auctions. Towards this goal, we use the framework of priority algorithms together with weak and strong verification, and provide a complete characterisation of the power of weak verification, showing that it is sufficient and necessary for any greedy fixed-priority algorithm to become truthful, with or without the use of money, depending on the ordering of the bids. Moreover, we show that strong verification is sufficient and necessary for the well-known greedy algorithm of Lehmann, Lehmann and Nisan, which is 2-approximate for submodular combinatorial auctions, to become truthful with money in finite bidding domains. Our work, published in AAMAS 2015 and in the Journal of Artificial Intelligence Research 2018, is the first to study the complexity of computing payments in mechanisms.
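
For illustration, the sketch below shows a greedy allocation rule in the spirit of the 2-approximate algorithm of Lehmann, Lehmann and Nisan for submodular bidders: items are considered one at a time and each is given to the bidder with the largest marginal value for it. The coverage-style valuations and all names are hypothetical, and the verification and payment issues discussed above are not modelled.

```python
# Greedy allocation for combinatorial auctions with submodular bidders
# (in the spirit of the 2-approximate Lehmann-Lehmann-Nisan greedy): items are
# considered in a fixed order and each goes to the bidder with the largest
# marginal value. Truthfulness and verification are deliberately not modelled.

def greedy_allocation(items, bidders, value):
    """value(bidder, bundle) -> nonnegative valuation; assumed submodular.
    Returns dict bidder -> frozenset of allocated items."""
    bundles = {b: frozenset() for b in bidders}
    for item in items:                          # fixed order over items
        def marginal(b):
            return value(b, bundles[b] | {item}) - value(b, bundles[b])
        winner = max(bidders, key=marginal)     # largest marginal value wins
        bundles[winner] = bundles[winner] | {item}
    return bundles

# Hypothetical coverage valuations: each bidder has a target set of items and
# values a bundle by how many of its targets the bundle covers (submodular).
targets = {"b1": {"x", "y"}, "b2": {"y", "z"}}
def value(bidder, bundle):
    return len(targets[bidder] & set(bundle))

print(greedy_allocation(["x", "y", "z"], ["b1", "b2"], value))
# e.g. {'b1': frozenset({'x', 'y'}), 'b2': frozenset({'z'})}
```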

A parallel class of allocation problems deals with divisible resources, i.e., resources that can be divided into fractions which are then allocated to agents (for instance oil or coal). Here we use a solution concept weaker than truthfulness, namely Nash equilibrium, together with a tool for analysing the quality of Nash equilibria called the Price of Anarchy (PoA). The PoA measures how much worse the social welfare can be under any Nash equilibrium compared to the social welfare of an optimal solution. We study the efficiency of the proportional allocation mechanism, which is widely used in economics to allocate divisible resources. Each agent submits a bid for each divisible resource and receives a fraction of it proportional to her bid. We quantify the inefficiency of Nash equilibria by studying the PoA of the induced game under complete and incomplete information. We provide a set of bounds on the PoA assuming that agents' valuations are concave or subadditive, for settings with budget constraints and polyhedral constraints, and under Bayesian Nash equilibria and coarse-correlated equilibria. These results, published in SAGT 2015, show that the proportional allocation mechanism can successfully be deployed in these settings, as it has a small PoA.
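
The allocation rule itself is simple to state; the following minimal sketch, for a single divisible resource with hypothetical bids, computes the proportional fractions. The game-theoretic analysis of the induced Nash equilibria, which is the subject of the paper, is not captured here.

```python
# Minimal sketch of the proportional allocation mechanism for a single
# divisible resource: each agent submits a bid and receives the fraction of
# the resource equal to her bid divided by the sum of all bids. (In the
# standard Kelly model agents also pay their bids; here we only compute the
# resulting fractions.)

def proportional_allocation(bids):
    """bids: dict agent -> nonnegative bid. Returns dict agent -> fraction."""
    total = sum(bids.values())
    if total == 0:
        return {a: 0.0 for a in bids}           # nobody bids, nothing allocated
    return {a: b / total for a, b in bids.items()}

bids = {"a1": 2.0, "a2": 1.0, "a3": 1.0}        # hypothetical bids
print(proportional_allocation(bids))            # {'a1': 0.5, 'a2': 0.25, 'a3': 0.25}
```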

In a related research direction we have applied algorithmic mechanism design to the important problem of pollution control (published in AAMAS 2016, COCOON 2016 and in Algorithmica 2019). In this work we introduce a novel network model of the pollution control problem. Our model comprises a graph whose nodes represent the agents (sources of pollution), and whose edges represent the spread of pollution between agents. The government, as the regulator, is responsible for maximising social welfare while setting bounds on the levels of emitted pollution both locally and globally. Our model is inspired by the existing literature in environmental economics. We study the social welfare maximisation problem in this model. Our main results include computational hardness results for the problem, as well as approximation algorithms and truthful approximate mechanisms when the networks are general bounded-degree graphs, planar graphs (modelling air pollution) or trees (modelling water pollution in rivers). Our mechanisms for planar networks are obtained via a novel decomposition technique for planar graphs that deals with constraints on vertices, and this technique may be of independent interest.
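
As a rough illustration of the model's ingredients (agents on a graph, spread of pollution along edges, and local plus global emission bounds), the following is a deliberately simplified linear toy formulation using the PuLP library. The instance, the linear benefits and the spread coefficients are all assumptions made for illustration; the model studied in the papers involves integrality constraints and is computationally hard in general.

```python
# A deliberately simplified linear toy of the network pollution model: nodes
# emit pollution, a fixed fraction of each node's emission reaches its
# neighbours, and the regulator maximises total (linear) benefit subject to
# local and global caps. This is not the model from the papers; it only
# illustrates the structure of the constraints.
import pulp  # pip install pulp

nodes = [0, 1, 2, 3]                        # hypothetical sources on a path graph
benefit = {0: 5.0, 1: 3.0, 2: 4.0, 3: 2.0}  # hypothetical per-unit benefits
spread = {(i, j): (1.0 if i == j else 0.5 if abs(i - j) == 1 else 0.0)
          for i in nodes for j in nodes}    # fraction of i's emission reaching j
local_cap, global_cap = 4.0, 10.0

prob = pulp.LpProblem("network_pollution_welfare", pulp.LpMaximize)
x = {i: pulp.LpVariable(f"x_{i}", lowBound=0) for i in nodes}   # emission levels

prob += pulp.lpSum(benefit[i] * x[i] for i in nodes)            # social welfare
for j in nodes:                                                 # local bounds
    prob += pulp.lpSum(spread[(i, j)] * x[i] for i in nodes) <= local_cap
prob += pulp.lpSum(x[i] for i in nodes) <= global_cap           # global bound

prob.solve()
print({i: x[i].value() for i in nodes}, pulp.value(prob.objective))
```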

Additional results:

Makespan minimisation in scheduling is, together with combinatorial auctions, the second paradigmatic optimisation problem studied in algorithmic mechanism design; its general case remains unsolved. We design randomised truthful mechanisms for this problem for a restriction of the general multi-dimensional domain (i.e., unrelated machines). In a sense, our model is the simplest difficult multi-dimensional setting: each machine privately holds only a single bit of information. Some of the impossibility results for deterministic mechanisms carry over to this setting as well. We prove a separation between truthful-in-expectation and universally truthful mechanisms for makespan minimisation: we first show how to design an optimal truthful-in-expectation mechanism, and then prove lower bounds on the approximation guarantee of universally truthful mechanisms. Our results are published in Theory of Computing Systems (2015).

Our SAGT 2015 paper studies the design of novel auctions that maximise the probability of raising a given revenue target. We provide a number of computational hardness results for various settings concerning such auctions. We then characterise optimal revenue-target auctions for the case of two buyers with independent valuations, which leads to an optimal auction mechanism for this case. In settings with many buyers we design simple auctions (posted-price and monopoly-price auctions) that approximate the optimal success probability with arbitrarily good precision.

In our IJCAI 2015 paper we study an algorithmic mechanism design problem within one of the most successful electronic commerce paradigms -- Google's sponsored search auctions. We design envy-free sponsored search auctions in which bidders are budget-constrained, an assumption that is very important in e-commerce practice. Our primary goal is to design auctions that maximise the social welfare of the bidders and the revenue of the auctioneer, the two classical objectives in auction theory. For this purpose, we characterise envy-freeness with budgets by proving several elementary properties including consistency, monotonicity and transitivity. Based on this characterisation, we obtain an envy-free auction that is both socially optimal and bidder-optimal for a wide class of bidder types. More generally, for all bidder types, we provide two polynomial-time approximation schemes (PTASs) for maximising social welfare or revenue, where the notion of envy-freeness is slightly relaxed. Finally, when randomisation is allowed in designing auctions, we devise similar PTASs for the social welfare and revenue maximisation problems.

The Price of Anarchy (PoA) in congestion games has attracted a lot of research over the last decade, resulting in a thorough understanding of this concept. In contrast, the Price of Stability (PoS), an equally interesting concept, is much less well understood. In our ICALP 2013 paper (with the full version published in ACM TEAC 2015), we consider congestion games with polynomial cost functions with nonnegative coefficients and maximum degree d. We give matching bounds for the Price of Stability in such games; that is, our technique provides the exact value for any degree d. For linear congestion games, tight bounds were previously known; those bounds hold even for the more restricted case of dominant-strategy equilibria, which may not exist. We give a separation result showing that already for congestion games with quadratic cost functions this is no longer possible: the Price of Anarchy for the subclass of games that admit a dominant-strategy equilibrium is strictly smaller than the Price of Stability for the general class.
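
To illustrate the two concepts on a toy instance (not taken from the paper), the following Python snippet enumerates the pure strategy profiles of a small two-player congestion game, detects pure Nash equilibria, and computes the Price of Anarchy and Price of Stability; in this hypothetical example they differ (4/3 versus 1).

```python
# Toy illustration of Price of Anarchy (PoA) vs Price of Stability (PoS):
# a 2-player congestion game with resources 'a' (cost = its load) and 'b'
# (constant cost 2). All pure profiles are enumerated, pure Nash equilibria
# are detected, and PoA/PoS are computed against the optimal social cost.
from itertools import product

players = [0, 1]
strategies = ["a", "b"]

def resource_cost(res, load):
    return load if res == "a" else 2.0      # c_a(x) = x, c_b(x) = 2

def player_cost(profile, i):
    load = profile.count(profile[i])        # how many players share i's resource
    return resource_cost(profile[i], load)

def social_cost(profile):
    return sum(player_cost(profile, i) for i in players)

def is_pure_nash(profile):
    for i in players:
        for s in strategies:                # can player i strictly improve?
            dev = list(profile)
            dev[i] = s
            if player_cost(tuple(dev), i) < player_cost(profile, i):
                return False
    return True

profiles = list(product(strategies, repeat=len(players)))
opt = min(social_cost(p) for p in profiles)
nash = [p for p in profiles if is_pure_nash(p)]
poa = max(social_cost(p) for p in nash) / opt
pos = min(social_cost(p) for p in nash) / opt
print(nash, "PoA =", poa, "PoS =", pos)     # PoA = 4/3, PoS = 1 in this toy game
```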

In a series of papers we investigate the query complexity of computing equilibria in strategic games. This approach has a very timely rationale: in large-scale applications, players usually do not have full knowledge of their opponents' play or of the full game itself. Instead, they can query only a certain subset of strategy profiles and their payoffs. In such settings, we study: (i) the query complexity of learning exact and approximate Nash equilibria in games (Journal of Machine Learning Research 2015); and (ii) the query complexity of computing approximate equilibria for various classes of games (ACM EC 2014), and for anonymous games (WINE 2015).
Exploitation Route We have opened a completely new direction of approximate mechanisms for matching under preferences. This direction and our approach have already been used by economists in a different but related matching setting (Bogomolnaia and Moulin, Games and Economic Behavior, 2015). Some of the new techniques in this work are of independent interest and show promise of being applicable to other mechanism design problems where agents' preferences have ties among alternative objects. For instance, our ICALP 2016 paper presents a general technique that reduces the problem of designing truthful mechanisms for resource allocation problems in which agents have ties to a purely algorithmic, unweighted version of the original problem.

Algorithms arising from this project are being used by NHS Blood and Transplant (NHSBT) in connection with their National Living Donor Kidney Sharing Schemes (NLDKSS). Also, optimal matching algorithms developed during the course of this project are being used to allocate students to courses and projects, and markers to projects, within various university departments (the School of Medicine, School of Computing Science and School of Mathematics and Statistics, University of Glasgow; the School of Informatics, University of Edinburgh; the School of Engineering, University of Aberdeen; the School of Medicine, University of Leeds; the Schools of Biosciences and Physical Sciences, University of Kent; and the School of Engineering, University of San Diego). For more details please see the section on Narrative Impact.

Our combinatorial auctions without monetary transfers but with verification can potentially be used in applications where no payments infrastructure is available but a natural realisation of verification is possible. We present such possible applications in our papers. We have also made a first step in the study of the computational complexity of computing payments for various simple auction mechanisms with verification. This opens a new, unexplored and interesting research area. Our work has inspired new mechanism design models in pricing cloud storage (Ceppi and Kash, SIGMETRICS 2015) and truthful RAM allocation (Kovacs, Meyer and Ventre, WINE 2015).


Our new network pollution game model provides an interesting tool for governments to control pollution of various kinds, and it also opens interesting new research questions to be explored. The new technical tools for planar graphs developed within this project are of independent interest and are likely to find applications in other similar optimisation problems on planar graphs.

Our novel auctions with a guaranteed revenue target should be applicable to real problems of fundraising. They also open a new direction in auction design, where the design objective of a randomised auction is to maximise its success probability.

During this project we have extensively used the price of anarchy (PoA) and price of stability (PoS) methods of quantifying the quality of equilibria in auctions and in other strategic games. We have developed a number of new theoretical tools for the analysis of PoA and PoS that will find applications elsewhere.

Finally, our results on the query complexity of equilibria should find applications in the context of real data used and generated by real-world applications that involve independent agents. There is an abundance of such applications related to various Internet platforms.
Sectors Communities and Social Services/Policy,Creative Economy,Digital/Communication/Information Technologies (including Software),Education,Financial Services, and Management Consultancy,Healthcare,Government, Democracy and Justice

 
Description Algorithms arising from this project are being used by NHS Blood and Transplant as part of the UK Living Kidney Sharing Scheme (UKLKSS). These algorithms allow optimal sets of kidney exchanges to be identified, where, in a kidney exchange, a patient with a willing but incompatible donor can swap their donor with that of another patient in a similar position in order to obtain a compatible kidney. Furthermore, the algorithms can find chains triggered by altruistic donors, which also involve incompatible patient-donor pairs, with the final donor donating to a patient on the deceased-donor waiting list. These algorithms, which are executed as part of each quarterly matching run of the UKLKSS, have had to evolve to cope with changing requirements and increased pool sizes. As of February 2020, 940 transplants identified by the algorithms have proceeded to surgery. It is estimated that our research has led to an increase in the number of kidney transplants carried out in the UK of 292 up to the end of 2019. Compared to dialysis, every additional kidney transplant that this research has enabled saves the NHS approximately £241k over 10 years, which equates to total estimated savings of £70M. Two relevant papers from the project concerned with this research are http://dx.doi.org/10.1145/2940716.2940759 and http://dx.doi.org/10.1145/2670129. Also, optimal matching algorithms developed during the course of this project are being used to allocate students to courses and projects, and markers to projects, within various university departments. During the lifetime of this project our algorithms have been used to allocate students to elective courses within the School of Medicine, University of Glasgow, taking into account students' ranked preferences over courses and course capacities. The algorithms have also been used to allocate students to projects within the School of Computing Science, University of Glasgow; the School of Mathematics and Statistics, University of Glasgow; the School of Informatics, University of Edinburgh; the School of Engineering, University of Aberdeen; the School of Medicine, University of Leeds; the Schools of Biosciences and Physical Sciences, University of Kent; and the School of Engineering, University of San Diego. Finally, the algorithms have been used to allocate readers (second markers) to projects at the School of Computing Science, University of Glasgow. By automating these complex allocation processes, the algorithms will have saved many hours of academic and administrative staff time, resulting in implicit financial savings for the institutions concerned. A relevant paper from the project in connection with this research is http://dx.doi.org/10.1007/978-3-319-19315-1_19.
First Year Of Impact 2013
Sector Communities and Social Services/Policy,Education,Healthcare
Impact Types Societal,Economic,Policy & public services

 
Description ENCKEP: European Network for Collaboration on Kidney Exchange Programmes (COST Action CA15210)
Amount € 720,000 (EUR)
Funding ID CA15210 
Organisation European Commission 
Sector Public
Country European Union (EU)
Start  
 
Description Impact Acceleration Account
Amount £29,640 (GBP)
Funding ID EP/K503903/1 
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 01/2015 
End 09/2015
 
Description Institutional Sponsorship Fund
Amount £17,765 (GBP)
Funding ID EP/N508792/1 
Organisation Engineering and Physical Sciences Research Council (EPSRC) 
Sector Public
Country United Kingdom
Start 10/2015 
End 03/2016
 
Description ENCKEP: European Network for Collaboration on Kidney Exchange Programmes (COST Action CA15210) 
Organisation Erasmus University Rotterdam
Department Institute of Health Policy & Management (iBMG)
Country Netherlands 
Sector Academic/University 
PI Contribution Drafting the proposal for the COST Action
Collaborator Contribution Drafting the proposal for the COST Action
Impact Memorandum of Understanding published at https://www.cost.eu/actions/CA15210/
Start Year 2016
 
Description ENCKEP: European Network for Collaboration on Kidney Exchange Programmes (COST Action CA15210) 
Organisation INESC TEC
Country Portugal 
Sector Private 
PI Contribution Drafting the proposal for the COST Action
Collaborator Contribution Drafting the proposal for the COST Action
Impact Memorandum of Understanding published at https://www.cost.eu/actions/CA15210/
Start Year 2016
 
Title Kidney exchange web applications 
Description These websites find optimal solutions to kidney exchange problems. See: http://kidney.optimalmatching.com http://toolkit.optimalmatching.com http://www.dcs.gla.ac.uk/~jamest/weighted-toolkit/#/input 
Type Of Technology Webtool/Application 
Year Produced 2015 
Impact Algorithms that feature in these web applications are used by NHS Blood and Transplant for quarterly matching runs of the National Living Donor Kidney Sharing Schemes. More information about these algorithms can be found under "Narrative Impact". 
URL http://kidney.optimalmatching.com