# A coalgebraic framework for reductive logic and proof-search (ReLiC)

Lead Research Organisation:
University College London

Department Name: Computer Science

### Abstract

While the traditional, deductive approach to logic begins with premisses and applies proof rules step by step to derive conclusions, the complementary reductive approach instead begins with a putative conclusion and, by systematically reducing the space of possible proofs, searches for premisses sufficient for a legitimate derivation to exist.

Not only does this picture more closely resemble the way in which mathematicians actually prove theorems and, more generally, the way in which people solve problems using formal representations, it also encapsulates diverse applications of logic in computer science: the programming paradigm known as logic programming, the proof-search problem at the heart of AI and automated theorem proving, precondition generation in program verification, and more. It is also reflected at the level of truth-functional semantics --- the perspective on logic utilized for the purpose of model checking and thus verifying the correctness of industrial systems --- wherein the truth value of a formula is calculated according to the truth values of its constituent parts.
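As a concrete illustration (ours, not part of the proposal), truth-functional semantics computes a formula's value compositionally from the values of its parts. The sketch below assumes a minimal, hypothetical propositional syntax; all names are our own:

```python
from dataclasses import dataclass

# Hypothetical minimal propositional syntax (illustrative only).
@dataclass(frozen=True)
class Atom:
    name: str

@dataclass(frozen=True)
class Not:
    body: object

@dataclass(frozen=True)
class And:
    left: object
    right: object

def evaluate(formula, valuation):
    """Truth-functional evaluation: the value of a formula is
    calculated from the values of its constituent parts."""
    if isinstance(formula, Atom):
        return valuation[formula.name]
    if isinstance(formula, Not):
        return not evaluate(formula.body, valuation)
    if isinstance(formula, And):
        return evaluate(formula.left, valuation) and evaluate(formula.right, valuation)
    raise TypeError(f"unknown formula: {formula!r}")
```

For example, `evaluate(And(Atom("p"), Not(Atom("q"))), {"p": True, "q": False})` yields `True`. It is this compositional pattern of decomposition that the coalgebraic view, discussed below, generalizes.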

Despite the reductive viewpoint reflecting logic as it is actually used, and in stark contrast to deductive logic, a uniform mathematical foundation for reductive logic does not exist. Substantial background is provided by the work of Pym, Ritter, and Wallen, but this is essentially restricted to classical and intuitionistic logic and, even then, lacks an explicit theory of the computational processes involved. We believe coalgebra --- a unifying mathematical framework for computation, state-based systems and decomposition, for which Silva is a leading contributor and exponent --- can be applied to this end. Deduction is essentially captured by inductive constructions, but reduction is captured through the coalgebraic technique of coinduction, which decomposes goals into subgoals.
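To make the deduction/reduction contrast concrete, here is a minimal sketch (ours, with hypothetical names, for a conjunctive fragment only): a one-step map sends each goal to the lists of subgoals obtained by reading proof rules backwards, and the search space is unfolded from the putative conclusion:

```python
from dataclasses import dataclass

# Hypothetical toy syntax for a conjunctive fragment (illustrative only).
@dataclass(frozen=True)
class Atom:
    name: str

@dataclass(frozen=True)
class And:
    left: object
    right: object

def reductions(goal):
    """One-step structure of the search space: each goal is sent to the
    lists of sufficient subgoals obtained by reading rules backwards."""
    if isinstance(goal, And):
        return [[goal.left, goal.right]]  # and-introduction, read backwards
    return []  # an atom reduces no further; it must be a premiss

def provable(goal, premisses):
    """Unfold the search space from the putative conclusion: the goal
    succeeds if it is a premiss or some reduction closes all subgoals."""
    if goal in premisses:
        return True
    return any(all(provable(g, premisses) for g in subgoals)
               for subgoals in reductions(goal))
```

In this toy fragment goals strictly shrink, so the unfolding terminates; the coinductive machinery earns its keep precisely when search spaces are infinite, which this sketch does not attempt to show.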

Existing work shows that coalgebra generalizes truth-functional semantics and can represent basic aspects of search spaces. We will extend this work to logics in full generality and, by utilizing the coalgebraic approach to the modelling of computation, also capture the control procedures required for proof-search. The algebraic properties of coalgebra should ensure that all aspects of this modelling, including the definitions of logics, their search spaces, and their search procedures, will be compositional.

Beyond this advance on the state of the art in semantic approaches to proof-search, we can hope to utilize coalgebraic presentations of computation to achieve much more. By interfacing coalgebraic models of proof-search with coalgebraic models of, for example, probabilistic computation or programming languages, we can hope to give a clean, generic, and modular presentation of applications of the reductive-logic viewpoint as diverse as inductive logic programming and abduction-based Separation Logic tools such as Facebook's Infer.

Abstracting the key features of such systems into a modular semantic framework can help with more than simply understanding how existing tools work and can be improved. Such a framework can also guide the design and implementation of new tools. Thus, in tandem with our theoretical development, we will develop efficient, semantically driven automated reasoning support with wide application. In doing so, we hope to implement tools deployable across a large range of reasoning problems and to guide the design of theorem provers for specific logics.


### Planned Impact

The impact of this project will be primarily academic; thus we expand further on the communities highlighted in Academic Beneficiaries. Correspondingly, we identify five specific kinds of academic impact.

1. In formal or mathematical logic. The establishment of the reductive view of logic --- which corresponds to much of the use of logic as a practical reasoning tool --- as a first-class component in the landscape of logical theory, with the first steps towards a comparable meta-theory.

2. In philosophical logic. Authors such as Martin-Löf, Prawitz, Sundholm, Negri and, latterly, Negri and von Plato, have considered the proof-theoretic basis for the meanings of logical connectives and the justifications of the logical laws (rules). The essential point is that natural deduction rules and systems of rules, subject to certain design principles, have sufficient inductive structure to define fully the meanings of operators (connectives, modalities, quantifiers) and proofs.

The shift to a conceptually rigorous view of reductive logic raises questions such as: what is the impact of concepts like indeterminacy and control on the meaning of the operators? For example, the input-output model of resource distribution renders multiplicative conjunction, to some extent at least, essentially non-commutative.

3. In coalgebraic theory. While we expect to be users rather than developers of coalgebraic theory, we can expect that our work will broaden and deepen the general understanding of the applicability of coalgebraic methods in logic, pushing the theory well beyond its starting point in Kripke semantics and transition systems for process and action logics.

4. In program logic and verification. Our work will provide a basis for a conceptual systematization of the now-large space of separation logics, which lacks a coherent foundation. As a consequence, we can expect to provide a basis for a uniform framework for designing tools (such as verifiers, analysers, and model checkers) for these program logics.

5. In models of computation. The canon of work in programming language semantics, in particular the denotational tradition for imperative and functional languages, now couches much of its theory in terms of monads and comonads. Our coalgebraic approach to models of computation based on reductive logic --- and on proof-search in particular --- will connect the semantics of proof-search to this traditional approach.

In order to promote this work to these communities, we plan to host a major programming-semantics/coalgebra conference and/or workshop at our institution during the period of the grant (for example, MFPS-CALCO). We also plan a textbook building on the research monograph "Reductive Logic and Proof-Search: Proof Theory, Semantics and Control" by the PI (Pym) and Ritter, but pitched instead at a postgraduate audience and advocating the new coalgebraic approach. We also plan three technical workshops over the life of the project.

We also believe the research we carry out can have industrial impact, including in the design of verification tools in industry: for example, extracting design principles from Facebook's Infer to implement similar tools for concurrent separation logic.
