
## Structural proof theory

### Concept of proof

Logic—the scientific study of reasoning and deduction—has the concept of proof at its core: how is new knowledge created from what is already known? One presents the fruit of deduction as a proof, but what exactly is a proof, and how can we recognize one, that is, ensure that a claimed proof is correct? Our modern understanding of the concept harkens back to David Hilbert’s vigorous efforts to infuse all of mathematics with complete and absolute (metamathematical) rigor—formalized by means of the kind of axiomatic proof systems which were pioneered by such bodies of work as Gottlob Frege’s Begriffsschrift and Alfred North Whitehead and Bertrand Russell’s Principia Mathematica. These same efforts sparked foundational controversies with mathematicians like Frege and L. E. J. Brouwer, and spurred Kurt Gödel to develop his incompleteness theorems—which demonstrated that Hilbert’s lofty ambitions, at least in their original form, were unattainable. The study of proofs as first-class mathematical objects developed in close connection with these advances.

Vis-à-vis the purely mathematical conception of proof are the philosophical and sociological faces of argumentation as a mental activity and an act of communication. It is recognized that proof has a dual nature by which it can be seen alternatively as proof-as-message or as proof-as-certificate. Taken as a message, the purpose of a proof is the transmission of insight and understanding between mathematicians: to convey the lines of argumentation followed to arrive at a conclusion and to convince of the truth of that conclusion—the focus is on meaning, on semantics. Seen instead as a certificate, the purpose of proofs is the transmission and mechanical verification of knowledge: the use of the symbols and rules of a formal language to unambiguously derive new, correct phrases—the focus here is on syntax. In practice, both functions are closely related, and although formal study tends to emphasize the side of proof-as-certificate, proof-as-message considerations will be pervasive in much of the work that follows (especially throughout Part III, where proof assistants and user interaction are key subjects).

The rest of the chapter is organized as follows: Section 2.2 traces the development of proof theory and describes its major formal systems. Section 2.3, in parallel, outlines the most important division in the taxonomy of standard logics, that of separating classical and intuitionistic logics, both of which will be called upon over the course of our study. Section 2.4 presents the sequent calculus, one of the major deductive systems, on which our formal studies will be based. Section 2.5 introduces the discipline of focusing, used to structure the proofs of the sequent calculus in more organized, abstract forms. Section 2.6 summarizes some important considerations on the relation between proof systems and the logics they model. Section 2.7 concludes the chapter.

### Evolution of proof theory

Proof theory is the branch of mathematical logic that studies proofs as objects of mathematics. It is widely acknowledged that the modern study of proof has its roots in the axiomatizing undertakings of Hilbert’s Program. Among the disciplines of proof theory, structural proof theory studies the structure and properties of proofs (in the sense of proof-as-certificate of the previous section). The concrete objects of its study are proof calculi: formalized systems of deduction where formulas and proofs are inductively defined objects, and the steps of deduction are carried out syntactically by the application of inference rules which transform formulas and in the process construct proof objects. There are three principal families of proof calculi, each of which will be presented and placed in its proper context in this section.

First, Hilbert systems take their name from the refined calculus developed by Hilbert for the advancement of his Program. They reflect the longstanding tradition of organizing mathematical developments as a sequence of steps which starts from instances of a collection of logical axioms—for example, that every property P implies itself: P ⊃ P—and follows by applications of inference rules which derive new facts from previously known ones. Logical reasoning has historically relied on this discursive style, from Aristotle to Gottlob Frege—after whom these calculi are sometimes named Frege systems, as we shall reference in Chapter 11. Often, a single inference rule, that of modus ponens, is used. This rule states that if we know that a fact Q is implied by another fact P, and we also know that P holds, then we can infer that Q holds. For this reason (and closer to the uniform terminology that will be used shortly) modus ponens is also called implication elimination and depicted schematically as follows:

    P ⊃ Q     P
    ———————————
         Q

Second, natural deduction systems arose as a response to the linear and unstructured proofs of Hilbert-style systems, to better reflect the “natural” way in which proofs are built by mathematicians—in fact, although Hilbert systems have later been inspected under the lens of structural proof theory, it is in natural deduction that the discipline has its proper genesis. Natural deduction was developed by Gerhard Gentzen (1935) in his landmark dissertation with the goal of accurately reflecting the mental process of reasoning and its dependencies. Centrally, it employs the concept of assumptions made over the course of a proof attempt and which may be closed at some later point. An important question in this more structured system is whether there exist normal forms of natural deduction proofs, so that many shallowly different derivations of the same property may be given a common representation. In his dissertation, Gentzen attempted to prove such a normalization property, succeeding in the intuitionistic case but failing in the classical case. Eventually, Dag Prawitz (1965) succeeded in doing so within the edifice of natural deduction, and we now know Gentzen himself persevered in his efforts until the knot was untangled (von Plato, 2008).

The third and last great family of deductive systems is the sequent calculus, which Gentzen developed to work around the difficulties of the proof of normalization for natural deduction. Its technical motivation was to serve as a sort of meta-calculus in which the derivability relation of natural deduction could be expressed and reasoned about: the sequent calculus is more pedantic, but also more practical. In this formalism, an analog of normalization for sequent calculus proofs called cut elimination could be proved, and this result in turn rendered the original pursuit—i.e., a proof of normalization for intuitionistic natural deduction—unnecessary. Sequent calculi proved to be fertile theoretical ground, more mechanistic than their forebears and far more relevant in the looming age of computer science. They form the immediate substrate of the present work and will be discussed extensively in the pages that follow; after a brief orthogonal discussion, they will be properly introduced in Section 2.4.

### Classical and intuitionistic logics

Perhaps the most fundamental division in modern logic concerns the distinction between the standard logics: classical logic and intuitionistic logic. Classical logic is a “logic of truth,” concerned with the assignment of truth values (say, true or false) to formal statements. The traditional study of logic starting with Aristotle can be framed in this tradition. By contrast, intuitionistic logic is a “logic of construction.” It was developed from L. E. J. Brouwer’s philosophy, notably by Arend Heyting. In proving properties about mathematical objects, an intuitionistic proof provides a way to construct objects exhibiting the properties being proved. For example, to have a proof of “A and B,” we need separately a proof of A and a proof of B.
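The constructive reading of conjunction can be sketched in a few lines of illustrative code (the helper names are hypothetical, not from the text): evidence for “A and B” literally contains a piece of evidence for A and one for B, and the elimination rules are the projections.

```python
# Sketch of the constructive (BHK-style) reading of conjunction:
# a proof of "A and B" is a pair of component proofs.

def prove_and(proof_a, proof_b):
    # Conjunction introduction: package the two component proofs.
    return (proof_a, proof_b)

def project_left(proof_ab):
    # Conjunction elimination (left): recover the proof of A.
    return proof_ab[0]

def project_right(proof_ab):
    # Conjunction elimination (right): recover the proof of B.
    return proof_ab[1]

p = prove_and("evidence for A", "evidence for B")
assert project_left(p) == "evidence for A"
assert project_right(p) == "evidence for B"
```

A classical proof of a conjunction need only establish that neither conjunct can fail; the constructive proof must exhibit both pieces of evidence explicitly.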

Under the classical interpretation, the negation of a statement is an assertion of its falsity. By contrast, the intuitionistic negation of a statement points to the existence of a counterexample. As an illustration, consider proof by contradiction, a standard proof technique commonly used in mathematics. A proof by contradiction of a property p starts by assuming that p does not hold and proceeds to arrive at a contradiction. Many commonplace results—such as the classic proof of the irrationality of √2—make use of this method. However, such non-constructive arguments are invalid in intuitionistic logic.

In practice, classical logic and intuitionistic logic can be related by disallowing in the latter the non-constructive parts of the former—that is, the principle of the excluded middle, which asserts that for every statement either itself or its negation is true. By virtue of this constraint, intuitionism rejects the aberrant “proofs” of classical logic which rely on non-constructive arguments. A priori, because intuitionistic logic can be defined as a restriction of classical logic, not only can intuitionistic logic not be stronger than classical logic, but it would moreover seem to be weaker—every intuitionistically provable theorem is classically provable, but there exist classical theorems which are not intuitionistically provable.

Nonetheless, the relation between the expressive powers of both logics is subtler than their hierarchical relation might suggest. In the settings of propositional and first-order logic, there exist translation functions such that, if a formula is a theorem of classical logic, its translation is a theorem of intuitionistic logic: such functions are called double-negation translations (Ferreira and Oliva, 2012)—in fact, what such a function does is make explicit each use of the excluded middle by means of an encoding based on double negations. (However, a formula is not intuitionistically equivalent to its translation, and no general mappings exist in the opposite direction.) This leads to the observation that—in these settings—intuitionistic logic is more expressive, or finer than classical logic: if a formula is classically but not intuitionistically provable, we can find another formula which is intuitionistically provable and classically indistinguishable from the original.

If classical logic can be seen as the logic of traditional mathematics, intuitionistic logic takes on the mantle of the logic of computer science. In the context of theorem proving, classical logic serves as the foundation of choice for reasoning in automated theorem provers, where a computer program attempts to find proofs of theorem candidates. In interactive theorem provers (or proof assistants), where the user drives the search for proofs, the foundational role is assumed by intuitionistic logic. Both classical and intuitionistic logics will be employed extensively as the bases for Part II and Part III, respectively.

### Sequent calculus

This section introduces Gentzen’s original sequent calculus, the proof theoretical foundation of our investigations. We concentrate on the calculus for classical logic and reference intuitionistic logic when appropriate. Classical logic will be the focus of Part II, while Part III will adopt intuitionistic logic as its vehicle; additional background is given in Chapter 9.

Classical logic contains the familiar set of logical constants or connectives. In the propositional fragment, we have the nullary constants true (t) and false (f); the unary constant negation (or not, ¬); and the binary constants conjunction (or and, ∧), disjunction (or or, ∨), and implication (⊃). First-order logic extends propositional logic with the universal quantifier (∀) and the existential quantifier (∃). The standard classical equivalences establish the interrelations between these constants. We make no attempt to reduce the set of connectives to a small, or minimal, functionally complete set of operators.

Added to these logical constants will be a type signature of non-logical constants that will function as atomic propositions (or simply atoms). A literal is either an atom or a negated atom. In first-order logic, as quantifiers are introduced, atoms can be parameterized by terms, also part of the signature. Quantifiers bind names within their scope and can be instantiated by the operation of substitution—all the usual subtleties and caveats about free and bound variables, capture-avoiding substitution, etc., apply here. Given a formula A, [t/x]A designates the substitution of a term t for a (free) variable x in A. A more computational view of all these aspects is deferred until Chapter 4.
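As an illustration of those caveats, here is a simplified sketch of substitution over a hypothetical tuple encoding of formulas, with terms as plain strings. It stops at a binder for the substituted variable (no free occurrences can appear below it) and, rather than renaming, it simply refuses to proceed where capture would occur.

```python
# Simplified sketch of the substitution [t/x]A. Hypothetical encoding:
# atoms are ("atom", name, args); quantified formulas are
# ("forall", x, body) or ("exists", x, body); other connectives are
# tuples like ("and", A, B). Terms are plain strings.

def subst(formula, x, t):
    op = formula[0]
    if op == "atom":
        _, name, args = formula
        return ("atom", name, [t if a == x else a for a in args])
    if op in ("forall", "exists"):
        _, y, body = formula
        if y == x:        # x is re-bound here: nothing free to replace below
            return formula
        if y == t:        # the substituted term would be captured by y
            raise ValueError("capture: rename the bound variable first")
        return (op, y, subst(body, x, t))
    # remaining connectives: substitute in every subformula
    return (op,) + tuple(subst(sub, x, t) for sub in formula[1:])

a = ("forall", "y", ("atom", "P", ["x", "y"]))
assert subst(a, "x", "c") == ("forall", "y", ("atom", "P", ["c", "y"]))
```

A full implementation would rename bound variables on the fly instead of failing; the point here is only where the danger lies.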

Sequent calculi are formal logic systems organized around the concept of sequents. In its basic form, a sequent combines two arbitrary lists of formulas separated by a turnstile, say:

A1, …, Am ⊢ B1, …, Bn

The list of formulas to the left of the turnstile is called the left-hand side (LHS) or the antecedent of the sequent and abbreviated Γ. The list of formulas to the right of the turnstile is called the right-hand side (RHS) or succedent of the sequent and abbreviated Δ. The classical semantics of a sequent states that, if all the formulas in the left-hand side are true, at least one formula in the right-hand side is true as well. Equivalently, the antecedent list represents a conjunction of formulas, and the succedent list represents a disjunction of formulas.
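Restricted to atomic formulas, this classical reading can be checked directly under a given truth assignment (the helper name below is hypothetical): the sequent holds when some antecedent is false or some succedent is true.

```python
# Sketch of the classical semantics of a sequent over atoms only:
# "all of the antecedent" must imply "some of the succedent".

def sequent_holds(antecedent, succedent, valuation):
    all_left_true = all(valuation[a] for a in antecedent)
    some_right_true = any(valuation[b] for b in succedent)
    return (not all_left_true) or some_right_true

v = {"A": True, "B": False, "C": True}
assert sequent_holds(["A"], ["B", "C"], v)      # C makes the succedent true
assert sequent_holds(["A", "B"], ["B"], v)      # B is false on the left
assert not sequent_holds(["A"], ["B"], v)       # A holds but B fails
```

The asymmetry of the reading (conjunction on the left, disjunction on the right) is what the empty cases reflect: an empty antecedent is vacuously true, and an empty succedent is false.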

Every sequent calculus is presented as a collection of inference rules on sequents. Each inference rule has one conclusion below the line and any number of premises above the line—possibly complemented by provisos and side conditions. The inference rules are actually rule schemata: they are essentially “universally quantified” over all their variables (representing arbitrary formulas, lists of formulas, etc.), so that any combination of values for those syntactic variables constitutes an instance of that inference rule. The rules of the standard sequent calculus for classical logic are presented in Figure 2.1. (These rules do not cover the nullary logical constants true and false, which can be trivially defined in terms of the other connectives.)

The rules of the calculus are organized in three distinct groups. First, logical rules, also called introduction rules, analyze each logical connective on both sides of the turnstile in the conclusion of the rule and relate the conclusion to the necessary premises for the introduction of the connective. Second, identity rules treat the (symmetric) cases where the same formula appears on both sides of the turnstile: the axiom rule in the conclusion; and the cut rule in the premises. And third, structural rules manipulate the structure of the sequent without inspecting its formulas: they are exchange, which reorders the components of the sequent; weakening, which introduces new formulas in the conclusion; and contraction, which makes copies of formulas in the premise. This presentation showcases the deep, remarkable symmetry of classical logic.
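The identity and structural groups can be illustrated on a hypothetical list-based encoding of sequent sides (these helper names are not from the text): the axiom applies when a formula occurs on both sides, weakening adds a formula to a side, and contraction collapses a duplicate.

```python
# Hypothetical list encoding of the two sides of a sequent, used to
# illustrate one identity rule and two structural rules.

def is_axiom(left, right):
    # Axiom: some formula appears on both sides of the turnstile.
    return any(a in right for a in left)

def weaken(context, formula):
    # Weakening: introduce a new formula into one side of the sequent.
    return context + [formula]

def contract(context, formula):
    # Contraction: merge two occurrences of a formula into one.
    out = list(context)
    out.remove(formula)   # delete one of the (at least two) copies
    return out

assert is_axiom(["A", "B"], ["B", "C"])          # B occurs on both sides
assert contract(weaken(["A"], "A"), "A") == ["A"]  # contraction undoes weakening
```

Exchange is implicit in this toy encoding, since list membership ignores position; in the calculus itself it must be stated as an explicit rule.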

**Table of contents:**

**1 Introduction**

**I Logical foundations**

**2 Structural proof theory**

2.1 Concept of proof

2.2 Evolution of proof theory

2.3 Classical and intuitionistic logics

2.4 Sequent calculus

2.5 Focusing

2.6 Soundness and completeness

2.7 Notes

**3 Foundational Proof Certificates**

3.1 Proof as trusted communication

3.2 Augmented sequent calculus

3.3 Running example: CNF decision procedure

3.4 Running example: oracle strings

3.5 Running example: decide depth

3.6 Running example: binary resolution

3.7 Checkers, kernels, clients and certificates

3.8 Notes

**II Logics without fixed points**

**4 Logic programming in intuitionistic logic**

4.1 Logic and computation

4.2 Logic programming

4.3 Prolog

4.4 FPC kernels

4.5 Notes

**5 Certificate pairing**

5.1 Implicit and explicit versions of proof

5.2 Pairing of FPCs

5.3 Elaboration and distillation

5.4 Maximally explicit FPCs

5.5 Experiments

5.6 Notes

**6 Determinate checkers**

6.1 Trust and determinate FPCs

6.2 Functional checkers in OCaml

6.3 Verified checkers in Coq

6.4 Extraction of verified checkers

6.5 FPCs by reflection in Coq

6.6 Notes

**7 Unsatisfiability certificates**

7.1 Boolean satisfiability

7.2 Redundancy properties and shallow certificates

7.3 Resolution FPCs and traces

7.4 Unsatisfiability FPCs with cut

7.5 Cut-free unsatisfiability FPCs

7.6 Notes

**8 Certification of theorem provers**

8.1 Towards FPCs in the large

8.2 The automated theorem prover Prover9

8.3 Resolution certificate elaboration

8.4 Certification workflow

8.5 Analysis of results

8.6 The next 700 certificate formats

8.7 Notes

**III Logics with fixed points**

**9 Fixed points in logic**

9.1 Fixed points and equality as logical connectives

9.2 Focused sequent calculus

9.3 Augmentations and kernels

9.4 Nominal abstraction

9.5 Notes

**10 Proof search with fixed points**

10.1 Automating logic

10.2 Abella

10.3 FPC kernels

10.4 Examples

10.5 Notes

**11 Proof outlines**

11.1 Frege proofs

11.2 Case study

11.3 Certificate design

11.4 Logic support

11.5 Certificate families: simple outlines

11.6 Certificate families: administrative outlines

11.7 Experiments

11.8 Notes