## Standard and Nested Proof Theory

This chapter is concerned with the basics of the » structural approach « to proof theory, and it provides background for the rest of the document, starting with the standard proof theory developed in the formalisms of natural deduction and the sequent calculus. In particular, our study of proof systems and their computational interpretations will have a strong emphasis on normal forms that can be obtained by a purely syntactic process of permutations of rule instances. The tools required to study permutations are defined, and the standard proof systems for intuitionistic logic are recalled, as they will form the basis for most of the developments of other chapters — those concerned with typed λ-calculi.

The non-standard contents of this chapter are the descriptions of two formalisms based on the » deep inference « methodology, called the calculus of structures and nested sequents, which are the setting in which we attempt to lay the foundations for a logical approach to refined computational models. The syntactic nature of the proof objects using nested deduction, where inference rules are not applied only at top level as in shallow formalisms, is such that more permutations of rule instances are allowed, yielding more » bureaucracy « but also more possibilities for the design of rules.

The basic definitions are given, as well as tools required to handle permutations, which can be more complex in this setting than in shallow formalisms, in both the calculus of structures and nested sequents, and with examples from classical logic.

### Proofs and Standard Logical Formalisms

In the early times of formal logic, the notion of proof did not provide any object that could be manipulated easily, due to the lack of a simple syntax rich enough to describe the fundamental structure of deductive reasoning. Indeed, for a long time the only important structure in deductive logic was the modus ponens scheme, which is, informally, the following rule: if A implies B and A holds, then B holds, where A and B are any valid logical propositions. For example, the so-called Hilbert systems [BH34] usually have only this rule as an inference rule, and can represent different logics using different sets of axioms. All theorems are then derived from axioms by composition, using this rule. In such a setting, it is difficult to develop a theory of proofs as mathematical objects, and this reflects the main interest in the field at the time, which was to know whether a proposition is provable, rather than knowing if there are different proofs of the same proposition and how they relate.

The situation changed radically when Gentzen established the foundations of structural proof theory, by introducing the formalisms of natural deduction and the sequent calculus, in his seminal papers [Gen34, Gen35]. In these formalisms as in other modern logical formalisms, proof objects are part of a well-structured theory, organised in layers, from the level of logical propositions to the level of inference, providing a syntax for the result of a proof construction process. We provide here a description of these layers, as well as meta-level observations on this theory.

**Logical Formulas and Judgements**

In the propositional setting, we assume given a countably infinite set A of atoms, denoted by small latin letters such as a, b and c. When needed, we will also assume given a bijective function on A, called negation and written a ↦ ā, such that ā ≠ a for any a. Atoms are meant to represent basic logical propositions that can be formed in the language of discourse. In a first-order setting, with quantifiers, this set must be generalised to a set of infinitely many predicates of all possible arities, which can be applied to terms, as usually defined in the literature [TS96].

The language of a logic is formed, based on atoms or predicates, by connectives used to compose propositions into complex ones. Connectives can have different arities — although they are usually considered to have either zero, one, two, or an arbitrary and variable number of arguments — and nullary connectives are called units, because they are used to represent the units of other connectives.

**Definition 1.1.** A formula is an expression built from atoms, connectives and units, viewed through its syntax tree, where inner nodes are connectives and leaves are atoms or units — and a node formed by some n-ary connective has n child nodes.

Given a particular logic, its logical language of formulas is usually given by a recursive grammar. Similarly, we can define formulas inductively, accepting atoms and units as valid formulas, and considering connectives as functions from formula tuples to formulas. We denote formulas by capital latin letters such as A, B, C.
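To make the inductive view concrete, formulas can be rendered as a recursive datatype whose values are exactly the syntax trees of Definition 1.1. The following is a hypothetical sketch for a classical propositional language (the names `Formula`, `leaves`, and the constructors are illustrative, not notation from the text):

```haskell
-- A sketch of Definition 1.1: formulas as syntax trees, for a classical
-- propositional language with one unary and two binary connectives.
data Formula
  = Atom String            -- leaf: an atom
  | Top | Bot              -- leaves: units of conjunction and disjunction
  | And Formula Formula    -- inner node: binary conjunction
  | Or  Formula Formula    -- inner node: binary disjunction
  | Neg Formula            -- inner node: unary negation
  deriving (Eq, Show)

-- The leaves of the syntax tree are exactly the atoms and units,
-- as required by the definition.
leaves :: Formula -> [Formula]
leaves f = case f of
  And a b -> leaves a ++ leaves b
  Or  a b -> leaves a ++ leaves b
  Neg a   -> leaves a
  _       -> [f]
```

Each n-ary connective becomes a constructor with n formula arguments, so the inductive definition and the tree view coincide by construction.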

**Example 1.2.** In classical logic, we have the two connectives ∧ and ∨, which represent conjunction and disjunction respectively, so that we can write the formula A ∨ (B ∧ C), which should be read as » either A holds or both B and C hold, or all of them hold «, given some formulas A, B and C.

In some situations, it might prove useful to consider that the connectives used in a logic have good properties, such as associativity and commutativity. This can be done for example by using a congruence relation on formulas and then dealing with the equivalence classes. Indeed, the intended meaning of connectives usually matches these properties, and we can then write, in classical logic, A ∨ B ∨ C rather than (A ∨ B) ∨ C or A ∨ (B ∨ C), or any commutative variant of these. This relates to the use of n-ary connectives, since a formula ⋁ᵢ Aᵢ written with an n-ary disjunction is a way of avoiding the question of associativity.

It is also common to extend the negation function to formulas, using a bijection between formulas, such as ¬ in classical logic. It can then interact with negation as defined for atoms, if ¬a = ā for any a. Moreover, it defines important dualities between two connectives, as between the conjunction and disjunction connectives in classical logic, where ¬(A ∨ B) = ¬A ∧ ¬B and ¬(A ∧ B) = ¬A ∨ ¬B.
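The De Morgan dualities and the involutivity of negation give a simple rewriting procedure pushing negation down to the atoms, producing what is usually called negation normal form. A hypothetical sketch (the function name `nnf` and the datatype are illustrative, not from the text):

```haskell
data Formula = Atom String | Neg Formula
             | And Formula Formula | Or Formula Formula
  deriving (Eq, Show)

-- Push negation to the atoms using the De Morgan dualities
-- ¬(A ∨ B) = ¬A ∧ ¬B and ¬(A ∧ B) = ¬A ∨ ¬B, together with ¬¬A = A.
nnf :: Formula -> Formula
nnf (Neg (And a b)) = Or  (nnf (Neg a)) (nnf (Neg b))
nnf (Neg (Or  a b)) = And (nnf (Neg a)) (nnf (Neg b))
nnf (Neg (Neg a))   = nnf a
nnf (And a b)       = And (nnf a) (nnf b)
nnf (Or  a b)       = Or  (nnf a) (nnf b)
nnf f               = f   -- atoms and negated atoms are left unchanged
```

For example, `nnf (Neg (Or (Atom "a") (Atom "b")))` yields `And (Neg (Atom "a")) (Neg (Atom "b"))`, the syntactic counterpart of the first duality above.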

During the process of validating a given formula with respect to a certain logic, showing it is a theorem of this logic, we can keep track of which parts of this formula have been treated and which ones are left to be handled. Parts of the initial formula are then often organised in two groups: subformulas being considered as valid and used as assumptions, and other subformulas left to validate, the goal of the proof construction process. This is formalised through the following notion.

**Definition 1.3.** A sequent is a pair Γ ⊢ Δ where Γ and Δ are possibly empty multisets of formulas; Γ is called the antecedent of the sequent, and Δ its succedent.

We are using multisets here to simplify definitions and notations, but one should notice that plain sets can also be used, and some logics require sequents to use lists, such as non-commutative [AR99] or cyclic logics [Yet90]. A sequent is sometimes called a judgement, as it is a statement on the acceptability of formulas.
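The choice of multisets means that sequents are compared up to reordering of formulas, but not up to duplication. One way to sketch this (a hypothetical encoding, with lists standing in for multisets; the names `Sequent`, `:|-` and `sameSequent` are illustrative):

```haskell
import Data.List (sort)

data Formula = Atom String | Imp Formula Formula
  deriving (Eq, Ord, Show)

-- A sequent Γ ⊢ Δ: antecedent and succedent as multisets of formulas,
-- encoded here as lists compared up to reordering.
data Sequent = [Formula] :|- [Formula]

-- Multiset equality: same formulas with the same multiplicities,
-- regardless of the order in which they are listed.
sameSequent :: Sequent -> Sequent -> Bool
sameSequent (g1 :|- d1) (g2 :|- d2) =
  sort g1 == sort g2 && sort d1 == sort d2
```

Note that `[Atom "a"] :|- []` and `[Atom "a", Atom "a"] :|- []` are different sequents under this encoding, which is exactly the distinction between multisets and plain sets mentioned above.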

**Inference Rules, Proofs and Systems**

Now that the level of logical propositions is defined, we can proceed to the level of inference — the act of deducing, from a set of premises, that a given conclusion is valid. The rich structure of modern logical formalisms comes from the syntax they provide for the various deduction mechanisms available in a logic. Indeed, we can define a syntax for one inference step following a valid scheme, as shown below.

An inference rule is given as a scheme, where variables are meant to be replaced with actual formulas and multisets of formulas. A rule having no premise is called an axiomatic rule. An instance of an inference rule is then any instantiation of the scheme associated with the rule. The meaning of a rule is that from a set of sequents of the shapes described by its premises, and accepted as valid, we can conclude that another sequent, of the shape described by the conclusion of the rule, is valid in the considered logic.

The point of using such inference rule instances is that they can be composed, in a syntactic way, to represent the causal chain that justifies the validity of a sequent by the validity of other sequents, or axioms. The fundamental object of structural proof theory is then such a composition, which represents a flow of inference steps, a complex deduction. Because of the branching structure induced by the possibility of having several premises in a rule, it is represented as a tree. Such a derivation will be denoted by the letter D, possibly with indices.

**Definition 1.6.** A derivation is a rooted tree where nodes are inference rule instances, such that for any node v with a parent w, there is one premise in w which is equal to the conclusion of v.

This can lead to the definition of highly complex inference rules, more difficult to deal with, but which have benefits when it comes to defining normal forms, where a class of proofs considered equivalent is represented by one proof designated as the canonical representative [Cha08]. We will always use the notation with a double inference line to indicate that a rule is a synthetic inference rule — or any compound inference step.

Although derivations are the central object of the theory, most of the work done in the field has been concentrated on a particular class of derivations, those having no open premises, thus representing the result of a completed proof construction process.

**Definition 1.8.** A proof is a derivation where all leaves are axiomatic instances.
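Definitions 1.6 and 1.8 can be sketched together as a tree datatype, where an open premise is a leaf sequent not yet justified by any rule instance, and a proof is a derivation without such leaves. This is a hypothetical encoding (the names `Derivation`, `Inst`, `Open` and `isProof` are illustrative, not from the text):

```haskell
data Formula = Atom String | Imp Formula Formula deriving (Eq, Show)
type Sequent = ([Formula], [Formula])   -- antecedent and succedent

-- A derivation (Definition 1.6): a rooted tree of rule instances.  A node
-- carries a rule name, its conclusion, and one subderivation per premise;
-- the conclusion of each subderivation serves as the matching premise.
data Derivation
  = Inst String Sequent [Derivation]  -- rule instance: name, conclusion, premises
  | Open Sequent                      -- an open premise, not yet justified

-- Definition 1.8: a proof is a derivation with no open premises, so that
-- every leaf is an instance of an axiomatic rule (empty premise list).
isProof :: Derivation -> Bool
isProof (Open _)      = False
isProof (Inst _ _ ds) = all isProof ds
```

For instance, a single axiom instance `Inst "ax" ([Atom "a"], [Atom "a"]) []` is a proof, while any derivation still containing an `Open` leaf is not.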

The final layer to define here is the representation of the logical setting, defining which deductive steps are considered valid and which ones are not. Just as Hilbert systems are defined by a set of axioms, a system in structural proof theory is defined by a set of inference rules. The connection to the more general notion of logic — as a set of theorems, being a subset of all propositions that can be built on a given language — goes through the proof construction process, since one has to build a proof for a given proposition using only this set of inference rules to show that it is indeed a valid theorem.

**Definition 1.9.** A proof system S is a set of inference rules, and it is a system for a given logic L when for any formula A in the language of L, there is a proof of ⊢ A in S if and only if A is a theorem of L.

A derivation is described as a derivation in the system S if it is built using only inference rules from S. Notice that there are two directions in the correspondence between a proof system and some logic. In the first one, where the existence of a proof for the formula A in the system S implies that A is a theorem of the logic L, we say that S is sound with respect to L. In the other direction, where there exists a proof of A in S for any given theorem A of L, we say that S is complete with respect to L. Since any proof system defines a logic, as the set of all formulas for which one can build a proof, these notions of soundness and completeness can also be used between proof systems.

**Natural deduction and sequent style.** The definition given here for inference rules, and the use of trees to represent derivations, is common ground for most traditional logical formalisms. Restrictions can then be imposed to obtain good properties: for example, the sequent calculus formalism emphasises the use of analytic inference rules, those where all formulas in the premises are subformulas of formulas in the conclusion — and usually, only the cut rule does not conform to this standard, and is therefore eliminated.

#### Logical Flow and the Structure of Proofs

The step from proof theory, in the tradition of Hilbert and Gödel, to structural proof theory in the style of Gentzen and Prawitz is to consider in detail the structure of derivations, and in particular which rules are used, on which occurrences of formulas, in order to find possible transformations of this structure. This has become a central interest in many works dedicated to fine-grained proof systems.

A useful tool in this regard is the notion of logical flow graph [Bus91], an idea that can also be found in studies of substructural logics such as linear logic [Gir87], used to track particular occurrences of a formula and study how they spread in a derivation through the application of inference rules. Following the definition given by Buss, we consider occurrences of formulas or subformulas.

**Definition 1.10.** A particle of a derivation D is an occurrence of a formula or of a subformula A that appears in a sequent within D.

In order to use these occurrences, we will have to compare them, but there is a difficulty if two occurrences of the same formula appear in the same sequent, since there is no fixed order in a multiset. Therefore, we consider that a given rule instance attributes an index i ∈ ℕ to each formula and subformula in its premises and conclusion, which respects the scheme given by the inference rule. This index is part of the information that one can extract from a given particle.
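Within a single formula, the same idea applies: two occurrences of the same subformula are distinguished by their positions in the syntax tree. A hypothetical sketch, indexing each occurrence by its path from the root (the name `occurrences` and the path encoding are illustrative, not from the text):

```haskell
data Formula = Atom String | And Formula Formula | Or Formula Formula
  deriving (Eq, Show)

-- Enumerate all subformula occurrences, each paired with its position:
-- the path of child indices from the root of the syntax tree.  Two
-- occurrences of the same subformula thus remain distinguishable.
occurrences :: Formula -> [([Int], Formula)]
occurrences f = ([], f) : case f of
  And a b -> sub a b
  Or  a b -> sub a b
  Atom _  -> []
  where sub a b = [ (0 : p, g) | (p, g) <- occurrences a ]
               ++ [ (1 : p, g) | (p, g) <- occurrences b ]
```

For example, in `Or (Atom "a") (Atom "a")` the two occurrences of the atom are told apart by their paths `[0]` and `[1]`, even though the subformulas themselves are equal.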

**Table of contents**

**Preliminaries**

**1 Standard and Nested Proof Theory**

1 Proofs and Standard Logical Formalisms

1.1 Logical Formulas and Judgements

1.2 Inference Rules, Proofs and Systems

1.3 Logical Flow and the Structure of Proofs

1.4 Permutations of Rule Instances

2 Standard Intuitionistic Systems

2.1 The Natural Deduction System NJ

2.2 Detour Elimination

2.3 The Sequent Calculus LJ

2.4 Cut Elimination

3 Deep Inference and Nested Proof Systems

3.1 The Calculus of Structures

3.2 Nested Sequents

3.3 Logical Flow and Permutations

3.4 Normal Forms in Nested Proof Systems

**2 Logical Foundations for Computation**

1 Proof Normalisation as Functional Computation

1.1 Natural Deduction and Typed λ-terms

1.2 Detour Elimination as β-reduction

2 Cut Elimination and Explicit Substitutions

2.1 The λ-calculus with Explicit Substitutions

2.2 Cut in Natural Deduction and Explicit Substitutions

2.3 The λ-calculus with Pure Explicit Substitutions

2.4 The Sequent Calculus and Pure Explicit Substitutions

3 Linear Logic and Resources

3.1 A Linear Decomposition of Classical Logic

3.2 Fragments of Linear Logic

3.3 Computational Significance

4 Proof Search as Logical Computation

4.1 Computational Interpretations of Formulas

4.2 Normal Forms and Proof Search

**3 Intuitionistic Logic in Nested Sequents**

1 Intuitionistic Nested Sequent Systems

1.1 Basic Definitions

1.2 A Family of Intuitionistic Proof Systems

1.3 Correspondence to the Sequent Calculus

2 Cut Elimination

2.1 Preliminaries

2.2 Weak Linearisation

2.3 Merging Proofs

2.4 Eliminating Cuts

3 Local Normalisation

**4 Intuitionistic Logic in the Calculus of Structures**

1 A System in Sequent Style

1.1 Basic Definitions

1.2 Correspondence to the Sequent Calculus

1.3 Cut Elimination and Normalisation

2 A System in Natural Deduction Style

2.1 Basic Definitions

2.2 Correspondence to Natural Deduction

3 Detour Elimination

**5 Nested Typing for Explicit Substitutions**

1 Typing with Nested Sequents

1.1 Nested Typing Judgements

1.2 Nested Typing for Pure Explicit Substitutions

1.3 Properties of Nested Typing with Sequents

2 Cut Elimination as Reduction

2.1 Reduction in the λx-calculus

2.2 Reduction in Other Calculi

3 Typing with the Calculus of Structures

3.1 Uniform Typing Structures

3.2 Uniform Typing for λ-calculi with Explicit Substitutions

3.3 Properties of Nested Typing with Structures

4 Detour Elimination as Reduction

4.1 Reduction in the λx-calculus

4.2 Reduction in Other Calculi

**6 Nested Typing and Extended λ-calculi**

1 Contraction and Resources

1.1 Contraction in Nested Proof Systems

1.2 Syntax of the λr-calculus

2 Reduction in the λr-calculus

2.1 Operational Properties of λr

2.2 Nested Typing for λr

3 Switch and Communication

3.1 The Decomposition of Context Splitting

3.2 Syntax of the λc-calculus

4 Reduction in the λc-calculus

4.1 Operational Properties of λc

4.2 Nested Typing for λc

**7 Nested Focusing in Linear Logic**

1 Linear Logic in the Calculus of Structures

1.1 The Symmetric Linear System SLS

1.2 Correspondence to the Sequent Calculus

1.3 From Equations to Inference Rules

2 Systems with Explicit Polarities

2.1 Polarised Formulas and Calculi

2.2 The Polarised System LEP

3 From Polarities to Focusing

3.1 The Focused System LEF

3.2 The Grouped System LEG

4 Completeness and Relation to Sequent Calculi

4.1 Internal Proof of the Focusing Property

4.2 Correspondence to Standard Focusing

**8 Proof Search as Reduction in the λ-calculus**

1 Proof Search as Rewriting

2 A Restricted Intuitionistic System

2.1 Restrictions on Structures and Rules

2.2 Termination and the JSLb Proof System

2.3 Closing JSLb Proofs

3 Encoding Reduction in Proof Search

3.1 Computational Adequacy

3.2 Search Strategies and Evaluation Order

4 Focused Proof Search and Strategies

4.1 The JSLn Focused System

4.2 Focusing as Big-step Computation