2nd Workshop on Coq and Rewriting

LIX, Palaiseau, 22-24 September 2004


This workshop will be held in conjunction with the visit at LIX of several people from Jose Meseguer's team for the CNRS-Urbana cooperation project, and with a GDR-ALP meeting of the group "logic, algebra and computation". It is of interest to the Modulogic and Averroes projects.

Scope: The aim of this workshop is to get an idea of the different directions of research on Coq and rewriting.

If you want to participate, send an email to Frédéric Blanqui before September 3rd. Registration is free and includes lunches. We do not provide accommodation; you will have to find a hotel in Paris yourself.

List of participants

Workshop place: Laboratoire d'Informatique de l'Ecole Polytechnique, Palaiseau [access]
Tel: 01 69 33 40 73 (Evelyne Rayssac)


Wednesday 22 September

09:30 Welcome coffee

10:30 Session 1: automated theorem provers

10:30 Manuel Clavel (Madrid): integrating decision procedures in reflective rewriting-based theorem provers

We propose a design for the integration of decision procedures in reflective rewriting-based equational theorem provers. Rewriting-based equational theorem provers use term rewriting as their basic proof engine; they are particularly well suited for proving properties of equational specifications. A reflective rewriting-based theorem prover is itself an executable equational specification, which has reflective access to the rewriting engine responsible for its execution in order to efficiently accomplish rewriting-based equational proofs. This reflective access means that the built-in rewriting engine can be called with different rewriting commands. This opens up the possibility of interpolating calls to appropriate decision procedures in the midst of a rewriting-based proof step, typically to solve a condition in the application of a conditional equation. To illustrate our proposal and show its feasibility, we explain how our reflective design ideas can be used to integrate a decision procedure for Presburger arithmetic in the ITP tool, a reflective rewriting-based equational theorem prover, written entirely in Maude, for proving properties of Maude equational specifications.
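
As a hypothetical illustration of the kind of interaction described above (not an example taken from the talk), consider rewriting with a conditional equation whose condition is an arithmetic constraint:

    min(x, y) = x   if   x <= y

When such an equation is applied during a rewriting-based proof, the reflective engine can hand the instantiated condition x <= y over to a Presburger decision procedure instead of trying to establish it by further rewriting.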

11:10 Damien Doligez (INRIA): Zenon, a first-order prover with coq-checkable output [ps]

Zenon is a theorem prover based on the tableau method that can do proof reconstruction: once a proof is found, it can be output in a coq-checkable format. I will show how keeping track of the proof enables several optimizations of the proof search and allows the use of weakening and cut rules in a fully automatic prover.

11:50 David Delahaye (CNAM): checking Zenon proofs in Coq [ps]

We present a tool to automatically verify in Coq the proofs produced by Zenon. Zenon is a first-order automated theorem prover based on a tableau-like method, intended to support the Focal system. The verification is made in two steps: first, the refutation rules of Zenon's method are translated into usual rules of sequent calculus with an explicit excluded-middle rule; next, these new rules are encoded in Coq. Regarding the implementation, two versions have been carried out: one with raw output and another using Coq's own pretty-printer. Both versions are part of the Zenon system; while the first turns out to be quite efficient for huge examples (from the TPTP library, for instance), the second should be more appropriate when integrated into Focal (with smaller examples in general).

12:30 Lunch

14:00 Session 2: termination of rewriting

14:00 Santiago Escobar (Valencia): natural rewriting and narrowing for general term rewriting systems [pdf]

Although originally introduced as a theorem proving method to solve equational unification problems, narrowing was soon recognized as a key mechanism to unify functional and logic programming. In fact, in a declarative setting where programming and proving can be seamlessly integrated, narrowing serves as a unified mechanism for both evaluation and deduction. For narrowing to be an efficient evaluation mechanism, several lazy narrowing strategies have been proposed, although typically for the restricted case of left-linear constructor-based rewrite systems. These assumptions, while reasonable for functional programming applications, are too restrictive for a much broader range of applications to which narrowing can be fruitfully applied. In this paper, we propose an efficient lazy narrowing strategy called natural narrowing which can be applied to a larger class of rewrite systems without the above assumptions. This strategy is more efficient than all previously known strategies when applied to the class of left-linear constructor-based rewrite systems. An important consequence of this generalization is the wide range of applications that can now be efficiently supported by narrowing. We highlight a few such applications, including symbolic model checking, theorem proving, and partial evaluation. What thus emerges is a general and efficient unified mechanism based on narrowing that seamlessly integrates a very wide range of applications in programming and proving.
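
As a small illustration of narrowing as a mechanism for both evaluation and deduction (a standard textbook example, not taken from the talk), consider the rules

    0 + y    -> y
    s(x) + y -> s(x + y)

and the goal of finding x such that x + s(0) equals s(s(0)). Narrowing the goal term with the second rule instantiates x to s(x') and yields s(x' + s(0)); narrowing again with the first rule instantiates x' to 0 and yields s(s(0)). Composing the substitutions gives the solution x = s(0).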

14:40 Claude Marché (Paris 11): proving termination of membership equational programs [ps]

Advanced typing, matching, and evaluation strategy features, as well as very general conditional rules, are routinely used in equational programming languages such as, for example, ASF+SDF, OBJ, CafeOBJ, Maude, and equational subsets of ELAN and CASL. Proving termination of equational programs having such expressive features is important but nontrivial, because some of those features may not be supported by standard termination methods and tools, such as Mu-Term, CiME, AProVE, TTT, Termptation, etc. Yet, use of the features may be essential to ensure termination. We present a sequence of theory transformations that can be used to bridge the gap between expressive equational programs and termination tools, prove the correctness of such transformations, and discuss a prototype tool performing the transformations on Maude equational programs and sending the resulting transformed theories to some of the aforementioned tools.

15:20 Benjamin Wack (LORIA): strong normalization in the simply-typed Pure Pattern Type System [pdf]

Pure Pattern Type Systems (PPTS) embed the lambda-calculus and rewriting in a single formalism. They endow all the basic ingredients of rewriting with first-class status. PPTS terms can be typed in a collection of 8 type systems extending Barendregt's lambda-cube. The basic typing properties have already been studied, but strong normalization remained an open problem until this year. I will present the formalism together with its type systems, and give a proof of strong normalization for the simply-typed system, by a faithful translation into System F omega. Together with the consistency of normalizing terms (already proved before), this result makes PPTS a good candidate for a proof-term language integrating deduction and computation.

16:00 Break

16:20 Session 3: certification of termination

16:20 Sébastien Hinderer (LORIA): certification of termination proofs by polynomial interpretations [ps]

Proof assistants based on type theory can use only terminating functions. Hence, to be able to use complex function definitions, one has to provide a way to ensure their termination. External tools can be used for this, but one would also like to check the proofs they find within the proof assistant itself. To achieve this, it is necessary to formalize the criteria classically used by these external tools. We will start by describing the polynomial interpretation termination criterion we are interested in. We will then give some information about its formalization in the proof assistant Coq. We will continue by describing how the construction of a termination proof has been automated. Finally, we will describe an interface between Coq and CiME that automates the search for a polynomial interpretation for a given term rewriting system.
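
As a small worked instance of the criterion (a standard textbook example, not taken from the talk), consider the rules

    add(0, y)    -> y
    add(s(x), y) -> s(add(x, y))

With the interpretation [0] = 1, [s](x) = x + 1 and [add](x, y) = 2x + y over the natural numbers, each left-hand side is interpreted by a polynomial strictly greater than that of the corresponding right-hand side (y + 2 > y and 2x + y + 2 > 2x + y + 1), and the interpretations are strictly monotonic, so the system terminates. Certifying such a proof in Coq amounts to formally checking these inequalities and monotonicity conditions.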

17:00 Thierry Hubert (Paris 11): certification of termination proofs based on dependency pairs [ps]

In 1997, Arts and Giesl proposed new termination criteria based on so-called dependency pairs, leading to powerful implementations of automated search for termination proofs, such as the one in the CiME rewrite toolbox. Due to the complex underlying algorithms (on graphs, constraint resolution, etc.), the question of confidence in the obtained proofs becomes crucial: indeed, termination proofs become so complex that they can no longer be verified manually. We propose a method to generate traces from CiME termination proofs, in the form of Coq scripts, so that the termination of the original system is certified within Coq.
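
To fix intuitions (again on a standard textbook example rather than one from the talk), the rules

    add(0, y)    -> y
    add(s(x), y) -> s(add(x, y))

give rise to a single dependency pair, ADD(s(x), y) -> ADD(x, y). The dependency graph has one cycle on this pair, and since its first argument strictly decreases (for instance by the subterm relation), the criterion concludes termination. A certified trace reproduces exactly this kind of argument inside Coq.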

17:40 End


Thursday 23 September

08:50 Session 4: explicit substitutions

08:50 Emmanuel Polonovski (Paris 7): explicit substitution calculi and their properties [ps]

During this talk I will present some explicit substitution calculi, their properties, and the problems that gave birth to over 13 years of research. I will exhibit some key moments of those years and some recent developments.
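
To give a flavour of the objects involved (a minimal calculus in the style of lambda-x, shown only to fix intuitions; the talk covers many variants), beta-reduction is decomposed into the creation of an explicit substitution and rules that propagate it:

    (lambda x. t) u        -> t <x := u>
    x <x := u>             -> u
    y <x := u>             -> y                          (y different from x)
    (t s) <x := u>         -> (t <x := u>) (s <x := u>)
    (lambda y. t) <x := u> -> lambda y. (t <x := u>)     (y different from x and not free in u)

Much of the research mentioned above concerns which properties (confluence, preservation of strong normalization, simulation of beta-reduction) such calculi do or do not enjoy.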

09:30 François-Régis Sinot (LIX): lambda-calculus with director strings [pdf]

When talking about efficient implementations, one usually thinks of compilation and of lots of minor (though clever) optimisations, which, in the end, may indeed add up to a great improvement. However, a crucial preliminary step is to choose a good reduction strategy -- good not only in the number of beta-steps, but in the total amount of work to be done; and this is known not to be achieved (yet?) by Levy-optimal (i.e. optimal in the number of beta-steps) implementations of the lambda-calculus. Moreover, in proof assistants, which usually ensure strong normalization of the terms considered, a much wider range of reduction strategies may be considered than in programming languages, where reduction is usually forbidden under lambdas. Thus one may consider new strategies that perform some reductions under abstractions, and thereby share this work when it would otherwise be duplicated. We propose such a strategy, called the Closed Reduction strategy. This strategy relies on testing whether some subterms are closed or have certain free variables, and this is why we introduce the Director Strings calculus, a general framework that provides an efficient implementation of the Closed Reduction strategy by internalizing the information about free variables in the system itself. Moreover, the Director Strings calculus is interesting in itself: it is completely nameless (there are not even indices), fully simulates beta-reduction, enjoys (ground) confluence and preserves strong normalization. In this talk, we will try to avoid technicalities as much as possible and emphasize the intuitions behind them. We will also conclude with experimental benchmarks.

10:10 Delia Kesner (Paris 7): first-order representation of higher-order rewrite systems

We define a formal encoding from higher-order rewriting into first-order rewriting modulo an equational theory E. In particular, we obtain a characterization of the class of higher-order rewriting systems which can be encoded by first-order rewriting modulo an empty equational theory. This class includes of course the lambda calculus. Our technique does not rely on the use of a particular substitution calculus but on an axiomatic framework of explicit substitutions capturing the notion of substitution in an abstract way. The axiomatic framework specifies the properties to be verified by a substitution calculus used in the translation. Thus, our encoding can be viewed as a parametric translation from higher-order rewriting into first-order rewriting, in which the substitution calculus is the parameter of the translation.

10:50 Break

11:10 Session 5: certification of matching

11:10 Antoine Reilles (LORIA): certification of matching compilation [pdf]

We propose a method for certifying the compilation of (syntactic) pattern matching into an if-language manipulating term representations. We define a mapping between algebraic terms and these term representations. This mapping makes the if-language independent of any term library. A prototype is being implemented in TOM, to verify the work of the TOM compiler.
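
As a toy illustration of what certifying such a compilation can mean (a minimal hypothetical sketch in Coq, unrelated to the actual TOM if-language and term representations), one can prove that the decomposition of a pattern match into explicit symbol tests and subterm accesses computes the same function as the source-level match:

    Inductive term : Type :=
      | A : term
      | F : term -> term.

    (* Source-level match on the pattern F x. *)
    Definition match_src (t : term) : option term :=
      match t with
      | F x => Some x
      | _ => None
      end.

    (* "If-language" style: test the head symbol, then access the subterm. *)
    Definition is_F (t : term) : bool :=
      match t with F _ => true | _ => false end.

    Definition subterm_1 (t : term) : option term :=
      match t with F x => Some x | _ => None end.

    Definition match_compiled (t : term) : option term :=
      if is_F t then subterm_1 t else None.

    (* The compiled code is observationally equal to the source match. *)
    Lemma match_compiled_correct : forall t, match_compiled t = match_src t.
    Proof. destruct t; reflexivity. Qed.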

11:50 Evelyne Contejean (Paris 11): a certified AC matching algorithm

In this paper, we propose a matching algorithm for terms containing function symbols which can be free, commutative or associative-commutative. The algorithm is presented by inference rules, and these rules have been formally proved sound, complete and decreasing in the Coq proof assistant, while the corresponding algorithm is implemented in the CiME system. Moreover, some preparatory work has been done in Coq, such as proving that checking the equality of two terms modulo commutative and associative-commutative theories is decidable.
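
As a quick reminder of why AC matching is more involved than syntactic matching (a generic example, not from the talk), take + to be associative-commutative and match the pattern x + y against the subject a + (b + c). Modulo AC there are several solutions, for instance

    { x := a, y := b + c }
    { x := b, y := a + c }
    { x := c, y := a + b }

whereas syntactic matching returns at most one. A certified algorithm must therefore be proved complete with respect to this whole solution set.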

12:30 Lunch

14:00 Session 6: implementation of rewriting

14:00 Benjamin Grégoire (INRIA-LIX): compilation of beta-iota-reduction

14:40 Olivier Hermant (INRIA-LIX): compilation of beta-reduction and of rewrite rules in a weak evaluator [ps]

The ZAM has been implemented by Grégoire to obtain faster reduction of CIC terms. Our goal is to extend these results to make possible the compilation of rewrite rules (for now without associativity-commutativity) in the Calculus of Constructions. We will mainly present the symbolic calculus and the normalization algorithm.

15:20 Pierre-Etienne Moreau (LORIA): compilation of AU-rewriting in TOM [pdf]

16:00 Break

16:20 Session 7: extensions of type theory

16:20 Nicolas Oury (Paris 11): extensionality in Coq

In the Coq proof assistant, computations --- such as 2+2=4 --- are made by the system and are transparent in the proofs. This results in shorter and easier proofs. Nevertheless, most equalities have to be proved and cannot be used in computations. We want to add these equalities as rewrite rules in the reduction. Such a system is called extensional and is neither decidable nor normalizing. But its study is useful for a better understanding of some extensions of the system with rewrite rules. We study the conservativity of this system with respect to a slight extension of the Calculus of Inductive Constructions.
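
The distinction can be seen on a two-line Coq example (a generic illustration, not taken from the talk): the first equality holds by computation alone, while the second does not reduce and must be proved, so it cannot be used transparently by the conversion rule:

    (* Holds by mere computation: the conversion rule closes the goal. *)
    Lemma two_plus_two : 2 + 2 = 4.
    Proof. reflexivity. Qed.

    (* Does not reduce when n is a variable: an explicit proof is needed. *)
    Lemma add_zero_r : forall n : nat, n + 0 = n.
    Proof.
      induction n; simpl; [ reflexivity | rewrite IHn; reflexivity ].
    Qed.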

17:00 Frédéric Blanqui (LORIA): Calculus of Constructions with implicit congruence closure [ps]

It is commonly agreed that the success of future proof assistants will rely on their ability to incorporate computations within deductions, in order to mimic the mathematician who replaces the proof of a proposition P by the proof of an equivalent proposition P' obtained from P thanks to possibly complex calculations. In this talk, I will present a new version of the Calculus of Constructions in which computations are based on Shostak's congruence closure algorithm. Besides the novelty of the problem itself, a major technical innovation of this work lies in the fact that the computation mechanism varies during proof-checking. This is so because the computation operating on propositions is based upon the set of user hypotheses at hand, and therefore evolves with the proof context. Our main result shows that proof-checking remains decidable when considering the congruence closure of the user's equational hypotheses on algebraic terms. I will also discuss various possible extensions opening quite new and interesting perspectives from a theorem-proving point of view.
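
For instance (an illustration of the intended behaviour rather than an excerpt from the talk), in a proof context containing the hypotheses

    h1 : a = b
    h2 : f b = c

the congruence closure of {a = b, f b = c} identifies f a, f b and c, so a proof of P c can be accepted where a proof of P (f a) is expected, without any explicit rewriting step; and since the closure is computed from the hypotheses at hand, the conversion relation evolves as the proof context grows.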

17:40 End


Friday 24 September

08:50 Session 8: semantics

08:50 Mark-Oliver Stehr (Urbana): the Open Calculus of Constructions and its semantics

The Open Calculus of Constructions (OCC) integrates ideas from dependent type theories, equational logic, and rewriting logic. Intended applications range from the formalization of mathematics to the specification and verification of programs and systems. In this talk we explain the pragmatic motivation behind OCC, which is very different from that of foundational type theories. A classical set-theoretic proof-irrelevance semantics will be given for the predicative and impredicative instances of OCC, and the formal system, which uniformly presents the type system and the operational semantics, will be presented. Examples from programming and interactive theorem proving will be used to illustrate the capabilities of the most recent prototype implementation.

09:30 Germain Faure (LORIA): towards a denotational semantics for the rho-calculus [pdf]

The rho-calculus has been introduced to provide a uniform integration of rewriting and the lambda-calculus. Up to now, this formalism has been studied only from the perspective of operational semantics. In this talk, we propose to take the point of view of denotational semantics by interpreting the rho-calculus in a universal Scott domain. We show that the standard D^infinity model of the lambda-calculus is also a model of a fragment of the rho-calculus with ML-style pattern matching, where structures are considered modulo the ACI theory. We finally discuss the weaknesses of the model (treatment of errors, etc.) as well as the new insights it gives on specific constructions of the rho-calculus, such as structures and errors.

10:10 Benjamin Werner (INRIA-LIX): proof irrelevance in type theory

I will make some remarks about what happens when proof-irrelevance is added to the conversion rule of a type theory like Coq's. In particular, I will try to show that the resulting theory is quite closely related to the formalism of PVS.
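
One common way to phrase the addition (shown here only as an illustration of the kind of rule under discussion) is an extra conversion principle identifying any two proofs of the same proposition:

    if  p : P,  q : P  and  P : Prop,  then  p and q are convertible

so that the type checker never needs to compare the contents of proof terms, only their types.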

10:50 Break

11:10 Session 9: lambda-mu-mu~-calculus

11:10 Pierre Lescanne (ENS Lyon): intersection and union types in the lambda-mu-mu~-calculus [pdf]

Joint work with Daniel Dougherty (Worcester Polytechnic Institute) and Silvia Ghilezan (U. of Novi Sad). The original lambda-mu-mu~ of Curien and Herbelin has a system of simple types, based on sequent calculus, embodying a Curry-Howard correspondence with classical logic. We introduce and discuss three type assignment systems that are extensions of lambda-mu-mu~ with intersection and union types. The intrinsic symmetry in the lambda-mu-mu~-calculus leads to an essential use of both intersection and union types.

11:50 René David (Savoie): strong normalization of the mu-mu~-reduction [ps]

In this talk, I present a combinatorial proof of the strong normalization of the mu-mu~-reduction in the lambda-bar-mu-mu~-calculus of P.-L. Curien and H. Herbelin.

12:30 Lunch

14:00 Session 10: logic

14:00 Stéphane Lengrand (Paris 7): Barendregt's Cube in sequent calculus [ps]

By adding dependent types, polymorphism, and type constructors, we extend Herbelin's sequent calculus for intuitionistic propositional logic. As opposed to Gentzen-style sequent calculi, Herbelin's formalism is a permutation-free sequent calculus: cut-free proofs are in one-to-one correspondence with the normal forms of natural deduction, which avoids redundancy in proof search. Its cut-elimination process simulates beta-reduction. The close relationship with natural deduction enables us to derive the strong normalisation property from that of a Pure Type System. The cut-free fragment, which is easily identified, is the natural space for proof search: tactics are merely the bottom-up application of the typing rules, which is much simpler than in natural deduction, where elimination rules have to be considered.

14:40 Bruno Guillaume (LORIA): vector addition tree automata [pdf]

We introduce a new class of automata, which we call VATA (vector addition tree automata). These automata are a natural generalization of vector addition systems with states, which are themselves equivalent to Petri nets. Then, we prove that the decidability of provability in multiplicative exponential linear logic (which is an open problem) is equivalent to the decidability of the reachability relation for vector addition tree automata. This result generalizes the well-known connection existing between Petri nets and the !-Horn fragment of multiplicative exponential linear logic.

15:20 Khelifa Saber (Savoie): a completeness result for the second order classical deduction system [ps]

We give a completeness result for the second-order classical deduction system. In order to prove strong normalization semantically, an adequacy lemma is necessary; conversely, however, there is no inverse lemma for all types. In this work, we give such a result for a class of types, called positive types.

16:00 Break

16:20 Session 11: rewriting

16:20 Jean-Pierre Jouannaud (LIX): higher-order rewriting revisited

A new comprehensive framework for Higher-Order Rewriting (HOR) is introduced, which combines aspects of earlier frameworks with novel ones: function symbols and variables both have a type and an arity, and rules of functional type are admitted. The study of the Church-Rosser properties of HOR is done abstractly, by reducing the Church-Rosser property of a pair (R,S), where S is a left-linear convergent set of rules and rewriting operates on terms in normal form using pattern matching modulo S, to the joinability of certain critical pairs, under the assumption that the union of rewriting with R and normalisation with S terminates. These results also apply to Nipkow's higher-order rewriting and yield a generalization of Nipkow's results.

17:00 Romain Kervarc (ENS Lyon): Pure Type Systems, cut and explicit substitutions

Pure Type Systems are a general formalism for representing many type systems -- in particular, Barendregt's lambda-cube, including Girard's system F, dependent types, and the calculus of constructions. We build a variant of pure type systems by adding a cut rule associated with an explicit substitution in the syntax, according to the Curry-Howard-de Bruijn correspondence. The addition of the cut requires a new rule for the substitution, with which we are able to show type correctness and subject reduction for all explicit systems. Moreover, we prove that the explicit lambda-cube obtained this way is strongly normalizing.

17:40 End

LIX INRIA CNRS GDR ALP Urbana