Seminars of the Protheo team

The Protheo team organizes a seminar roughly every week, usually on Tuesdays at 2 pm in room A006. For any question, please contact Clément Houtmann.
Tuesday 8 January 2008, 2:00 pm
Zenon: a practical theoretical introduction to (classical) tableaux methods, Richard Bonichon

Tableaux methods have been used since the 1930s to define, study and implement automated theorem provers (ATP) for various logics. Although today's top-of-the-shelf ATP systems are more often resolution-based, using tableaux still offers some advantages, even in the context of classical logic, as we will try to show. In this talk, we will first recall the basic theoretical notions and notations regarding tableaux methods and try to highlight the deep links between tableaux, sequent calculus and cut elimination. We will then present the current state of Zenon, a tableau-based automated theorem prover whose main claim to fame is to produce checkable proofs for first-order classical logic with equality. We will show some of its internals and how one goes from writing a proof input in the Focal Proof Language to checking its output with Coq.
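
As a quick reminder of the kind of rules the talk will recall, here are two textbook tableau expansion rules (a generic illustration, not specific to Zenon):

\[
  \frac{A \land B}{\begin{array}{c} A \\ B \end{array}}
  \qquad\qquad
  \frac{A \lor B}{A \;\mid\; B}
\]

A branch closes as soon as it contains a formula together with its negation; the formula at the root is unsatisfiable exactly when every branch closes.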

Monday 17 December 2007, 2:00 pm, room C103
Semantic labeling for proving termination of term rewriting, Adam Koprowski

Semantic labeling was invented by Zantema in 1995. It is a technique for proving termination of term rewriting by attaching semantics to function symbols, deriving labels from it and using this additional information in termination proofs. Initially it was implemented in automatic termination tools only for finite domains. Later this approach was extended (Koprowski, Zantema) to infinite models in the termination prover TPA, which, most notably, made it possible to prove termination of the difficult SUBST system. In a recent line of research the technique has been improved, giving rise to predictive labeling (Hirokawa, Middeldorp), which was generalized to the dependency pair setting (Koprowski, Middeldorp) and to innermost termination (Thiemann, Middeldorp). In this talk I will try to give an overview of these developments, with a special emphasis on automation for use in automatic termination provers.

Friday 14 December, between 10:30 am and 12:30 pm, room A008
The Essence of Safe Composition, Delia Kesner

Many calculi with explicit substitutions enjoy full composition, but some of them do not enjoy strong normalisation (this result is known as Mellies' counter-example). However, full composition and strong normalisation can live together, leading to a notion of safe composition. The available proofs of strong normalisation for calculi with safe composition are long and tedious because they are indirect: one simulates reduction of the calculus in another strongly normalising reduction relation. Self-contained arguments to prove normalisation properties of calculi with safe composition are important, particularly when integrating them into other, richer frameworks such as type theory; this would reveal the essence of safe composition. In this talk we survey some fundamental aspects of the theory of explicit substitutions and we explain some key characteristics of safe composition, such as those in the calculi of the cube of resources, Milner's lambda-calculus for reactive systems, etc. We provide a simple proof of strong normalisation for a calculus obtained by extending lambda-x with just one reduction rule for composition and one equation for commutation of independent substitutions. The proof uses standard techniques based on a simple perpetuality strategy, which allows us to give an inductive characterisation of strongly normalising terms. Arithmetical or reducibility arguments then allow us to conclude.

Friday 14 December, between 10:30 am and 12:30 pm, room A008
Session Types and Object Oriented Programming, Mariangiola Dezani

We suggest an amalgamation of communication-based programming (centred on sessions) and object-oriented programming, whereby sessions between concurrent threads are amalgamated with methods. In our proposal, threads consist of the execution of session bodies on objects and communicate with each other by asynchronously sending/receiving objects on channels. The choice of how to respond to a session request is based on the name of the request and the class of the object receiving the request; the choice of how to continue a session is made on the basis of the class of the object sent/received. Sessions can be delegated to other sessions, although sessions themselves are not first-class objects.

Wednesday 12 December, 10:30 am, room A008
Principal type specialisation, Laura Lowenthal

Program specialisation is a form of automatic program generation where, starting from a general source program, one produces different residual versions of it, each specialised to certain known data. For example, if we know that the exponent is 3, the recursive function "puissance" (power) can be specialised to obtain the (more efficient, non-recursive) function "puissance3 = fun x -> x * x * x". The most mature approach to program specialisation is based on partial evaluation (1): the known arguments are replaced by the result of their evaluation and combined with the computations that cannot yet be performed; a mixture of computation and code generation then produces the new program. Type specialisation (2), by contrast, is based on type inference: the source program as well as its type are specialised into a residual program and a residual type. Principal type specialisation (3) is a detailed formulation of this system based on the theory of qualified types (4), with the property of generating "principal specialisations": for every source expression and type, it produces a residual expression and type that are "more general" than all correct specialisations. In this talk I will say a bit more about principal type specialisation, its associated problems, and the (very small) contribution I made to this system in my master's thesis. (A small OCaml sketch of the power example is given after the references below.)

(1) Neil Jones et al., Partial Evaluation and Automatic Program Generation. http://www.dina.dk/~sestoft/pebook/pebook.html
(2) John Hughes, Type Specialisation for the lambda-calculus, or A New Paradigm for Partial Evaluation based on Type Inference, http://citeseer.ist.psu.edu/51273.html
(3) John Hughes and Pablo Martínez López, Principal Type Specialization, http://portal.acm.org/citation.cfm?id=568184
(4) Mark Jones, Qualified Types: Theory and Practice, http://portal.acm.org/citation.cfm?id=207528
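
To make the power example above concrete, here is a small OCaml sketch; the names puissance and puissance3 come from the abstract, and the residual function is written by hand, not produced by an actual specialiser:

(* General source program: recursive exponentiation. *)
let rec puissance x n =
  if n = 0 then 1 else x * puissance x (n - 1)

(* Hand-written residual program one would expect after specialising the
   exponent to 3: the recursion has been completely unfolded. *)
let puissance3 x = x * x * x

let () =
  Printf.printf "puissance 2 3 = %d, puissance3 2 = %d\n"
    (puissance 2 3) (puissance3 2)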

Thursday 29 November, 10:30 am, room A006
Presburger arithmetic: syntax, semantics, complexity, Cody Roux

We present Presburger arithmetic, a theory of the natural numbers with addition only, which has the particularity of being decidable. However, any decision algorithm is necessarily slow: M.J. Fischer and M.O. Rabin showed in 1974 that every decision procedure runs in (non-deterministic!) time at least $2^{2^{cn}}$, where $c$ is a constant and $n$ the length of the formula. We give the idea of Fischer and Rabin's proof, an idea that also applies to other theories, notably Tarski's theory of the reals.
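
As an illustration (my own generic example, not one from the talk), the sentence

\[
  \forall x\,\exists y\,\bigl(x = y + y \;\lor\; x = y + y + 1\bigr)
\]

states, using addition only, that every natural number is even or odd; deciding the truth of such sentences is possible, but by the Fischer-Rabin bound recalled above any decision procedure needs time at least $2^{2^{cn}}$ on some formula of length $n$, for some constant $c > 0$.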

Saturday 20 October, 10:30 am, room A006
Proof by induction in the sequent calculus modulo, Fabrice Nahon

We present an original method for proof search by induction that uses narrowing. Its particularity is to be grounded in deduction modulo and to use narrowing both to select induction variables and to choose instantiation schemas. It also makes it possible to translate every successful derivation directly into a proof in the sequent calculus modulo. Soundness and refutational completeness of the method are proved proof-theoretically.
We then extend this first approach to equational rewrite theories consisting of a rewrite system $\calR$ and a set $E$ of equations. As soon as the equational rewrite system $(\calR,E)$ has good termination and sufficient-completeness properties, and provided $E$ preserves constructors, narrowing at the deepest positions where a defined symbol occurs is performed using only unifiers that are also constructor substitutions. This is particularly interesting for associative, or associative-commutative, theories, for which our proof-search system has been refined.

Keywords: deduction modulo, sequent calculus modulo, induction, Noetherian induction, induction by rewriting, equational reasoning, term rewriting, equational rewriting, equational narrowing.

Wednesday 17 October, 9:15 am, room C103
A case study on the adaptation and rapid development of multi-channel online sales applications, Arnaud Bailly

Arnaud Bailly did his PhD in Lille under the supervision of Mireille Clerbout. He has since founded the company OQube (http://www.oqube.com/). His research interests centre on improving software quality. Arnaud will present a case study as well as existing needs for rule-based programming (pattern matching, Tom, strategies, etc.).

Monday 8 October, 2:00 pm, room A008
Weaving rewriting-based access control policies, Anderson Santana de Oliveira

Access control is a central issue among the overall security goals of information systems. Despite the existence of a vast literature on the subject, it is still very hard to ensure the compliance of a large existing system with a given dynamic access control policy. Based on our previous work on formal islands, we provide in this paper a systematic methodology to weave dynamic, formally specified policies into existing applications using aspect-oriented programming. To that end, access control policies are formalized using term rewriting systems, allowing us to specify them in an agile, modular and precise way and to ensure their formal termination. These high-level descriptions are then woven into the existing code, such that the resulting program implements a safe reference monitor for the specified policy. For developers, this provides a systematic process to enforce dynamic policies in a modular and flexible way. Since policies are specified and checked independently before being woven into various applications, the level of reuse is improved. We implemented the approach on test cases with quite encouraging results.

Monday 17 September, 11:00 am, room B011
Modular Access Control via Strategic Rewriting, Anderson Santana de Oliveira

Security policies, in particular access control, are fundamental elements of computer security. We address the problem of authoring and analyzing policies in a modular way using techniques developed in the field of term rewriting, focusing especially on the use of rewriting strategies. Term rewriting supports a formalization of access control with a clear declarative semantics based on equational logic and an operational semantics guided by strategies. Well-established term rewriting techniques allow us to check properties of policies such as the absence of conflicts and the property of always returning a decision. A rich language for expressing rewriting strategies is used to define a theory of modular construction of policies, in which we can better understand the preservation of properties of policies under composition. The robustness of the approach is illustrated on the composition operators of XACML.
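
To fix intuitions, here is a minimal OCaml sketch (my own illustration, not the authors' rewrite-rule encoding; all identifiers are made up) of policies as partial decision functions composed by two strategy-like combinators, one of them in the spirit of XACML's deny-overrides algorithm:

(* Rules either apply (Some decision) or do not apply (None). *)
type decision = Permit | Deny

type request = { subject : string; action : string; resource : string }

type rule = request -> decision option

(* Hypothetical example rules. *)
let admins_can_write : rule = fun r ->
  if r.subject = "admin" && r.action = "write" then Some Permit else None

let everyone_can_read : rule = fun r ->
  if r.action = "read" && r.resource <> "secret" then Some Permit else None

let default_deny : rule = fun _ -> Some Deny

(* Two composition strategies: take the first applicable rule, or let any
   Deny win (in the spirit of XACML's deny-overrides combining algorithm). *)
let first_applicable rules req = List.find_map (fun rule -> rule req) rules

let deny_overrides rules req =
  let ds = List.filter_map (fun rule -> rule req) rules in
  if List.mem Deny ds then Some Deny
  else if List.mem Permit ds then Some Permit
  else None

let () =
  let req = { subject = "guest"; action = "read"; resource = "report" } in
  let policy = [ admins_can_write; everyone_can_read; default_deny ] in
  (* The two strategies disagree on this rule set: first-applicable permits
     (everyone_can_read fires before default_deny), deny-overrides denies. *)
  (match first_applicable policy req with
   | Some Permit -> print_endline "first-applicable: permit"
   | Some Deny | None -> print_endline "first-applicable: deny");
  (match deny_overrides policy req with
   | Some Permit -> print_endline "deny-overrides: permit"
   | Some Deny | None -> print_endline "deny-overrides: deny")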

Monday 10 September, 2:00 pm, room A008
Termination in finite mean time of probabilistic rule systems, Florent Garnier

In this thesis we sought to define a simple formalism for modelling systems that combine non-deterministic phenomena and random behaviours. We chose to extend the rewriting formalism so that it can express probabilistic phenomena, and we then studied the termination of this model within finite mean time. We also introduced a notion of strategy to control the application of probabilistic rewrite rules, and we present general criteria that identify classes of strategies under which probabilistic rewrite systems terminate within finite mean time. To illustrate the formalism and these proof methods, we modelled a network of WiFi stations and show that all stations manage to send their messages within a finite mean time.
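
A toy illustration of the key notion (mine, not taken from the thesis): a single probabilistic rule that, from a, reaches the normal form b with probability p and stays at a with probability 1-p admits arbitrarily long derivations, yet its mean derivation length is finite:

\[
  \mathbb{E}[\mathrm{length}] \;=\; \sum_{k \ge 1} k\,(1-p)^{k-1}\,p \;=\; \frac{1}{p}
  \qquad (0 < p \le 1).
\]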

Thursday 6 September, 2:00 pm, room B013
Unbounded Proof-Length Speed-up in Deduction Modulo, Guillaume Burel

In 1973, Parikh proved a speed-up theorem conjectured by Gödel 37 years earlier: there exist arithmetical formulæ that are provable in first-order arithmetic, but whose shortest proof in second-order arithmetic is arbitrarily smaller than any proof in first-order arithmetic. On the other hand, resolution for higher-order logic can be simulated step by step in a first-order narrowing and resolution method based on deduction modulo, whose paradigm is to separate deduction and computation to make proofs clearer and shorter. We prove that $(i+1)$th-order arithmetic can be linearly simulated in $i$th-order arithmetic modulo some confluent and terminating rewrite system. We also show that there exists a speed-up between $i$th-order arithmetic modulo this system and $i$th-order arithmetic without modulo. All this allows us to prove that the speed-up conjectured by Gödel does not come from the deductive part of the proofs, but can be expressed as simple computation, therefore justifying the use of deduction modulo as an efficient first-order setting simulating higher order.

Wednesday 4 July, 2:00 pm, room C103
Strong Normalization as Safe Interaction, Colin Riba

When enriching the lambda-calculus with rewriting, union types may be needed to type all strongly normalizing terms. However, with rewriting, the elimination rule (UE) of union types may also make it possible to type non-normalizing terms (in which case we say that (UE) is unsafe). This occurs in particular with non-determinism, but also with some confluent systems. It appears that studying the safety of (UE) amounts to characterizing, in a term, the safe interactions between some of its subterms. In this paper, we study the safety of (UE) for an extension of the lambda-calculus with simple rewrite rules. We prove that the union and intersection type discipline without (UE) is complete w.r.t. strong normalization. This allows us to show that (UE) is safe if and only if an interpretation of types based on biorthogonals is sound for it. We also discuss two sufficient conditions for the safety of (UE), and study an alternative biorthogonality relation, based on the observation of the least reducibility candidate.

Monday 2 July, 2:00 pm, room C103
Principles of Superdeduction, Clément Houtmann

In predicate logic, the proof that a theorem P holds in a theory Th is typically conducted in natural deduction or in the sequent calculus, using all the information contained in the theory in a uniform way. Introduced ten years ago, deduction modulo allows us to make use of the computational part of the theory Th as true computations modulo which deductions are performed. Focussing on the sequent calculus, this paper presents and studies the dual concept, where the theory is used to enrich the deduction system with new deduction rules in a systematic, correct and complete way. We call such a new deduction system "superdeduction". We introduce a proof-term language and a cut-elimination procedure, both based on Christian Urban's work on classical sequent calculus. Strong normalisation is proven under appropriate and natural hypotheses, thereby ensuring the consistency of the embedded theory and of the deduction system. The proofs obtained in such a system are much closer to human intuition and practice. We consequently show how superdeduction, along with deduction modulo, can be used to ground the formal foundations of new extensible proof assistants. We finally present lemuridae, our current implementation of superdeduction modulo.
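
As a standard illustration of how a theory gives rise to new inference rules in this style (the inclusion example commonly used in the literature on superdeduction; it is not claimed to be the one of the paper), the definition

\[
  A \subseteq B \;\equiv\; \forall x\,(x \in A \Rightarrow x \in B)
\]

yields the two sequent rules

\[
  \frac{\Gamma,\; x \in A \;\vdash\; x \in B,\; \Delta}
       {\Gamma \;\vdash\; A \subseteq B,\; \Delta}\;\;(x\ \text{fresh})
  \qquad
  \frac{\Gamma \;\vdash\; t \in A,\; \Delta \qquad \Gamma,\; t \in B \;\vdash\; \Delta}
       {\Gamma,\; A \subseteq B \;\vdash\; \Delta}
\]

so that proofs manipulate the inclusion directly instead of unfolding its definition with the usual logical rules.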

Monday 18 June, 11:00 am, room C103
Simple saturated sets for disjunction and second-order existential quantification, Makoto Tatsuta

This paper gives simple saturated sets for disjunction and second-order existential quantification by using the idea of segments in Prawitz's strong validity. Saturated sets for disjunction are defined by $\Pi^0_1$ comprehension and those for second-order existential quantification are defined by $\Sigma^1_1$ comprehension. Saturated-set semantics and a simple strong normalization proof are given for the system with disjunction, second-order existential quantification, and their permutative conversions. This paper also introduces the contraction property for saturated sets, which gives us saturated sets closed under union. This enables us to have saturated-set semantics for the system with union types, second-order existential types, and their permutative conversions, and to prove its strong normalization.

Tuesday 12 June, 2:00 pm, room A006
Confluence of Pattern-Based Calculi, Germain Faure

In this talk, we propose a journey through pattern-based calculi: mainly the lambda calculus with patterns of Vincent van Oostrom, the pure pattern calculus of Delia Kesner and Barry Jay, and the rewriting calculus of Claude Kirchner and Horatiu Cirstea. We will present the different motivations that led to these calculi and compare them. We will show that several approaches are used to obtain confluence but that, in practice, the proof methods share the same structure and that each variation on the way pattern-abstractions are applied needs another proof of confluence. We propose here a generic confluence proof where the way pattern-abstractions are applied is axiomatized. Intuitively, the conditions guarantee that matching is stable by substitution and by reduction. We show that our approach applies directly to different pattern calculi, and we also characterize a class of matching algorithms, and consequently of pattern calculi, that are surprisingly not confluent.

Tuesday 5 June, 2:00 pm, room A006
Syntactical models as a tool to understand the computational capabilities of a calculus, Germain Faure

In this talk, we will not explicitly make use of category theory background material. We will show on two examples (the lambda-calculus and the parallel lambda-calculus) how syntactical models can be a very useful and powerful tool to understand and make explicit the computational capabilities of a calculus.

Tuesday 29 May, 2:00 pm, room A006
Cut Elimination in Deduction Modulo by Abstract Completion, Guillaume Burel

Deduction modulo implements Poincaré's principle by identifying deduction and computation as different paradigms and making their interaction possible. This leads to logical systems like the sequent calculus or natural deduction modulo. Even if deduction modulo is logically equivalent to first-order logic, proofs in such systems are quite different and dramatically simpler, with one cost: cut elimination may not hold anymore. We prove first that it is even undecidable to know, given a congruence over propositions, whether cuts can be eliminated in the sequent calculus modulo this congruence. Second, to recover cut admissibility, we show how computation rules can be added following the classical idea of completion à la Knuth and Bendix. Because in deduction modulo rewriting acts on terms as well as on propositions, the objects are much more elaborate than for standard completion. Under appropriate hypotheses, we prove that the sequent calculus modulo is an instance of the powerful framework of abstract canonical systems and that, therefore, cuts correspond to critical proofs that abstract completion allows us to eliminate. In addition to an original and deep understanding of the interactions between deduction and computation and of the expressiveness of abstract canonical systems, this provides a mechanical way to transform a sequent calculus modulo into an equivalent one admitting the cut rule, therefore extending in a significant way the applicability of mechanized proof search in deduction modulo.
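
For readers unfamiliar with deduction modulo, a standard textbook example (not necessarily the congruence discussed in the talk) replaces the arithmetic axiom $\forall x\,\forall y\,(x \times y = 0 \Leftrightarrow x = 0 \lor y = 0)$ by a rewrite rule on propositions:

\[
  x \times y = 0 \;\longrightarrow\; x = 0 \,\lor\, y = 0
\]

Proving the sequent $a \times b = 0 \vdash a = 0 \lor b = 0$ then follows from the axiom rule alone, applied modulo this rewriting; it is exactly for such congruences that cut admissibility has to be (re)established.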

Thursday 26 April, 2:00 pm, room C103
Normalisation in supernatural deduction and in deduction modulo, Paul Brauner.

Deduction modulo and supernatural deduction are two extensions of predicate logic with computation rules. Whereas the application of these rules is transparent in deduction modulo, in supernatural deduction they are used to build new, non-logical inference rules. In both cases, adding these computation rules threatens the strong normalisation of proofs, which is why several criteria ensuring these good properties have been stated. After an overview of the current state of deduction modulo and supernatural deduction, I will present the result of the paper submitted to CSL with Gilles Dowek: the equivalence between strong normalisation of proofs in the two systems. I will finish with an outlook on future directions for these systems and the problems related to their possible integration.

Tuesday 24 April, 2:00 pm, room A006
Introduction to HKU's security center projects, Eric K. Wang (Hong Kong University).

With increasing emphasis being placed on the development of high-tech industries in Hong Kong, there is a strong need for the Hong Kong government to develop its own pool of experts in information security and cryptography. Against this background, the Center for Information Security and Cryptography (CISC) was established to meet this need. The center is housed in the Department of Computer Science of the University of Hong Kong, with the mission to promote basic and applied research in the areas of information security and cryptography and to facilitate collaboration between academia and industry. The projects and research involved are mainly related to authentication, key management for wireless sensor networks and secure preservation of long-term digital materials. In this talk I will introduce several projects we have carried out (an authentication model, and key management for wireless sensor networks and for long-term encrypted documents).

Tuesday 24 April, 2:45 pm, room A006
Comparing access control policies, Mathieu Jaume (LIP6).

Joint work with Charles Morisset (LIP6).

Tuesday 3 April, 2:00 pm, room A006
Polygraphic programs, Yves Guiraud.

Polygraphic programs define a graphical model of computation that can describe first-order functional programs. We will see that this model is Turing-complete. We will then use polygraphic interpretations to obtain complexity bounds, as well as a new characterisation of the class PTIME of functions computable in polynomial time.

Wednesday 21 March, 4:00 pm, room A006
Anti-Pattern Matching, Radu Kopetz. Joint work with Claude Kirchner and Pierre-Etienne Moreau.

It is quite appealing to base the description of pattern-based searches on positive as well as negative conditions. We would like, for example, to specify that we search for white cars that are not station wagons. To this end, we define the notion of anti-patterns and their semantics, along with some of their properties. We then extend the classical notion of matching between patterns and ground terms to matching between anti-patterns and ground terms. We provide a rule-based algorithm that finds the solutions to such problems and prove its correctness and completeness. Anti-pattern matching is by nature different from disunification and, quite interestingly, the anti-pattern matching problem is unitary. The concept is therefore appropriate to ground a powerful extension to pattern-based programming languages, and we show how this is used to extend the expressiveness and usability of the Tom language.
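
The following OCaml sketch is my own toy model of the idea, not Tom's implementation; it ignores variable bindings and the subtleties negation introduces with them, and only illustrates the intended semantics on the "white cars that are not station wagons" example:

(* Ground terms, patterns with variables, and anti-patterns via a complement. *)
type term = App of string * term list              (* e.g. car(white, sedan) *)

type pattern =
  | PVar of string                                 (* matches any term *)
  | PApp of string * pattern list                  (* a symbol applied to sub-patterns *)
  | Anti of pattern                                (* !p: matches iff p does not match *)

(* Does pattern p match the ground term t? *)
let rec matches p t =
  match p, t with
  | PVar _, _ -> true
  | Anti q, _ -> not (matches q t)
  | PApp (f, ps), App (g, ts) ->
      f = g && List.length ps = List.length ts && List.for_all2 matches ps ts

let () =
  (* "white cars that are not station wagons" *)
  let white_not_wagon =
    PApp ("car", [ PApp ("white", []); Anti (PApp ("stationWagon", [])) ]) in
  let any_colour_not_wagon =
    PApp ("car", [ PVar "colour"; Anti (PApp ("stationWagon", [])) ]) in
  let white_sedan = App ("car", [ App ("white", []); App ("sedan", []) ]) in
  let white_wagon = App ("car", [ App ("white", []); App ("stationWagon", []) ]) in
  assert (matches white_not_wagon white_sedan);
  assert (not (matches white_not_wagon white_wagon));
  assert (matches any_colour_not_wagon white_sedan)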

On the Stability by Union of Reducibility Candidates, Colin Riba.

We investigate some aspects of proof methods for the termination of (extensions of) the second-order lambda-calculus in the presence of union and existential types. We prove that Girard's reducibility candidates are stable by union iff they are exactly the non-empty sets of terminating terms which are downward-closed w.r.t. a weak observational preorder. We show that this is the case for the Curry-style second-order lambda-calculus. As a corollary, we obtain that reducibility candidates are exactly Tait's saturated sets that are stable by reduction. We then extend the proof to a system with product, co-product and positive iso-recursive types.

Friday 16 March, 2:00 pm, room A006
Restoring Natural Language as a Computerised Mathematics Input Method, Manuel Maarek, Heriot-Watt University, Edinburgh. Joint work with Fairouz Kamareddine, Robert Lamar and Joe Wells.

Methods for computerised mathematics have found little appeal among mathematicians because such methods call for additional skills which are not available to the typical mathematician. We herein propose to reconcile computerised mathematics with mathematicians by restoring natural language as the primary medium for mathematical authoring. The method permits the association of portions of a text with their grammatical roles. One can find in any textbook some abbreviations, such as the aggregation of equations a=b>c, which can hardly be represented directly in most computerised languages. We define some specific annotations to explicate the morphology of such natural-language style. These annotations are later expanded to obtain a representation of mathematical knowledge suitable for computations (i.e., a=b, b>c). We have named this method syntax souring. All results have been implemented in a prototype editor developed on top of TeXmacs as a GUI for MathLang, a system we have developed to check the type-correctness of these annotations. We applied this method to computerise some typical mathematical texts.

Thursday 15 March, 11:00 am, room A006
Composing Rewriting-Based Access Control Policies, Anderson Santana de Oliveira.

Term rewriting is a suitable paradigm to provide a formal semantics to flexible access control policies. In the rewriting-based model, policies are implemented by sets of rewrite rules, whose evaluation produces positive or negative authorization decisions. We generalize previous work with the introduction of strategies to support reasoning about policy composition.

Tuesday 27 February, 2:00 pm, room A006
Normalization-by-Evaluation and Dependent Types, Andreas Abel, LMU, München.

A version of Martin-Löf type theory with universes and equality judgments is introduced. The main technical devices to establish its meta-theoretical properties are a normalization-by-evaluation algorithm, a PER model and a logical relation between syntax and semantics. Central properties like the injectivity of the dependent function space constructor (a result which is needed to prove the correctness of the type-checking algorithm) are proven with these semantical tools. The method extends to systems with singleton types, thus showing decidability of equality and type-checking for a combination of universes and singleton types for the first time.
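
For readers who have not seen normalization-by-evaluation before, here is a minimal OCaml sketch for the untyped lambda calculus, far simpler than the dependently typed algorithm of the talk and only meant to convey the evaluate-then-reify idea (all names are mine):

(* Terms use de Bruijn indices; neutral values use de Bruijn levels. *)
type term =
  | Var of int                          (* de Bruijn index *)
  | Lam of term
  | App of term * term

type value =
  | VLam of (value -> value)            (* functions interpreted by OCaml functions *)
  | VNe of neutral                      (* stuck terms headed by a free variable *)
and neutral =
  | NVar of int                         (* de Bruijn level *)
  | NApp of neutral * value

let rec eval env = function
  | Var i -> List.nth env i
  | Lam b -> VLam (fun v -> eval (v :: env) b)
  | App (f, a) ->
      (match eval env f with
       | VLam g -> g (eval env a)
       | VNe n -> VNe (NApp (n, eval env a)))

(* Read a semantic value back into a beta-normal term. *)
let rec reify depth = function
  | VLam f -> Lam (reify (depth + 1) (f (VNe (NVar depth))))
  | VNe n -> reify_neutral depth n
and reify_neutral depth = function
  | NVar level -> Var (depth - level - 1)
  | NApp (n, v) -> App (reify_neutral depth n, reify depth v)

let normalise t = reify 0 (eval [] t)

let () =
  (* (fun x -> x) (fun x y -> x) normalises to fun x y -> x *)
  let id = Lam (Var 0) and k = Lam (Lam (Var 1)) in
  assert (normalise (App (id, k)) = k)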

Monday 26 February, 2:00 pm, room A006
The Rewriting Calculus as a Combinatory Reduction System, Clara Bertolissi, LIF, Marseille. Joint work with Claude Kirchner.

In a previous work, the authors showed how the semantics of Combinatory Reduction Systems (CRS) can be expressed in terms of the rewriting calculus (RHO). The converse issue is addressed here: rewriting calculus derivations are simulated by Combinatory Reduction System derivations. As a consequence of this result, important properties, like standardisation, are deduced for the rewriting calculus.

Thursday 15 February, 2:00 pm, room A006
A rewriting calculus for term-labeled multigraphs, Oana Andrei.

In the first part I will present a rewriting calculus for term-labeled multigraphs and show how we tuned it for modeling molecular complexes, biochemical reaction rules and the generation of biochemical networks. The advantage of this approach is to get from the rewriting calculus the ability to define, in the same framework, transformation rules and strategies to control rule application and network analysis. In the second part I will give an outline of the implementation in Tom of labeled multigraphs using term graphs with pointers. I will end with ongoing work on equational matching with full extension, as a generalisation of matching with extension variables, and its use in defining equational rewriting with full extension.

Tuesday 13 February, 3:00 pm, room A006
More laziness!, François-Régis Sinot, Universidade do Porto.

Launchbury's natural semantics is a tool of choice to reason about call-by-need evaluation. However, modern non-strict functional languages do not use plain call-by-need evaluation: they also use static optimisations like fully lazy lambda-lifting or partial evaluation. To ease reasoning, it would be nice to have all this in a uniform setting. In this talk, we generalise Launchbury's semantics in order to capture more laziness. More precisely, we capture "complete laziness", as coined by Holst and Gomard in 1991, which is slightly more than fully lazy sharing, and closer to on-the-fly needed partial evaluation. We discuss the connections between this and other related concepts, like optimal reduction.
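As a tiny reminder of what separates call-by-need from call-by-name (only the starting point of the talk, which goes well beyond it), here is an OCaml sketch using the standard Lazy module: the suspended computation is forced at most once and its result is shared.

(* Call-by-name: the computation is redone at each use. *)
let by_name () = print_endline "evaluated (by name)"; 42
let sum_by_name = by_name () + by_name ()                   (* prints twice *)

(* Call-by-need: the thunk is forced at most once, its result is shared. *)
let by_need = lazy (print_endline "evaluated (by need)"; 42)
let sum_by_need = Lazy.force by_need + Lazy.force by_need   (* prints once *)

let () = Printf.printf "%d %d\n" sum_by_name sum_by_need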

Tuesday 13 February, 2:30 pm, room A006
Bytecode transformation for security, Pauline Kouzmenko, École des Mines de Nancy.

Tuesday 6 February, 2:00 pm
Integrating the Text-Editor TeXmacs with the Proof Assistance System Omega using Plato, Serge Autexier.

My overall research goal is to bring formal-logic-based techniques into practice. The most recent offshoot of that research is the integration of the proof assistant OMEGA into the text editor TEXMACS. More specifically, we present a generic mediator, called PLATO, between text editors and proof assistants, and show how it served to integrate the proof assistant OMEGA into TEXMACS. The aim of this mediator is to support the development, formalisation and publication of mathematical documents as naturally as possible. In particular, PLATO should allow the user to author his mathematical documents with a scientific WYSIWYG text editor in the informal language he is used to, that is, a mixture of natural language and formulas. These documents can be semantically annotated, preserving the textual structure, by using the flexible parameterised proof language which we present. From this informal semantic representation, PLATO automatically builds up the corresponding formal representation for a proof assistant, in our case OMEGA. The most important task of the mediator is the maintenance of consistent formal and informal representations during the further development of the document in interaction with OMEGA. The presentation will include a demo of the resulting system. Finally, as time permits, I will show how different lines of my research conducted under this overall goal were joined in this work to make the integration effectively possible.

18, 23 and 25 January, 2:00 pm
Introductory course on category theory, François Lamarche.

Concretely, this mini-course will essentially deal with the relationship between syntax and algebra. I will start at a very elementary level and try to give as many examples as possible of the back-and-forth one can make between term systems and the corresponding categories. An important aspect of my approach is to give an idea of the "categorical spirit" compared with more concrete approaches. Category theory is a general and abstract theory of mathematical structures, and here the word abstraction has the same meaning as in programming: it is a system of black boxes, or interfaces, equipped with a huge toolbox for combining them in all sorts of ways. Its raison d'être is the mental economy it affords: one only opens a black box when absolutely necessary; most of the time one can work just with the interface.

9 January 2007, 2:00 pm
Security evaluation and formal verification, Quang-Huy NGUYEN, Security Labs, Gemalto.

Common Criteria (CC) is nowadays widely used for evaluating the security of IT products. The evaluation process (for security engineers) consists in providing elements such as models, proofs, documents, test cases and the code to fulfill the CC requirements with respect to the security objectives. For the formal verification community, a question arises: what is really formalized and proved in (high-level) evaluated software? On the other hand, a security engineer usually does not know how secure a formally verified piece of software is. We give in this talk some elements to answer these questions, using our work on modeling a specific implementation of the Java Card API. We aim at showing the mutually beneficial relation between formal verification and security evaluation activities.

27 November 2006, 4:00 pm
Towards a next-generation ToolBus architecture, Paul Klint.

27 November 2006, 3:15 pm
Timing Analysis and Timing Predictability, Reinhard Wilhelm.

27 November 2006, 11:00 am
Rewriting and trustworthy compilation, Antoine Reilles.

This thesis focuses on a set of rewriting-based methods dedicated to writing safe compilers and, more generally, to certifying transformations of complex documents. The subject touches several key areas of computer science such as protocol verification, theorem provers, compiler certification and rule-based programming. The starting point of this work is the compiler of the Tom language, a high-level language that introduces pattern-matching constructs into general-purpose languages: Java, C, Caml.

7 November 2006, 2:00 pm
On the implementation of construction functions for non-free concrete data types, Frédéric Blanqui.

Many algorithms use concrete data types with some additional invariants. The set of values satisfying the invariants is often a set of representatives for the equivalence classes of some equational theory. For instance, a sorted list is a particular representative w.r.t. commutativity. Theories like associativity, neutral element, idempotence, etc. are also very common. Now, when one wants to combine various invariants, it may be difficult to find good representatives and to implement and maintain the invariants efficiently. This can be achieved by using an abstract data type, or a concrete data type with private constructors and appropriate construction functions. In this paper, we recall the notion of private data type and study the existence of construction functions. We also describe a prototype, called Moca, that automatically generates efficient construction functions for the combination of common invariants. Joint work with Thérèse Hardin and Pierre Weiss.
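
A hand-written OCaml sketch of the idea (not Moca's generated code; the Multiset module and its operations are names I made up for the illustration): a private type whose construction function maintains sortedness, so that commutativity of insertion holds on the canonical representatives.

module Multiset : sig
  type t = private int list      (* private: values built only via the functions below *)
  val empty : t
  val add : int -> t -> t        (* construction function: keeps the list sorted *)
  val union : t -> t -> t
end = struct
  type t = int list
  let empty = []
  let rec add x = function
    | [] -> [ x ]
    | y :: ys -> if x <= y then x :: y :: ys else y :: add x ys
  let union s1 s2 = List.fold_left (fun acc x -> add x acc) s1 s2
end

let () =
  let open Multiset in
  let a = add 3 (add 1 empty) and b = add 1 (add 3 empty) in
  (* Commutativity of insertion holds on the canonical representatives: *)
  assert ((a :> int list) = (b :> int list));
  assert ((union a b :> int list) = [ 1; 1; 3; 3 ])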

12 October 2006, 11:00 am
Proving strong normalization for symmetric calculi, Dan Dougherty, WPI

There are a variety of "symmetric" proof-term calculi studied in the recent literature: Barbanera and Berardi's symmetric lambda calculus, Curien and Herbelin's lambda-mu-mu~ calculus, Wadler's dual calculus, and several others. Typically these arise in the study of the Curry-Howard isomorphism for classical logic. Strong normalization holds for typed terms of these systems, but the standard reducibility arguments familiar from ordinary lambda-calculus get stuck, precisely because of the symmetries which are the systems' prime motivation. Barbanera and Berardi invented a clever variation on reducibility, which works for simple types, but it is technically somewhat awkward; in particular it does not seem to apply easily to systems of intersection types. We present another technique for proving strong normalization which seems to us rather natural, gives some additional insight into reduction in these systems, and generalizes immediately from simple types to intersection types.

11 October 2006, 10:00 am
Policy-Informed Program Analyses, Dan Dougherty, WPI

Access-control policies play a central role in controlling the dissemination of sensitive data. They represent an important but not isolated example of policies or rules that govern the behavior of programs. As these policies grow in complexity, developers increasingly express them in domain-specific, declarative policy languages. These languages induce a variety of analysis problems while affording interesting opportunities. I will discuss some of these problems and opportunities, as well as our results, especially as they apply to our own production software. Joint work with Kathi Fisler, WPI and Shriram Krishnamurthi, Brown University.

10 October 2006, 2:00 pm
Reducibility Domain, Arnaud Spiwack

U. Berger significantly simplified Tait's normalisation proof for bar recursion, replacing Tait's introduction of infinite terms by the construction of a domain having the property that a term is strongly normalizing if its semantics is not bottom. The goal of this paper is to show that, using ideas from the theory of intersection types and Martin-Löf's domain interpretation of type theory, we can in turn simplify U. Berger's argument in the construction of such a domain model. We think that our domain model can be used to give modular proofs of strong normalization for various type theories. As an example, we show in some detail how it can be used to prove strong normalization for Martin-Löf dependent type theory extended with bar recursion and with some form of proof irrelevance.

9 October 2006, 2:00 pm
A Proof Monad, Florent Kirchner

We present a new approach to the formalization of instructions for procedural theorem provers. Using categories and monads, this talk will expose the denotation of a small proof language. A potential classification of instructions will also be examined.
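
Since the talk is about giving a monadic denotation to prover instructions, here is a minimal OCaml sketch of one plausible such monad (entirely my own toy model, not Florent Kirchner's semantics): state over a list of open goals, combined with failure.

type goal = string
type 'a proof = goal list -> ('a * goal list) option

let return (x : 'a) : 'a proof = fun goals -> Some (x, goals)

let bind (m : 'a proof) (f : 'a -> 'b proof) : 'b proof =
  fun goals ->
    match m goals with
    | None -> None
    | Some (x, goals') -> f x goals'

(* A primitive instruction: close the first goal if it is exactly g. *)
let exact (g : goal) : unit proof = function
  | g' :: rest when g = g' -> Some ((), rest)
  | _ -> None

let run (m : unit proof) (goals : goal list) =
  match m goals with
  | Some ((), []) -> print_endline "proof complete"
  | Some ((), _ :: _) -> print_endline "open goals remain"
  | None -> print_endline "instruction failed"

let () = run (bind (exact "A") (fun () -> exact "B")) [ "A"; "B" ]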

3 October 2006, 2:00 pm
An introduction to polygraphs, Yves Guiraud

Polygraphs provide a unified language to describe complex algebraic structures, rewriting systems and logical formalisms, but also objects coming from topology or theoretical physics. I will explain some of these examples and then discuss a few questions I am currently working on, related to rewriting, logic and, perhaps, programming.

12 September 2006, 2:00 pm
Consistency of Supernatural Deduction, Clément Houtmann

Supernatural deduction and the extensible sequent calculus are two new deduction systems introduced by Benjamin Wack and Paul Brauner, respectively. In order to ease the use of first-order theories, new inference rules are added to natural deduction or to the sequent calculus. I will then propose several methods to prove the consistency of these new systems, which is equivalent to the consistency of the underlying first-order theory.