# Program of the Workshop

## Invited Talks

### Richard Garner: *Topological and higher-dimensional aspects of type theory*

It was observed by Hofmann and Streicher in the 1990s that, in Martin-Löf's intensional type theory, it is possible to endow each type *A* with a groupoid structure, by taking as the "type of morphisms" from *a* to *b* the identity type *Id_A(a, b)*. However, it was made clear even at this time that this was not the end of the story. Indeed, the associativity and unitality axioms for this type-theoretic groupoid do not hold on-the-nose, but only up to propositional equality. Thus, if we are to obtain a genuine groupoid from a type, we must first quotient out by propositional equality.

A similar phenomenon occurs in the construction of the fundamental group of a topological space: in order that the group operation should be associative on-the-nose, one must quotient out paths by the homotopy relation. This suggests the following analogy: types are like topological spaces, and propositional equality is like the homotopy relation. Remarkably, it turns out that this analogy can be made precise.

The purpose of this talk is to give an overview of the growing body of work which exploits this connection, and to describe some of the challenges that remain.
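The groupoid operations the abstract alludes to can be written down directly in a proof assistant. The following Lean sketch (illustrative, not part of the talk) defines inverse and composition of identity proofs by path induction, and shows that associativity is recovered only as a propositional equality, not a definitional one:

```lean
-- The groupoid structure on a type A, with identity proofs as morphisms.
def inv {A : Type} {a b : A} : a = b → b = a
  | rfl => rfl

def comp {A : Type} {a b c : A} : a = b → b = c → a = c
  | rfl, rfl => rfl

-- Associativity does not hold on-the-nose for arbitrary p, q, r;
-- it is only provable, i.e. it holds up to propositional equality:
theorem comp_assoc {A : Type} {a b c d : A}
    (p : a = b) (q : b = c) (r : c = d) :
    comp (comp p q) r = comp p (comp q r) := by
  cases p; cases q; cases r; rfl
```

The proof goes by path induction (`cases`) on all three arguments, which is exactly the quotienting step the abstract describes: both sides only become equal after reducing to the reflexivity case.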

### Pawel Urzyczyn: *Decision problems in polymorphic type assignment*

This talk will survey known results (some old ones and some new ones) and open questions concerning decision problems related to various type-assignment systems. These decision problems fall into essentially two categories:

- Typability and type-checking: Given a term M, determine if M can be assigned a given type (or any type at all) in a given (or arbitrary) type environment.
- Type inhabitation (provability): Given a type t, determine if there is a term M such that M has type t (or equivalently, if t has a proof) in a given type environment.

For the simply typed lambda-calculus, these problems are decidable in polynomial time (typability) or polynomial space (inhabitation). In polymorphic calculi each of them can be undecidable; in particular, the famous theorem of Wells states that typability in system F is undecidable. Little is known about decidable subsystems (or variants) of F; for example, it is an open problem whether typability is decidable in the parametric fragment. In addition to the universal polymorphism of system F and its relatives, this survey will cover decision problems originating from Curry-style, Church-style and domain-free calculi involving other forms of polymorphism: existential, ad hoc (intersection types) and first-order (dependent types).
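To make the decidable baseline concrete, here is a minimal sketch (in Python, with hypothetical names; none of this is from the talk) of type-checking for the Church-style simply typed lambda-calculus, where abstractions carry their domain type and checking is a simple structural recursion:

```python
from dataclasses import dataclass

# Types: a single base type, and arrow types dom -> cod.
@dataclass(frozen=True)
class Base:
    pass

@dataclass(frozen=True)
class Arr:
    dom: object
    cod: object

# Terms, Church style: abstractions are annotated with their domain type.
@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    name: str
    ty: object
    body: object

@dataclass(frozen=True)
class App:
    fun: object
    arg: object

def check(env, term):
    """Return the type of `term` in environment `env`, or None if ill typed."""
    if isinstance(term, Var):
        return env.get(term.name)
    if isinstance(term, Lam):
        cod = check({**env, term.name: term.ty}, term.body)
        return Arr(term.ty, cod) if cod is not None else None
    if isinstance(term, App):
        fty = check(env, term.fun)
        aty = check(env, term.arg)
        if isinstance(fty, Arr) and aty is not None and fty.dom == aty:
            return fty.cod
        return None
```

In Curry style (no annotations) the same problem already requires unification-based inference, and for system F it becomes undecidable, which is the gap the survey explores.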

### Peter Hancock: *Entering bars infinitely often*

Infinite streams have a natural topology. I'll present a complete representation of continuous functions on streams by data-structures that behave like programs for reading streams and writing values or streams of values. Ensuring continuity requires both coinductive and inductive definitions which, on the inductive side, go back to Brouwer's bar induction. I'll sketch the extension of these ideas to kinds of infinite object more general than streams, where the use of dependent types seems essential.

Joint work with Neil Ghani (Strathclyde) and Dirk Pattinson (Imperial).
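The flavour of such a representation can be sketched as follows (an illustrative fragment with hypothetical names, not code from the talk): a stream-reading program is a well-founded tree of read requests ending in an output, so it inspects only finitely many stream elements — which is precisely continuity:

```python
from dataclasses import dataclass
from itertools import count
from typing import Callable

@dataclass
class Write:
    value: object          # the result; reading stops here

@dataclass
class Read:
    cont: Callable         # maps the next stream element to the rest

def feed(eater, stream):
    """Run a Read/Write tree against an (infinite) iterator of inputs.

    Because the Read part is well-founded (inductive), the loop always
    terminates: only finitely many elements are ever inspected.
    """
    while isinstance(eater, Read):
        eater = eater.cont(next(stream))
    return eater.value

# A program that reads two elements and writes their sum:
add2 = Read(lambda x: Read(lambda y: Write(x + y)))
```

For example, `feed(add2, count(1))` consumes exactly the first two elements of the stream 1, 2, 3, … and returns their sum. The coinductive side enters when the output is itself a stream, i.e. an infinite sequence of such well-founded reading phases.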

## Schedule

- May the 11th
  - 19:00 — *Cold dinner*

- May the 12th
  - 09:00 — *Registration*
  - 12:30 — *Lunch*
  - 14:00 — Bruno Barras — slides
    *Extending the Implicit Calculus of Constructions with Union and Subset Types*

    Abstract: This is joint work with Bruno Bernardo and Alexandre Miquel. In previous work, we have proposed a variant of the Implicit Calculus of Constructions (ICC) called ICC* that provides a nice framework for software verification. Unlike ICC, ICC* enjoys decidability of type-checking, by keeping the implicit information in proof-terms. The most interesting feature of this system is that convertibility is performed on extracted terms (ICC* terms translated to ICC terms by erasing implicit subterms), thereby adding significant flexibility for proving programs correct. In this talk, we describe how this core calculus can be extended with Sigma-types, which can be considered as a first step towards adding full-fledged inductive types. We introduce two new constructions: union types and subset types, which can be viewed as implicit Sigma-types where (respectively) the first or the second projection is implicit. But since Tatsuta's counter-example about implicit existentials applies, union types break subject-reduction. However, we show that the extension of ICC* with such constructions enjoys better syntactic properties (including subject-reduction), while capturing most of the semantics of ICC with Sigma-types.
  - 14:30 — Chantal Keller — slides
    *Importing HOL-Light into Coq: deep and shallow embeddings of higher-order logic into Coq*

    Abstract: We present work in progress that aims at realizing an interface to import HOL-Light theorems into Coq. We propose a deep embedding of higher-order logic into Coq, with data-types representing types, terms and proofs. A translation function lifts these propositions and proofs into their Coq counterparts and is proved correct. In the long run, the main challenge is to have a good representation of HOL data-types, with proofs that are as compact as possible and a rather fast translation.
  - 15:00 — Lionel Vaux — slides
    *Primitive recursion in finiteness spaces*

    Abstract: The category FIN of finiteness spaces and finitary relations introduced by Ehrhard provides a model of linear logic, which refines the purely relational one. By the usual co-Kleisli construction, it also provides a model FIN! of the simply typed lambda-calculus. Although it is known that the standard fixed-point operators are not finitary, Ehrhard proved that a limited form of recursion is available, by defining a finitary tail-recursive iteration operator. We prove that finiteness spaces can actually accommodate the standard notion of primitive recursion: the cartesian closed category FIN! admits a weak natural number object, in the sense of Thibault (1982) and Lambek-Scott (1988), which allows one to model the iterator version of Gödel's system T. We also exhibit a recursor operator using the same object. This work is carried out in the qualitative version of finiteness spaces, but it should be understood as a first step towards a quantitative semantics of primitive recursion in the algebraic version. With this goal in mind, we will discuss the relevance of our approach.
  - 15:30 — *Coffee Break*
  - 16:00 — Cody Roux — slides
    *Normalisation by set theoretic models and labelling*

    Abstract: The strong normalisation of system T is (usually) proven by a short but complex proof, which involves building a kind of realisability model. We show that this proof can be factored into three parts: the construction of a simple set theoretic model; the construction of a new system T' whose normalisation is equivalent to that of system T; and the normalisation proof of T', which uses a realisability model but is generic and of low proof complexity. We show that this method can be used to generalize a number of termination methods.
  - 16:30 — Neil Sculthorpe — slides
    *Safe Functional Reactive Programming through Dependent Types*

    Abstract: Functional Reactive Programming (FRP) is an approach to reactive programming where systems are structured as networks of functions operating on signals. FRP is based on the synchronous data-flow paradigm and supports both continuous-time and discrete-time signals (hybrid systems). What sets FRP apart from most other languages for similar applications is its support for systems with dynamic structure and for higher-order reactive constructs. Statically guaranteeing correctness properties of programs is an attractive proposition. This is true in particular for typical application domains for reactive programming such as embedded systems. To that end, many existing reactive languages have type systems or other static checks that guarantee domain-specific properties, such as feedback loops always being well-formed. However, they are limited in their capabilities to support dynamism and higher-order data-flow compared with FRP. Thus, the onus of ensuring such properties of FRP programs has so far been on the programmer, as established static techniques do not suffice. We address this concern by embedding an implementation of FRP in Agda, leveraging the Agda type system to craft a domain-specific (dependent) type system for FRP. As the implementation passes the Agda type, coverage, and termination checks, we know our type system is safe.
  - 17:00 — Denis Cousineau — slides
    *How to prove strong normalization from weak normalization: the case of minimal deduction modulo*

    Abstract: Deduction modulo is an extension of first-order predicate logic where axioms are replaced by rewrite rules and where many theories, such as arithmetic, simple type theory and some variants of set theory, can be expressed. An important question in deduction modulo is to find a condition on the theories that have the strong normalization property. In a previous paper we proposed a refinement of the notion of model for theories expressed in deduction modulo, in a way that allows one to prove not only soundness but also completeness: a theory has the strong normalization property if and only if it has a model of this form. In this talk, we will present how we can use these techniques to prove that all weakly normalizing theories expressed in minimal deduction modulo are strongly normalizing.
  - 17:45 — *Demo session*
  - 19:00 — *Dinner*

- May the 13th
  - 09:00 — Pawel Urzyczyn (invited) — slides
    *Decision problems in polymorphic type assignment*

  - 10:00 — Johannes Kanig — slides
    *Proof of Higher-Order Programs using Effect Polymorphism*

    Abstract: We present a weakest precondition calculus for an imperative, higher-order programming language, restricted to non-aliasing global references. The main feature of the calculus is effect polymorphism, i.e., higher-order functions can be specified by generalizing over the effects and specifications of their functional arguments. This allows truly modular reasoning about effectful higher-order functions. We achieve this by defining an extension of a standard higher-order logic, more suitable for reasoning about modifications of the store. The obtained proof obligations, however, are expressible in the standard logic and can be proved using standard theorem provers. We show that the necessary annotations and the obtained proof obligations are quite natural.
  - 10:30 — *Coffee Break*
  - 11:00 — Jean Duprat — slides
    *Plane geometry for pupils in Coq*

    Abstract: Euclidean geometry is formalized by giving two relations (the orientation between three points and the equidistance between four points) and three constructors: the ruler for lines, the compass for circles and the intersection for points. We then define distances and angles, and construct triangles, midpoints, midlines, and parallel and perpendicular lines. A distinction between figures and properties is made using the Set and Prop sorts of Coq. The goal is to give pupils a tool for writing proofs in geometry alongside a figure drawn with a dynamic editor like GeoGebra or Cabri-Geometre.
  - 11:30 — Enrico Tassi — slides
    *Hints in unification*

    Abstract: Several mechanisms such as Canonical Structures, Type Classes, or Pullbacks have recently been introduced with the aim of improving the power and flexibility of the type inference algorithm for interactive theorem provers. We claim that all these mechanisms are particular instances of a simpler and more general technique, consisting simply in providing suitable hints to the unification procedure underlying type inference. This allows a simple, modular and non-intrusive implementation of all the above-mentioned techniques, while opening innovative and unexpected perspectives on possible applications.
  - 12:00 — Claudio Sacerdoti Coen — slides
    *Matita NG: reduction and type-checking*

    Abstract: Matita aims to be the simplest implementation of a variant of the Calculus of (Co)Inductive Constructions. The system is currently undergoing a complete re-implementation. In this talk we present the main changes to the calculus and to the implementation of the kernel (reduction and type-checking).
  - 12:30 — *Lunch*
  - 14:00 — Andreas Abel — slides
    *Normalization by Evaluation for Impredicative Type Theory (Work in Progress)*

    Abstract: Normalization by Evaluation (NbE) is an abstract framework for computing the full normal form of lambda-terms through an interpreter, just-in-time compiler or abstract machine. Currently Coq supports normalization through computational rules (beta-delta-iota); eta equality is not supported. In this talk, I present typed NbE, which computes eta-long normal forms, and show how to construct a model of (possibly impredicative) type theory that proves the correctness of NbE. Hence, NbE can be used to decide the built-in ("definitional") equality of type theory with eta-rules. The aim of this work is to provide foundational justifications for powerful type theories with beta-eta equality, such as the Calculus of Inductive Constructions.
  - 14:30 — Paola Toto — slides
    *Algebraic setting for many-valued constructive topology*

    Abstract: Joint work with Maria Emilia Maietti and Giovanni Sambin (University of Padua). Sambin in his forthcoming book introduces a new theory, called "The Basic Picture", to put topology in constructive and predicative terms. Two key structures are needed to develop Sambin's theory and to reach his purpose: set-based lattices and overlap algebras. We want to show how these two definitions are independent of Prop, if we consider a suitable evaluation of them, and in so doing to provide their many-valued version. We prove that, considering an overlap algebra instead of Prop as the underlying structure of truth values, the original algebrization of Sambin's topological notions still works.
  - 15:00 — Ugo de Liguoro — slides
    *An interactive concept of integer and boolean to model classical arithmetic constructively*

    Abstract: (joint work with S. Berardi) We propose a realizability interpretation of a system for quantifier-free arithmetic which is equivalent to the fragment of classical arithmetic without nested quantifiers, which we call EM1-arithmetic. We interpret classical proofs as interactive learning strategies, namely as processes going through several stages of knowledge and learning by interacting with the "environment" and with each other. In this talk we give a categorical presentation of the model through the construction of a suitable monad.
  - 15:30 — *Coffee Break*
  - 16:00 — Randy Pollack — slides
    *A new canonical representation of binding*

    Abstract: By Masahiko Sato and Randy Pollack. We refine old work of McKinna and Pollack which provides a name-carrying representation of binding such that correctness of terms is a predicate on a datatype of raw expressions. What is new is that alpha-conversion is identity. The representation is "lightweight", and can be formalised in intensional constructive type theory, as well as in HOL.
  - 16:30 — Tom Hirschowitz — slides
    *Contraction-free proofs and games for Linear Logic*

    Abstract: In the standard sequent presentations of Girard's Linear Logic (LL), there are two "non-decreasing" rules, where the premises are not smaller than the conclusion, namely the cut and the contraction rules. It is a universal concern to eliminate the cut rule. We show that, using an admissible modification of the tensor rule, contractions can be eliminated, and that cuts can be simultaneously limited to a single initial occurrence. This view leads us to a consistent game model for LL with exponentials, where each play is finite. However, this model somehow collapses the whole exponential structure, e.g., it validates A -o !A. We then define another game, with an associated set of inference rules, which is again a consistent model of LL with finite plays, but which rectifies the above deficiency. Exponentials in this game follow the standard decomposition of !A as an infinitary product of tensorial powers of A.
  - 17:00 — Matthieu Sozeau — slides
    *A New Look at Generalized Rewriting in Type Theory*

    Abstract: Rewriting is an essential tool for computer-based reasoning, both automated and assisted. In a proof assistant, rewriting can be used to replace terms in arbitrary contexts, generalizing the usual equational reasoning to reasoning modulo arbitrary relations. This can be done provided the necessary proofs that functions appearing in goals are congruent with respect to specific relations. We present a new implementation of generalized rewriting in the Coq proof assistant, making essential use of the expressive power of dependent types and a recently implemented type class mechanism. The tactic improves on and generalizes previous versions by natively supporting higher-order functions (rewriting under binders), polymorphism and subrelations. We demonstrate how the type class system inspired by Haskell can provide a natural interface between the user and such tactics, making them easily extensible.
  - 17:45 — *Demo session*
  - 19:00 — *Dinner*

- May the 14th
  - 09:00 — Peter Hancock (invited) — slides
    *Entering bars infinitely often*

  - 10:00 — Ioana Pasca — slides
    *Formal verification of exact computations using Newton's method*

    Abstract: We are interested in the certification of Newton's method. We use a formalization of the convergence and stability of the method, done with the axiomatic real numbers of Coq's Standard Library, in order to validate the computation with Newton's method done with a library of exact real arithmetic based on co-inductive streams. The contribution of this work is twofold. Firstly, based on Newton's method, we design and prove correct an algorithm on streams for computing the root of a real function in a lazy manner. Secondly, we prove that rounding at each step in Newton's method still yields a convergent process, with an accurate correlation between the precision of the input and that of the result. An algorithm including rounding turns out to be much more efficient.
  - 10:30 — *Coffee Break*
  - 11:00 — Thorsten Altenkirch — slides
    *PiSigma: a core language for dependently typed programming*

    Abstract: PiSigma is a core language, hence the emphasis is on smallness, leaving out lots of the syntactic sugar which makes languages like Coq's CIC, Agda, Epigram or Cayenne actually usable. The idea is that these features can be recovered by a syntactic translation (or elaboration, as we say) as part of the compiler. Having a small core language is nice for compiler writers but also for theoretical analysis. The aim here is similar to that of FC for Haskell, but directed at dependent functional programming.
  - 11:30 — Hugo Herbelin — slides
    *Investigations into Kripke models for classical logic*

    Abstract: (joint work with Danko Ilik and Gyesik Lee) We study notions of Kripke models with exploding nodes that characterise in a direct way the underlying semantics of classical logic obtained via double-negation translation. Going back to the ordinary intuitionistic semantics, we emphasise that the negative fragment of logic is interpreted along a call-by-name semantics. As for the positive fragment, its relation with call-by-value semantics is discussed. Combining soundness and completeness for classical Kripke models provides a normalisation-by-evaluation procedure for classical predicate logic. The call-by-name evaluation procedure has been formalised in Coq.
  - 12:00 — Michael Lienhardt — slides
    *Typing Component-Based Communication Systems*

    Abstract: We present Dream Types, a type system for component-based communication systems, used to detect message manipulation errors in component assemblages which may present complex data flows due to the presence of routers and multiplexers. This type system enjoys full type inference, and has been implemented for two component frameworks, Dream and Click. This work was done together with Alan Schmitt and Jean-Bernard Stefani.
  - 12:30 — *Lunch*
  - 14:00 — *Easy mountain walk*
  - 17:00 — *Business meeting*
  - 19:00 — *Dinner*

- May the 15th
  - 09:00 — Richard Garner (invited) — slides
    *Topological and higher-dimensional aspects of type theory*

  - 10:00 — Zhaohui Luo — slides
    *Dependent Record Types Revisited*

    Abstract: Dependently-typed records have been studied in type theory in several previous research attempts, with applications to the study of module mechanisms for both programming and proof languages. In this talk, arguing that the previous formulations have either limitations or weaknesses, I present an improved formulation of dependent record types, compare it with previous ones and consider applications. In particular, we show that record types (but not record kinds) can be used to express module types in data refinement, and that record types (but not product types) can be combined with structural subtyping to faithfully represent dot-types in linguistic semantics.
  - 10:30 — *Coffee Break*
  - 11:00 — Nils Anders Danielsson — slides
    *Mixing Induction and Coinduction*

    Abstract: Few people seem to be aware of the utility of mixing induction and coinduction in the same definition, as in νX.μY.…. I will discuss some types which are defined using this technique. The common theme is that, while values of these types can be "infinite", the infinity is not unrestricted. For instance, when defining subtyping for recursive types it is natural to let the arrow rule be coinductive, reflecting the fact that recursive types can (conceptually) be infinite. However, the transitivity rule should be inductive, because a coinductive reading of transitivity does not make sense.
  - 11:30 — Clement Houtmann — slides
    *Cut-elimination Canonicity for Classical Sequent Calculus*

    Abstract: Classical sequent calculi are deduction systems that present many advantages: they are usable, extendible, and adaptable into a wide range of automatic proof search procedures. These systems are also of interest to programmers through formulae-as-types analogies such as Griffin's correspondence between classical logic and continuation-passing-style programming. In formal logic, they are the foundation of a large number of existing extensions (focusing, tableaux, the inverse method, definitional reflection, deduction modulo, superdeduction) which already provide substantial advances to the deduction machinery. As we wish to propose new improvements, a careful study of the classical sequent calculus paradigm is crucial. Nevertheless, the flexibility of classical sequent calculi is also reflected in the wide range of proof-term languages and cut-elimination procedures (Herbelin's lambda-bar-mu-mu-tilde calculus or Urban's cut-elimination, for instance) that one can imagine for such systems. Careless decisions here can easily lead to the loss of important properties such as confluence or strong normalisation. In this talk we propose a new proof-term language for the classical sequent calculus which aims at containing only the information necessary for rebuilding a proof. We present a cut-elimination procedure and recover both strong normalisation and confluence without using any restricting reduction strategy. Finally, we demonstrate how this work leads to the definition of enhanced frameworks, designed for logicians as well as programmers, based on classical sequent calculi.
  - 12:00 — Dulma Rodriguez — slides
    *Efficient Type-Checking for Amortised Heap-Space Analysis*

    Abstract: The prediction of resource consumption in programs has gained interest in recent years. It is important for a number of areas, notably embedded systems and safety-critical systems. Different approaches to achieving bounded resource consumption have been analysed. One of them, based on the combination of amortized complexity analysis with type-based techniques, was studied by Hofmann and Jost in 2006 for a Java-like language. In this talk we describe efficient automated type-checking for a finite, annotated version of that system. We prove soundness and completeness of the type-checking algorithm and show its efficiency.
  - 12:30 — *Lunch*
  - 14:00 — Damien Pous — slides
    *Typing and Untyping Various Algebras*

    Abstract: Algebras sometimes need to be typed. For example, matrices over a ring form a ring, but the product is only a partial operation: dimensions have to agree. Therefore, an easy way to look at matrices algebraically is to consider "typed rings". We will show some untyping theorems: in some algebras (rings, Kleene algebras, residuated semi-rings), types can be erased/recovered in valid equalities. As a consequence, the corresponding decision procedures can easily be extended to the typed setting.
  - 14:30 — Francois Garillot — slides
    *Working effectively with generic subgroups*

    Abstract: Group algebra developments frequently involve studying the properties of some generic subgroup in a specific context. However, those objects all share some structural properties that, to our knowledge, the mathematical literature still treats in an ad hoc manner. Our participation in an ongoing, large-scale formalization of finite group algebra using the Coq theorem prover drove us to proceed more efficiently. Using a hierarchy of canonical structures, we were able to relate the definition of these subgroups as polymorphic functions with their functoriality, and exposed the group-theoretic properties verified by classes of such subgroups. Our contribution thus presents an example where our intuition about "free theorems" on Coq terms helps us structure proofs on the mathematical objects they represent.
  - 15:00 — *Coffee Break*
  - 15:30 — Micaela Mayero — slides
    *Certification of Numerical Analysis Programs*

    Abstract: I will present a project which aims at developing and applying methods that allow us to formally prove the soundness of programs from numerical analysis. In particular, we are interested in programs which often appear in the resolution of critical problems. Many critical programs come from numerical analysis, but few people have ever tried to apply formal methods to this kind of program. The main reason is that real (floating-point) numbers are heavily used in numerical programs, whereas formal methods tend to handle integers or, more generally, discrete structures. I will present our experimentation on a case study. It consists in proving the correctness of a one-dimensional wave equation program (written in C). We have to check two kinds of errors: the method error and the floating-point error. To deal with the first one, it is necessary to formalise the problem (in fact the algorithm) and the error in a proof system (in our case, Coq). To deal with the second one, we have to work on the C program itself, and we use Caduceus. The C program has to be annotated with respect to the properties we want preserved. These annotations generate proof obligations for several proof systems. The methods developed in this project may be applied to a large variety of problems in environments intended to produce numerical code which is safe and correct with respect to its formal specification.
  - 16:00 — Barbara Petit — slides
    *A lambda-calculus with pattern matching and polymorphism*

    Abstract: The lambda-calculus with constructors of Arbiser, Miquel and Rios is an extension of the lambda-calculus that provides a pattern-matching mechanism on variadic constructors. The main specificity of this calculus is to decompose ML-style pattern matching into a case (switch) analysis on constants and a (quite surprising) commutation rule. We propose a polymorphic type system for this calculus, with unions and intersections, and we prove its strong normalisation using the standard reducibility method extended to data structures.

## Demos

- Humayoun Muhammad
  *Mathematical Proofs in Natural Languages (MathNat)*

  Abstract: *MathNat* is a project for developing tools to enable automatic formalisation of mathematical text.

  We proceed in two steps:

  - *Syntax:* We develop a controlled language for mathematical text, mainly proofs, using a type-theoretic approach. This is a subset of natural language.
  - *Semantics:* We translate this controlled language into an abstract mathematical language (MathAbs). MathAbs is a system-independent formal language for writing mathematics. We also resolve some basic anaphora in this step.

- Vincent Laporte
  *Coq-GeoGebra interface*

  Abstract: This is a tool whose aim is to build and display a dynamic figure while a theorem of Euclidean geometry is proved. Based on Jean Duprat's formalization of this geometry in Coq, this tool is able to construct a figure from a proof context. Thanks to GeoGebra's capabilities, the user is allowed to change the figure him- or herself. Therefore this tool is also able to check a figure against the current context. Besides, the figure evolves according to the changes occurring in the proof context as the proof goes on. This interface has been designed to be extensible: it may easily take into account modifications of the formalization. Moreover, its use need not be limited to this particular formalization of plane geometry.

- Christophe Raffalli
  *PML*

  Abstract: I shall present the current status of PML development, notably the termination test implemented by Pierre Hyvernat.

## Social Events

- Volunteers may take an easy mountain walk in the afternoon of the 14th.
- The dinner on the 14th will feature some regional dishes.