They are all closely related, and some but not all are precisely equivalent. I will survey the relationships between them, and their implications for modelling type theory. This is in part a report on work in progress with Benedikt Ahrens and Vladimir Voevodsky, in which we formalise some of the comparisons in question.
Abstract: Constructive mathematics is often equated with computation. But perhaps a better view, compatible with computation, is that constructive mathematics is about information: information we give, and information that is offered to us. In classical mathematics what matters about a theorem is its truth, whereas in constructive mathematics what matters is how much information its formulation and proof give.
A particular case of existence is disjunction. But both classical and constructive mathematicians still want to know which one is the case. An answer to the second question gives more information. In constructive systems such as type theory, it is possible to express mathematically the difference between the information content of the first and second situations. In this talk I want to analyse, mathematically rather than meta-mathematically or philosophically, the notion of existence in terms of information content. I will perform this mathematical analysis in Martin-Löf type theory and some univalent extensions.
March 9 Abstract: Strong collection and subset collection are two axioms of constructive set theory. They are distinct from other set-theoretic axioms in that they assert the existence of certain sets but do not characterise these sets precisely. I will in this talk discuss these axioms in the context of a model where the interpretation of equality is the identity type.
As it turns out, it is instructive to first consider the equivalents of these axioms for multisets. February 24 Abstract: Every dependent first-order signature Sigma generates a free category with families F Sigma. Such a morphism is uniquely determined by its values on the signature. We show that such morphisms can be constructed incrementally by induction on the signature. This shows that this functorial notion of model extends the usual non-dependent notion of model of a first-order signature. February 17 The extended TTAR has potential for more efficient computational semantics of formal and natural languages.
The canonical forms of the terms determine the algorithms that compute their semantic denotations, and, in addition, the relation of algorithmically referential synonymy between TTAR-terms. The lambda-rule, which is the most important rule of the reduction calculus of TTAR, strictly preserves the algorithmic structure of the reduced terms. However, in its original, general formulation, the lambda-rule may result in superfluous lambda abstractions in some parts of the terms.
In the second part of the talk, I introduce the gamma-rule, which extends the reduction calculus of TTAR and its referential synonymy to gamma-reduction calculus and gamma-synonymy, respectively. The gamma-rule is useful for simplifying terms. It reduces superfluous lambda-abstraction and corresponding functional applications in the terms, while retaining the major algorithmic structure of the terms.
February 3 Abstract: We review basics of dynamic-epistemic logics for agent attitudes of knowledge and belief, plus stepwise effects of events leading to information and belief change. Finally, we discuss a total logical picture of how information functions. References: J. January 20 The two interpretations of natural deduction: how do they fit together? FOLDS equivalences as the basis of a general theory of identity for concepts in higher dimensional categories, part 2. FOLDS equivalences as the basis of a general theory of identity for concepts in higher dimensional categories, part 1.
Abstract for parts 1 and 2: 1. The theory of abstract sets is based on a formal language that is given within FOLDS, first-order logic with dependent sorts. Abstract set theory was introduced in an informal manner by F. W. Lawvere, who used his informal but very informative description of abstract set theory for the purposes of topos theory. In , and then in a more detailed manner in , I introduced the formalization I will talk about here. It is more directly set-theoretical than topos theory, which is based on the language of categories.
The main result of my work is that in the new formalization, the structuralist imperative becomes a provable general fact, in the form that any concept formulated in the dependent-typed language of abstract sets is invariant under isomorphism. I will use the opportunity of abstract sets to introduce, albeit only in an informal way, the general syntax of FOLDS. The syntax is explained both symbolic-logically, and by using tools of categorical logic. My main application of FOLDS is a statement of the universe of the so-called multitopic categories; see The Multitopic Category of All Multitopic Categories on my web-site, in both the original and the corrected versions.
In the talk, I will necessarily be rather informal about the subject, with suggestive examples rather than precise general definitions. November 25 Abstract: We study the extent to which logical consequence relations in a language L determine the meaning of the logical constants of L. Carnap asked this question in about the connectives in classical propositional logic and implicitly answered it. In , we extended this result to first-order logic, and to the framework of possible worlds semantics. In this talk, which is about work in progress, I consider the same question for some intensional logics.
First, there are two simple observations on the interpretation of the connectives in intuitionistic propositional logic, in the setting of Kripke semantics. Then I focus on modal logic. Some results and some open issues will be presented. This is joint work with Denis Bonnay. November 18 Abstract: The original Bezem-Coquand-Huber cubical set model promised to give a constructive model of homotopy type theory. However, in its original form there was a notable shortcoming: one of the definitional equalities usually included in type theory, the J computation rule, was absent.
One way to fix this is to use an alternative definition of identity type in which we keep track more carefully of degenerate paths. The new identity type has a nice presentation in the setting of algebraic model structures. To model identity types, what we need are very good path objects: a factorisation of each diagonal as a trivial cofibration followed by a fibration.
In this way we get a constructive proof that cubical sets model intensional type theory including all definitional equalities, dealing with coherence issues directly without using universes or local universes and retaining any propositions from the original BCH model, including univalence. October 28 Abstract: The realization that dependent type theory has a higher categorical interpretation where types are coherent higher groupoids has been a major development in the field. However, while internally types exhibit coherent structure provided by their identity types, type theory itself still lacks any mechanism for describing more general coherent objects.
In particular, it has no means of describing higher equivalence relations. Much work has focused on the search for a reasonable internal notion of simplicial types to remedy this problem, but to date no simple solution has been found. In this talk, motivated by recent progress in formalizing the Baez-Dolan opetopic definition of higher categories in type theory, I will present an alternative approach to coherence based on opetopes.
The theory of opetopes is based on the theory of polynomial functors, also known as W-types or containers in the type theory literature, and thus fits naturally with concepts already familiar from logic and computer science. I will sketch a type theory extended with syntax for manipulating opetopic expressions and explain how this leads to a quite general logical theory of coherence. October 14 I will also lay out some more abstract nonsense about categories of contextual categories, and use this to give applications of the span-equivalences model to the "homotopy theory of homotopy type theory".
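Since containers (polynomial functors) and their W-types may be unfamiliar, here is a minimal sketch of the idea in Python. This is our illustration, not code from the talk; all names are ours. A container is a set of shapes, each with an arity (a set of positions), and its W-type is the set of well-founded trees built from those shapes.

```python
# A container (polynomial functor) is given by a set of shapes, each with
# an arity. Its W-type is the set of well-founded trees over those shapes.

from dataclasses import dataclass

# Container presenting the natural numbers: "zero" has no positions,
# "succ" has exactly one.
NAT = {"zero": 0, "succ": 1}

@dataclass(frozen=True)
class W:
    shape: str
    subtrees: tuple  # one subtree per position of the shape

def sup(container, shape, subtrees):
    """The W-type introduction rule: build a tree, checking the arity."""
    assert len(subtrees) == container[shape]
    return W(shape, tuple(subtrees))

def to_int(tree):
    """Recursion on W: interpret trees over NAT as natural numbers."""
    return 0 if tree.shape == "zero" else 1 + to_int(tree.subtrees[0])

zero = sup(NAT, "zero", [])
two = sup(NAT, "succ", [sup(NAT, "succ", [zero])])
print(to_int(two))  # 2
```

Changing the shape/arity dictionary yields other familiar inductive types (e.g. binary trees via a shape of arity 2), which is what makes containers a convenient common currency between logic and computer science.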
October 7 Abstract: (joint work with Chris Kapulkin) I will present a new model of type theory in "span-equivalences" over a given model, based on the model in spans (or basic pairs) as given by Simone Tonelli and by Mike Shulman. This "span-equivalences" model provides a useful technical tool for reasoning about equivalences between models of type theories.
I will describe at least two applications: firstly, that once the interpretation of contexts, types, and terms has been fixed, the interpretation of the rest of type theory is determined up to equivalence; and secondly, a presentation of the "homotopy theory of models of type theory". This is work in progress, and suggestions of further applications are very welcome! September 16 Note unusual location!
In this talk I will present a logical framework for multi-agent visual-epistemic reasoning, where each agent receives visual information from the environment via a mobile camera with a given angle of vision in the plane. I will introduce suitable logical languages for formalising such reasoning, involving atomic formulae stating what agents can see, multi-agent epistemic operators, as well as dynamic operators reflecting the ability of agents (or their cameras) to move and turn around. I will then introduce several different types of models for these languages and will discuss their expressiveness and some essential validities.
Lastly, I will discuss some basic model-theoretic problems arising in this framework that open up new directions of study, relating logic, geometry and graph theory. This talk is partly based on a recent joint work with Olivier Gasquet Univ. September 9 Abstract: I will spend the first part of the talk on an introduction to interval analysis for general maths audience. In the second part of the talk I will introduce an arithmetic with planar binary trees that are used to represent maps and show how this is used to solve some concrete real-world problems, including nonparametric density estimation, dynamic air-traffic representation, tighter range enclosures of interval-valued functions that can be expressed via finitely many real arithmetic operations, and simulation from challenging densities in up to 10 dimensions.
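The basic idea of interval analysis mentioned in the first part can be sketched in a few lines. This is our toy illustration, not the speaker's arithmetic of planar binary trees: an interval [lo, hi] encloses every real it could represent, and each operation returns an interval guaranteed to enclose all pointwise results.

```python
# Minimal interval arithmetic: each operation returns an enclosure of all
# possible pointwise results of the operation on the input intervals.

class Interval:
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product can attain its extremes at any endpoint combination.
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(-1.0, 2.0)
y = Interval(3.0, 4.0)
print(x + y)  # [2.0, 6.0]
print(x * y)  # [-4.0, 8.0]
```

A production implementation would additionally round endpoints outward in floating point; the tighter range enclosures mentioned in the abstract are about combating the overestimation that naive enclosures like the one above introduce.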
September 2 Progress report on formalizing categories with attributes in type theory, part 2. Abstract: This talk is a follow-up to the talks by Peter Lumsdaine and myself, respectively, last term on this subject. A complete formalization in Coq of bounded categories with attributes and structures for type theory with one universe has since then been achieved.
We indicate how a setoid model of type theory is constructed via the use of a formalized model of CZF. August 26 Constructive provability versus uniform provability in classical computable mathematics. Abstract: So-called elementary analysis (EL) is two-sorted intuitionistic arithmetic, which serves as a system to formalize constructive mathematics. It is remarkable that all of our proofs are constructive; namely, they are just explicit syntactic translations.
Friday, July 3 Motivations include constructive reverse mathematics and eventually also a weak homotopy type theory. Spring June 3 Progress report on formalizing categories with attributes in type theory. May 27 Abstract: Let T be a dependent type theory admitting the rules for Pi-, Sigma-, and Id- types, and let C T be its syntactic category, that is, the category whose objects are contexts and whose morphisms are sequences of terms.
This category is naturally equipped with a class of syntactically defined weak equivalences and as such presents some quasicategory. It is then natural to ask what we can say about this quasicategory. After explaining the necessary background, I will show a proof that the quasicategories arising this way are locally cartesian closed.
May 20 Abstract: In this talk I will present the second half of my thesis, which relies on classical assumptions and is meant to be a counterpart for the constructive content of the first half. We will define formal systems for infinitary intuitionistic logics and prove completeness theorems in terms of a natural infinitary Kripke semantics, both in the propositional case and the first-order case.
The metatheory will be ZFC plus the existence of weakly compact cardinals, a large cardinal assumption that will also be proved to be necessary for the completeness results to hold. We will also review some applications and consequences of these results. May 6 Abstract: Two notable conservativity results in dependent type theory are due to Martin Hofmann: the conservativity of the logical framework presentation of a type theory, and the conservativity of extensional type theory over intensional type theory with extensionality axioms.
I will discuss these two theorems, and the general status of conservativity results in dependent type theory. This will be an entirely expository talk. April 29 Abstract: In this work-in-progress talk, I will analyse the cubical model of homotopy type theory of Coquand et al. The basic category of cubical sets used is presheaves on the free cartesian category on a bipointed object. The presheaf category is the classifying topos for strictly bipointed objects. April 22 Since then the consistency of NF has been an open problem. And the problem remains open today.
But there has been considerable progress in our understanding of the problem. But the working manuscripts available on his web page that describe his possible proofs are not easy to understand - at least not by me. April 15 Abstract: Advances in paraconsistent logics have begun attracting the attention of the mathematics community. Motivations for the development of these logics are wide-ranging: expressiveness of language; a more principled approach to implication; robustness of formal theories in the face of local contradiction; founding naive intuitions; and more.
In this accessible survey talk, we outline what paraconsistency is, why one might use it to do mathematics, discuss some of the triumphs of and challenges to doing mathematics paraconsistently, and present some recent results in both foundations and applications. Applying such logics within mathematics gives insight into the nature of proof, teases apart some subtleties that are not often recognized, and gives new responses to old problems. It will turn out that despite the relative weakness of these logics, long chains of mathematical reasoning can be carried out.
Moreover, in handling contradictions more carefully, paraconsistent proofs often bring out subtle differences between proofs that are often overlooked: not all proofs are created equal, even when they establish the same truth or falsity! March 11 Abstract: We give an analysis of some long-established constructive completeness results in terms of categorical logic and presheaf and sheaf semantics, and add some new results in this area which flow from the analysis.
Several completeness theorems are derived without the assumption that equality of non-logical symbols in the language can be decided. February 25 Progress report: formalising semantics of type theory in type theory. Abstract: In recent work with Eyvind Briseid and Pavol Safarik we defined functional interpretations for systems for nonstandard arithmetic.
We used these interpretations to prove conservation and term extraction results. In the talk I will explain how the nonstandard functional interpretation could have been found without aiming for a proof-theoretic analysis of nonstandard systems, but rather by modifying certain ideas for refined term extraction.
I will also discuss recent developments. January 14 Douglas S. Abstract: In the northern autumn of , I came across A. Such a development would seem to be particularly suitable for proof-checking and for the extraction of programs from proofs. Chapter 1 of my D. After that, despite a brief foray into CMST for a conference paper in , my plan to develop the set theory in greater depth was shelved until taken up again late last year. Abstract: The semantics of possible worlds for intuitionistic logic gives rise to Kripke models and a variant, Beth models. Although both semantics can be shown to be complete, we will see that the latter has advantages over the former, in the sense that with a weaker metatheory it describes a wider variety of categorical models.
In particular, we will discuss how to build, for every model in a Grothendieck topos, an elementarily equivalent Beth model, as well as the constructive aspects and applications of this. Abstract: The idea of Universal Grammar as the hypothetical linguistic structure shared by all human languages harkens back at least to the 13th century.
The best known modern elaborations of the idea are due to Chomsky. Following devastating critiques from theoretical (e.g. Jackendoff) and typological (e.g. Everett) linguistics, these elaborations, the idea of Universal Grammar itself, and the more general idea of language universals stand untenable and are largely abandoned. The talk will show how to tackle the hypothetical structure of Universal Grammar using dependent type theory, in a framework very different from the Chomskyan ones.
Abstract: Coherence constructions are a vexing technical hurdle which most models of dependent type theory, especially homotopical ones, have to tackle in some way. Abstract: In this talk we first briefly survey recent research on the complexity and expressivity of weak fragments of first-order logic. Such fragments include, e.g., the two-variable fragment.
We then discuss the recently introduced uniform one-dimensional fragment (UF1), which generalizes the standard two-variable fragment in a way that leads to the possibility of defining non-trivial properties of relations of arbitrary arities. The work presented is joint work with Emanuel Kieronski and Lauri Hella. The usual equations characteristic of games follow from the NbE construction without reference to the game-theoretic machinery.
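The general shape of normalisation by evaluation (NbE) can be sketched briefly. The following is our own Python rendering for the plain untyped lambda calculus, not the talk's program for innocent strategies: terms are evaluated into semantic values (host-language closures, or neutral terms blocked on variables), and normal forms are then read back out.

```python
# Normalisation by evaluation for the untyped lambda calculus.
# Terms are tuples: ("var", name) | ("lam", name, body) | ("app", f, a).

import itertools

fresh = (f"x{i}" for i in itertools.count())  # supply of fresh variable names

def evaluate(term, env):
    tag = term[0]
    if tag == "var":
        return env[term[1]]
    if tag == "lam":
        _, x, body = term
        return lambda v: evaluate(body, {**env, x: v})  # semantic function
    _, f, a = term
    fv, av = evaluate(f, env), evaluate(a, env)
    # Apply if we have a real function; otherwise build a neutral application.
    return fv(av) if callable(fv) else ("app", fv, av)

def reify(value):
    """Read a semantic value back into a normal-form term."""
    if callable(value):
        x = next(fresh)
        return ("lam", x, reify(value(("var", x))))
    if value[0] == "app":
        return ("app", reify(value[1]), reify(value[2]))
    return value  # a neutral variable

def normalise(term):
    return reify(evaluate(term, {}))

# (\x. x) (\y. y) normalises to an identity lambda:
print(normalise(("app", ("lam", "x", ("var", "x")), ("lam", "y", ("var", "y")))))
```

The point relevant to the abstract is that beta-equalities hold automatically of the semantic values, so the equational theory "falls out" of evaluation and readback rather than being imposed by rewriting.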
As an illustration, we give a Haskell program computing the application of innocent strategies. On the theories of almost sure validities in the finite in some fragments of monadic second-order logic. Abstract: This work stems from the well-known 0-1 law for the asymptotic probabilities of first-order definable properties of finite graphs (in general, relational structures). Both the transfer theorem and the 0-1 law hold in some non-trivial extensions of first-order logic.
The main problem of this study is to characterise, axiomatically or model-theoretically, the set of formulae of MSO that are almost surely valid in the finite. The set of almost sure validities in the finite of any given logical language (where truth on finite structures is well-defined) is a well-defined logical theory, containing all logical validities of that language and closed under all sound finitary rules of inference. Beyond that, little is known about these theories in cases where the transfer theorem fails.
The talk will begin with a brief introduction to asymptotic probabilities and almost surely true properties of finite graphs, the laws for first-order logic and in some extensions of it, and their relationship with the respective logical theories of infinite random graphs. Identifying explicitly the set of target finite graphs that generate almost surely valid characteristic formulae seems a quite challenging problem, to which we so far only provide some partial answers and conjectures. Abstract: The lambda-Pi calculus modulo is an extension of the lambda-Pi calculus with rewrite rules.
Using the Curry-Howard correspondence, it can be used as a logical framework to define logics and express proofs in those logics in a way that preserves the reduction semantics. I will show how we embed various theories, such as the calculus of constructions and simple type theory, in the lambda-Pi calculus modulo rewriting, and consider the soundness and completeness of these embeddings. Abstract: The continuum hypothesis, CH, asserts that an infinite set of reals is either countable or of the same size as the entire continuum (so nothing exists in between). With the work of Cohen in the 1960s, many people were convinced that CH was now a settled problem, as all essential facts about it were deeply understood.
This situation, however, did not convince everybody. The last 20 years or so have seen a resurgence of the problem in modern set theory, in particular in the work of Woodin. In the first part of the talk, I plan to survey the most important known facts about CH. This will be followed by relating some of the modern attempts at finding new axioms to settle CH. Their justifications are partly based on technical results but also tend to be highly speculative and often come enmeshed with essentially philosophical arguments.
He has advocated an account of the set-theoretic universe which allows him to distinguish between definite and indefinite mathematical problems. His way of formally regimenting this informal distinction is by employing intuitionistic logic for domains for which one is a potentialist and reserving classical logic for domains for which one is an actualist. This framework allowed him to state a precise conjecture about CH, which has now been proved. I plan to indicate a rough sketch of the proof.
The categorical setting also highlights a close connection between Paulin-Mohring rules for the identity type and the identity type w. Some familiarity with dependent type theory and basic homotopy theory will be assumed. Abstract: We will explain how abstract elementary classes are situated among accessible categories and how model theory can be extended from the former to the latter. In particular, we will deal with categoricity, saturation, Galois types and tameness. Abstract: The main stream in machine translation is to build systems that are able to translate everything, but without any guarantees of quality.
An alternative to this is systems that aim at precision but have limited coverage. Combining wide coverage with high precision is considered unrealistic. Most wide-coverage systems are based on statistics, whereas precision-oriented domain-specific systems are typically based on grammars, which guarantee translation quality by some kind of formal semantics. This talk introduces a technique that combines wide coverage with high precision, by embedding a high-precision semantic grammar inside a wide-coverage syntactic grammar, which in turn is backed up by a chunking grammar.
The levels of confidence can be indicated by using colours, whence the title of the talk. The talk will explain the main ideas in this technique, based on GF (Grammatical Framework) and also inspired by statistical methods (probabilistic grammars) and the Apertium system (chunk-based translation), boosted by freely available dictionaries (WordNet, Wiktionary), and built by a community of over 50 active developers.
The current system covers 11 languages and is available both as a web service and as an Android application. Abstract: What is Homotopy Type Theory? Many things! It can be seen (non-exhaustively) as: various new homotopy-theoretic models of traditional dependent type theory; a complex of new concepts and definitions within traditional dependent type theory, motivated by these models; new axioms (univalence, HITs), motivated by these models; and the use of DTT with these new axioms as a foundation for mathematics.
I will survey all of these, with a focus on the second point: concepts such as truncatedness and connectedness, which are motivated by homotopy-theoretic models, but can already be expressed and developed in plain dependent type theory. In this talk, I will assume some familiarity with dependent type theory. Spring 11 June Of course, we want this language to have a rich notion of data structure as well. Since both types are formed inductively, we call such definitions inductive-inductive definitions.
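The classic inductive-inductive example is the simultaneous definition of contexts and of types over those contexts. The following Python rendering is only our rough approximation: Python cannot express the genuine indexing of one inductive family by another, so the dependency "a type lives over a context" is modelled by carrying the context in each type and checking it at construction.

```python
# A rough approximation of the inductive-inductive Ctx/Ty example:
# contexts and types are mutually recursive, and every type records the
# context over which it is formed.

from dataclasses import dataclass

@dataclass(frozen=True)
class Empty:        # the empty context
    pass

@dataclass(frozen=True)
class Extend:       # context extension by a type formed over that context
    ctx: object
    ty: object

    def __post_init__(self):
        assert self.ty.ctx == self.ctx  # the extending type must live over ctx

@dataclass(frozen=True)
class Base:         # a base type, formed over any context
    ctx: object

@dataclass(frozen=True)
class Pi:           # Pi type: domain over ctx, codomain over ctx extended by it
    ctx: object
    dom: object
    cod: object

    def __post_init__(self):
        assert self.dom.ctx == self.ctx
        assert self.cod.ctx == Extend(self.ctx, self.dom)

empty = Empty()
gamma = Extend(empty, Base(empty))        # a one-entry context
b = Base(gamma)
pi = Pi(gamma, b, Base(Extend(gamma, b))) # Pi over gamma, codomain over gamma.b
```

In a theory supporting inductive-inductive definitions, these runtime checks become typing constraints enforced by the formation rules themselves.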
Examples of inductive-inductive definitions will be given. Abstract: We present a soundness theorem for a dependent type theory with context constants with respect to an indexed category of finite, abstract simplicial complexes. From a computer science perspective, the interesting point is that this category can be seen to represent tables in a natural way.
Thus the category is a model for databases, a single mathematical structure in which all database schemas and instances of a suitable, but sufficiently general form are represented. The type theory then allows for the specification of database schemas and instances, the manipulation of the same with the usual type-theoretic operations, and the posing of queries.
This is joint work with David I. Spivak (MIT). Abstract: We propose a system for the interpretation of anaphoric relationships between unbound pronouns and quantifiers. The main technical contribution of our proposal consists in combining generalized quantifiers with dependent types. Abstract: Monads, Lawvere theories, and operads provide different means to define algebraic structures in categories. Under some natural assumptions they present the same algebraic structures in the category Set.
Symmetric operads were originally introduced to study the geometry of loop spaces, whereas analytic and polynomial monads were introduced to study enumerative combinatorics. They have been applied with success to combinatorial problems related to higher-dimensional categories. In my talk I will discuss the subcategories of the categories of monads on Set, Lawvere theories, and operads in Set that correspond to various classes of equational theories relevant for combinatorics and geometry.
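The monad/theory correspondence in Set can be illustrated with the simplest example, which we sketch here as our own illustration (not code from the talk): the free-monoid monad on Set is the list monad, and its Eilenberg-Moore algebras are exactly monoids.

```python
# The free-monoid monad on Set is the list monad: unit wraps an element,
# and multiplication flattens one layer of nesting.

def unit(x):
    return [x]

def mult(xss):
    """Monad multiplication: flatten one layer of lists."""
    return [x for xs in xss for x in xs]

# An Eilenberg-Moore algebra for the list monad is a structure map
# alpha : List(A) -> A satisfying the algebra laws; such maps correspond
# to monoid structures on A. For A = int, summation is one such algebra,
# corresponding to the monoid (int, +, 0).
def alpha(xs):
    return sum(xs)

# Algebra laws, checked on samples:
assert alpha(unit(5)) == 5                                 # alpha . unit = id
xss = [[1, 2], [3], []]
assert alpha(mult(xss)) == alpha([alpha(xs) for xs in xss])  # associativity square
```

The Lawvere-theory and operad presentations carve out the same structures differently (by operations and equations, or by abstract n-ary operations), which is what makes comparing their subcategories meaningful.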
This raises the question of how far one can get in proving completeness constructively but without using the fan theorem. We show that the disjunction-free fragment is constructively complete without appeal to the fan theorem, and also without placing restrictions on the decidability of the theories and the size of the language. Along the way we show that the completeness of Kripke semantics for the disjunction-free fragment is equivalent, over IZF, to the Law of Excluded Middle. Abstract: Dependently typed (or sorted) first-order logics were introduced and studied by M.
Makkai, P. Aczel and N. Gambino, and J. Belo. Belo gave a completeness theorem for an intuitionistic version of such a logic with respect to Kripke semantics. In this talk we consider a more general semantics based on categories with families. This seminar is a continuation from September. Abstract: Multisets, like sets, consist of elements, and the order of appearance of these elements is irrelevant. What distinguishes multisets from sets is the fact that the number of occurrences of an element matters. First, I will present a technical result on the identity types of W-types in type theory without the Univalence Axiom.
I will also present an axiomatic approach to multisets - based on a "translation" of the axioms of CZF. Abstract: Simplicial sets give, classically, a wonderful model for homotopy theory, and a very satisfying interpretation of type theory. In constructive settings, however, this theory breaks down in several ways.
These are in some ways harder to work with than simplicial sets, but constructively, they seem much better-behaved. I will show how to construct (classically) a right semi-model structure on semi-simplicial sets, and a resulting model of type theory; and I will discuss the issues involved in attempting to constructivise these results. In this paper I show that there is no genuine paradox of logical validity. Along the way a number of rather important, rather more general, lessons arise, including: (i) whether or not an operator is logical depends not only on what content that operator expresses, but also on the way that it expresses that content.
As a result, there is no paradox of logical validity. More importantly, however, these observations lead to a number of novel, and important, insights into the nature of validity itself. The completeness of Kripke semantics in constructive reverse mathematics. Abstract: We will consider in intuitionistic set theory the completeness theorem with respect to Kripke semantics and analyze its strength from the point of view of constructive reverse mathematics. We will prove that, over intuitionistic Zermelo-Fraenkel set theory IZF, the strong completeness of the negative fragment of intuitionistic first order logic is equivalent, in the sense of interderivability, to all instances of the law of the excluded middle, and that the same result holds for the disjunction-free fragment.
On the other hand, we will prove that the strong completeness of full intuitionistic first order logic is equivalent, over IZF, to all instances of the law of the excluded middle plus the Boolean prime ideal theorem. Finally, we will mention aspects of a joint work in progress with Henrik Forssell on the categorical analysis of modified Kripke semantics and how it could be used to derive in IZF completeness proofs for the disjunction-free fragment. Abstract: This talk reports on joint work with Albert Visser Utrecht. I will present a robust technique for building a wide variety of full satisfaction classes using model-theoretic ideas.
Abstract: We present a model of Type Theory where a type is interpreted as a cubical set satisfying the Kan condition. We use cubical sets with non ordered dimensions, and explain the connection with the notion of nominal sets. Finally, we show how to use this model to give a new explanation of the axiom of description. Abstract: The notion of weak infinity-groupoid was originally developed by Grothendieck with the hope of providing an algebraic model for spaces up to homotopy. This notion has also recently come up in type theory with the proof by van den Berg, Garner and Lumsdaine that every type in dependent type theory has the structure of a weak infinity-groupoid using the definition of Batanin-Leinster.
Abstract: We outline some preliminary investigations into using locale-theoretic methods in constructive model theory, methods springing from such results as the sheaf-theoretic representation and cover theorems of Joyal and Tierney and formal-space-valued completeness theorems of, e.g., Coquand and Smith. The width of such a derivation is defined as the width of the thickest ordered equation appearing in it. Additionally we will define the notion of b-bounded substitution. We say that a proof is b-bounded if every substitution rule used in it is b-bounded.
Investigations into a model of type theory based on the concept of basic pair. Presentation of MSc thesis. This means to extend the concept of "set" in the easiest and most natural way: transforming it into a pair of sets with an arbitrary relation between them, i.e. a basic pair. Our purpose will be to find a model which satisfies this interpretation, and we will look for it following two different approaches. The first one is meant to remain inside standard type theory, constructing an internal model; the second one, arisen from some impasses reached in the development of the first attempt, is aimed at adding new type constructors to the standard theory, extending it and allowing us to create an external model.
These new types, which we have denoted here with a star, are to be seen as an arbitrary relation between two sets of the corresponding type without star. This extended theory will give us all the results needed in a natural way, and might be useful in different interpretations for further research. The theory of apartness spaces, a counterpart of the classical theory of proximity spaces, provides one entry to a purely constructive approach to topology. The canonical examples of apartness spaces are metric spaces, locally convex spaces, and, more generally, uniform spaces.
The general theory of apartness and its specialisation to uniform spaces have been studied extensively, and are expounded in detail in the reference below. In this talk we first present the basic notions of apartness between sets and of uniform structures, and point out the connection between point-set apartness spaces and neighbourhood spaces.
We then introduce u-neighbourhood structures, which lie somewhere between topological and uniform neighbourhood structures. Reference: D. Bridges and L. Vita. We will argue for a close proximity between his proposals and those of Category Theory. We will also pay close attention to the differences between his elucidations and the ones offered by contemporary Intuitionism.
We will consider two schemata: the double negation shift (DNS) and the one consisting of instances of the principle of excluded middle for sentences (REM). We will prove that the two schemata combined derive classical logic, while each of them on its own provides a strictly weaker intermediate logic, and neither is derivable from the other. The partiality monad is an attempt to give an abstract description of what a computation is, using categorical language. From any monad we can construct its Kleisli category, and in the case of the partiality monad the Kleisli category models partial, computable functions.
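Concretely, a delay-style rendering of the partiality monad and its Kleisli composition can be sketched in Python. This is a minimal illustrative sketch, not the categorical formalism itself; the names Now, Later, bind, kleisli and run are assumptions of this example, not an established API.

```python
# Sketch of the partiality (delay) monad: a computation is either a
# finished value (Now) or a suspended next step (Later).

class Now:
    def __init__(self, value):
        self.value = value

class Later:
    def __init__(self, thunk):
        self.thunk = thunk  # zero-argument function producing the next step

def bind(m, f):
    """Sequence a possibly non-terminating computation m with f."""
    if isinstance(m, Now):
        return f(m.value)
    return Later(lambda: bind(m.thunk(), f))

def kleisli(f, g):
    """Kleisli composition: run f, then feed its result to g."""
    return lambda x: bind(f(x), g)

def run(m, fuel):
    """Unfold at most `fuel` steps; None models 'not yet terminated'."""
    while fuel > 0 and isinstance(m, Later):
        m, fuel = m.thunk(), fuel - 1
    return m.value if isinstance(m, Now) else None

# A partial-looking function: repeated decrement until zero.
def count_down(n):
    return Now(0) if n == 0 else Later(lambda: count_down(n - 1))
```

Composing in the Kleisli style, `kleisli(count_down, lambda v: Now(v + 1))` first counts down and then increments the result, with the intermediate delays preserved.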
Tamminga, Bayreuth and Groningen. Correspondence analysis for many-valued logics (joint work with Barteld Kooi). Abstract: Taking our inspiration from modal correspondence theory, we present the idea of correspondence analysis for many-valued logics. As a benchmark case, we study truth-functional extensions of the Logic of Paradox (LP). First, we characterize each of the possible truth table entries for unary and binary operators that could be added to LP by an inference scheme. Second, we define a class of natural deduction systems on the basis of these characterizing inference schemes and a natural deduction system for LP.
Third, we show that each of the resulting natural deduction systems is sound and complete with respect to its particular semantics. Under sufficiently concrete circumstances this may even yield a constructive proof without any form of the Axiom of Choice. To prepare the ground for a more systematic treatment we now classify the cases that can be found in mathematical practice by way of representative proof patterns. Our version subsumes not only instances from diverse branches of abstract algebra but also a Henkin-style completeness proof for first-order logic.
By recourse to a theorem of McCoy, Fuchs, and Schmidt on irreducible ideals we further shed light on why prime ideals occur, and why transfinite methods are needed. This is joint work in progress with D. Rinaldi, Munich, and is partially based on joint work with N. Gambino, Palermo, and F.
Ciraulo, Padua. References: Schuster, Induction in algebra: a first case study; Hendtlass and Schuster, in: S. Cooper, A. Dawar, B. Loewe, eds., Lect. Notes Comput. Sci., Springer, Berlin and Heidelberg. Actually a much more general notion of semantic attribute is motivated by strategic considerations. When identifying such a generalization, the notion of classical negation plays a crucial role.
Thursday, 8 November. Abstract: We consider a database model based on finite simplicial complexes rather than relations. A brief introduction to the relational model is given for logicians not familiar with it. We thereafter describe how simplicial complexes can be used to model both database schemas and instances. This allows us to collect schemas and instances over them into one structure, which we relate to the notions of display map and comprehension category. Abstract: Topos theory is a very successful chapter in the categorical analysis of logic. Elementary toposes are categorical models of higher-order intuitionistic arithmetic and provide a framework for almost all interpretations of this theory, such as realizability, Kripke, topological, and sheaf models.
The notion of a topos is impredicative, however, and therefore its internal logic is much stronger than what most constructivists are willing to use in their work. A predicative topos should be a topos-like structure whose internal logic has the same strength as these theories. I will explain what the difficulties are in coming up with a good notion of predicative topos, discuss two possible axiomatisations (very closely related), explain why I like these, and then discuss their basic properties.
We give a survey of some of the most important ones. Abstract: In this talk I shall argue that program testing provides the basis for constructive mathematical truth. In particular, we get a new interpretation of hypothetical judgments, since tests for such judgments need methods for generating inputs. In order to test a typing judgement we simultaneously play a strategy generated by a type and a strategy generated by a term, where the correct moves for the strategy of the term are determined dynamically by playing the strategy of the type.
The talk is on joint work in progress with Pierre Clairambault, Cambridge. Abstract: The standard notion of compositionality is well understood, and the formal framework of Hodges allows precise treatment of various features of compositionality. The issue in this talk is how context-dependence, i.e. the dependence of meaning on context, can be handled compositionally. Contexts here can be assignments, utterance situations, or features thereof. Currying the context argument results in a function taking only expressions as arguments, and one question concerns the relation between compositionality of the curried function and that of the uncurried one.
Another question is the relation between a compositional semantics and one given by a standard inductive truth definition. The background to these questions is linguistic, but in this talk I focus on the mathematical details. Reuniting the antipodes: bringing together Constructive and Nonstandard Analysis.
Bishop famously derided Nonstandard Analysis for its lack of computational meaning. Abstract: Given a theory T and its category of models and homomorphisms Mod T , is it possible to recover T from Mod T up to some suitable notion of equivalence? A positive answer for regular theories was given by Makkai who showed that the classifying topos of a regular theory - from which the theory can be recovered - can be represented as filtered colimit preserving functors from Mod T to the category SET of sets and functions.
Moving to coherent and classical first-order theories, however, it becomes necessary to equip Mod T with some extra structure. While Makkai uses structure based on ultra-products for this case, it is possible to equip Mod T with a natural topology and represent the classifying topos of T as equivariant sheaves on the resulting topological category or groupoid, considering just the isomorphisms. This forms the basis of an extension of Stone duality to first-order theories, and allows for the application of topos-theoretic techniques.
The expressive power of dependence logic coincides with that of existential second-order logic. In the past few years, dependence logic has grown into a new framework in which various notions of dependence and independence can be formalized. The high expressive power of dependence logic has the consequence that dependence logic in full generality cannot be axiomatized.
I will describe at least two applications: firstly, that once the interpretation of contexts, types, and terms has been fixed, the interpretation of the rest of type theory is determined up to equivalence; and secondly, a presentation of the "homotopy theory of models of type theory". Mathematical proofs can only connect purely mathematical notions, or so it seems.
However, first-order consequences of dependence logic sentences can be axiomatized. We give an explicit axiomatization and prove the respective Completeness Theorem. Abstract: Standard models of dependent type theory, such as categories with attributes, use semantics based on the category of sets. It seems to be an unresolved question how best to formulate such semantics using the category of setoids instead, and how to do this internally to type theory itself.
We discuss some proposals for solutions and possible generalizations to other categories of interpretation. A proof of the latter kind typically follows a certain pattern, and may be extracted from a proof of the former sort. If the theorem has finite input data, then a finite order carries the required instance of induction, which thus is provable by fairly elementary means. Basic proof theory suffices to eliminate the decidability assumptions one may have to make en route.
The tree one can grow alongside the induction encodes an algorithm which computes the desired output data. We will discuss all this along the lines of some typical examples. Friday, 24 February (note time and day!). Mohammad Jabbari (SU). Abstract: Algebra is a subject dealing with variables, operations and equations. In his thesis, among other things, Lawvere objectified algebraic "theories" as a special kind of categories and algebraic structures as special set-valued functors on them.
He learned to do universal algebra by category theory! This was the start of a fruitful line of discoveries which culminated in the creation of elementary topos theory by Lawvere and M. Tierney. At about the same time, an alternative categorical approach to general algebra emerged out of the collective efforts of some homological algebraists (Godement, Huber, Eilenberg, Beck, among others). Among other intuitions, this machinery enables us to grasp the algebraic part of an arbitrary category. This approach later found applications in descent theory and computer science.
This seminar is divided into three parts. First we recall Lawvere's functorial approach; then we shortly describe the monadic approach to algebra. Finally we state some theorems about the equivalences between these three approaches to algebraic categories. To each small category it is possible to associate a homotopy type. This is done through the nerve functor; via this functor it is thus possible to transport homotopy notions from simplicial sets or topological spaces to the category of small categories.
In this talk we introduce the notion of essentially small categories; such a category, albeit "big", allows for a nerve construction — an essentially small category has a small subcategory that serves as a good homotopy approximation for its ambient category; we give several concrete examples of "big" but essentially small categories and their homotopy types.
We also explain how to do categorical homotopy theory on "big" categories in general. In this talk we will discuss applications of methods in categorical logic to model theory, which was the theme of my undergraduate thesis. The proof makes use of functorial semantics, as introduced by Lawvere, to translate into categorical language the existing proof of completeness.
We will also comment on the constructive aspects of the categorical proofs, as well as to what extent they can dispense with choice principles. This talk is based on my Master Thesis. I will give a brief introduction to the notion of a container and how to differentiate them, as defined in the references below. This yields combinatorial information from the process of differentiation, a la combinatorial species. Then I show how a more general notion, called symmetric containers, contains anti-derivatives of many containers. Reference: Containers: constructing strictly positive types, Theoretical Computer Science. Bridges, University of Canterbury.
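One concrete instance of differentiating a container: the derivative of a container gives its one-hole contexts, and for the list container these are pairs (prefix, suffix) around a distinguished element. A small Python sketch, with illustrative function names not drawn from the talk:

```python
def one_hole_contexts(xs):
    """All ways of marking one position in a list:
    (elements to the left, the focused element, elements to the right)."""
    return [(xs[:i], xs[i], xs[i + 1:]) for i in range(len(xs))]

def plug(ctx):
    """Fill the hole back in, recovering the original list."""
    left, focus, right = ctx
    return left + [focus] + right
```

An n-element list has exactly n one-hole contexts, matching the species-style intuition that differentiation counts the ways of puncturing a structure at one position.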
Abstract: Luminita Vita and I began investigating axioms for a constructive theory of apartness between points and sets, and between sets and sets, as a possible constructive approach to topology. The culmination of this work came last October, with the publication of the monograph in which we lay down what we believe to be the definitive axioms for (pre-)apartness and then develop the theory, with particular application to quasi-uniform spaces. As with analysis, so with topology: once the "right" axioms are used, the theory develops in a natural, if technically nontrivial, way, with one exception.
In this talk I shall present the axioms for apartness and uniform spaces, and discuss various aspects of the resulting theory, paying particular attention to the connections between various types of continuity of functions. Reference: E. Bishop and D. Bridges, Constructive Analysis, Grundlehren der Math. Wissenschaften, Springer-Verlag, Heidelberg. Fall. December 12: Minisymposium on categorification and foundations of mathematics and quantum theory.
We will survey this technique, provide a short comparison with the related work by Isham and co-workers, which motivated Bohrification, and use sites and geometric logic to give a concrete external presentation of the internal locale. The points of this locale may be physically interpreted as partial measurement outcomes. In this talk I plan to explain what a 2-category is, how 2-categories appear in algebra and topology, how one can study them, in particular, how one constructs 2-representations of 2-categories, and, finally, how all this can be applied. When extending the notion of setoid (a type with an equivalence relation) to families of setoids, a choice between proof-relevant and proof-irrelevant indexing appears.
It is shown that a family of types may be canonically extended to a proof-relevant family of setoids via the identity types, but that such a family is in general proof-irrelevant if, and only if, the proof-objects of identity types are unique. A similar result is shown for fibre representations of families.
The ubiquitous role of proof-irrelevant families is discussed. Two lists are "bag equal" if they are permutations of each other, i.e. if they contain the same elements with the same multiplicities, possibly in a different order. I will describe how one can define bag equality as the presence of bijections between sets of membership proofs. This definition has some nice properties: Many bag equalities can be proved using a form of equational reasoning. The definition generalises easily to arbitrary unary containers, including types with infinite values, such as streams. By using a small variant of the definition one gets set equality instead, i.e. equality ignoring multiplicity. Other variations give the subset and subbag preorders.
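As a down-to-earth approximation of the membership-proof formulation, bag equality and the subbag preorder can be decided for finite lists by matching occurrences one by one. This is only an executable stand-in for the bijection-based definition in the talk; the helper names are illustrative.

```python
def subbag(xs, ys):
    """Subbag preorder: every occurrence in xs is matched by a
    distinct occurrence in ys."""
    remaining = list(ys)
    for x in xs:
        if x in remaining:
            remaining.remove(x)   # consume one occurrence of x
        else:
            return False
    return True

def bag_equal(xs, ys):
    """Bag equality: xs and ys are permutations of each other."""
    return len(xs) == len(ys) and subbag(xs, ys)
```

The occurrence-matching loop is exactly a finite bijection between positions of equal elements, which is what the membership-proof bijections encode.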
Many preservation properties hold as well. The definition works well in mechanised proofs. Abstract: I will first briefly overview the problem of logical constants (what is it that makes a symbol logical?). But given a consequence relation, is there a natural way to extract from it a set of logical constants? I compare two ways of doing so, one purely syntactical, based on the idea that an expression is logical if it is essential to the validity of at least one inference, and one semantical, based on the idea that an expression is logical if its interpretation is fully determined by the rules for its use.
To describe these methods, Galois connections between consequence relations, sets of symbols, and sets of interpretations (all ordered under inclusion) play an important role. Furthermore the Univalence Axiom and its consequences are discussed. This includes how groupoids and n-groupoids appear naturally as a framework for equality when one wants to retain as much information as possible, but also how homotopy, and in particular homotopy coherence, comes into the picture.
Summary: The semantics of data types within computer science is often given using initial algebra semantics. However, not all functors have initial algebras and even those that do often lack good properties. As a result, a number of formalisms have been invented to capture those functors which give rise to initial algebras with good properties. Containers are one such formalism and this series of talks concerns them.
One of the pleasant features I will mention is the smooth generalisation of containers to indexed containers and then to inductive-recursive definitions. Infinite sequences and infinite series, power series and convergence criteria, Taylor series. Ordinary differential equations. Systems of linear equations, and the Gaussian elimination algorithm. Matrices, and their inverses and determinants. Vector spaces, subspaces, linear independence, basis, dimension, row and column spaces, rank, linear transformations, eigenvectors, eigenvalues, and diagonalization.
Inner products, inner product spaces, orthonormal sets, the Gram-Schmidt process, and Fourier series. Calculus of several variables: Partial derivatives, limits and continuity, chain rule, directional derivatives, gradients, and Lagrange multipliers.
Double integrals, and the calculation of the area of a surface; triple integrals. Vector calculus, line integrals, Green's Theorem, surface integrals, Gauss's divergence theorem, and Stokes' Theorem. Prerequisite: CY or RE. Introductory course on differential and integral calculus.
Real numbers, functions, their inverses and graphs. Trigonometric and inverse trigonometric functions, logarithms and exponentials, and hyperbolic functions. Limits of functions, continuity at a point, and continuity on an interval. Differentiability, derivatives of functions, the chain rule, implicit differentiation, derivatives of higher order.
Local maxima and local minima, Rolle's Theorem and the Mean Value Theorem, points of inflection, first-derivative and second-derivative tests, L'Hospital's Rule. Antidifferentiation, indefinite integrals, substitution rule, and integration by parts. Prerequisite: A level Mathematics or equivalent. Further topics in calculus. Definite integrals; the Fundamental Theorems of Calculus. Area of plane regions, volumes of solids, length of arcs. Mean Value Theorem for integrals, and other applications of the definite integral.
Techniques of integration, numerical integration, and improper integrals. Power series: differentiation and integration of power series, Taylor series, binomial series, and Fourier series. MH - Linear Algebra I. Introductory course on linear algebra. Systems of linear equations; Gaussian elimination. Matrices, inverses, and determinants. Vectors, dot products, and cross products. Vector spaces, subspaces, linear independence, basis, dimension, row and column spaces, and rank. Prerequisite: A or H2 level Mathematics or equivalent. Further topics in linear algebra. Linear transformations, kernels and images.
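The Gaussian elimination listed among these topics can be sketched as a plain Python solver with partial pivoting. This is a teaching sketch, assuming a square, nonsingular system; the function name `solve` is illustrative.

```python
def solve(a, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.
    Assumes A is square and nonsingular; a and b are not modified."""
    n = len(a)
    m = [row[:] + [bi] for row, bi in zip(a, b)]  # augmented matrix [A | b]
    for col in range(n):
        # Partial pivoting: swap up the row with the largest entry in this column.
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(m[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (m[r][n] - s) / m[r][r]
    return x
```

For instance, the system 2x + y = 3, x + 3y = 4 has the solution x = y = 1.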
Inner products, inner product spaces, orthonormal sets, and the Gram-Schmidt process. Eigenvectors and eigenvalues; matrix diagonalization and its applications. Symmetric and Hermitian matrices. Quadratic forms and bilinear forms; Jordan normal form and other canonical forms. Prerequisite: MH. MH - Foundations of Mathematics. Introductory course on core mathematical concepts, including logic and the theory of sets. Elementary logic, mathematical statements, and quantified statements. Sets, operations on sets, Cartesian products, and properties of sets.
Natural numbers, integers, rational numbers, real numbers, and complex numbers. Relations, equivalence relations, and equivalence classes. Functions, injective and surjective functions, inverse functions, and composition of functions. Division algorithm, greatest common divisor, Euclidean algorithm, fundamental theorem of arithmetic, modulo arithmetic.
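The Euclidean algorithm and its extended form, which underlie the gcd and modular-arithmetic topics above, can be sketched in a few lines (a standard textbook rendering, not tied to any particular course material):

```python
def gcd(a, b):
    """Euclidean algorithm: repeatedly replace (a, b) by (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

def extended_gcd(a, b):
    """Return (g, x, y) with a*x + b*y == g == gcd(a, b),
    the Bezout coefficients used e.g. for modular inverses."""
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y
```

The extended form is what makes division in modulo arithmetic effective: when gcd(a, n) = 1, the coefficient x is the inverse of a modulo n.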
MH - Discrete Mathematics. Introductory course on discrete mathematics. Counting, permutations and combinations; the binomial theorem. Recurrence relations. Graphs, paths and circuits, and isomorphisms. Trees and spanning trees. Graph algorithms. MH - Algorithms and Computing I.
Core course introducing fundamentals of programming (including variables, data types, control statements, iteration, and recursion), using the Python programming language. By emphasizing applications to problem-solving, it develops the ability to think algorithmically, which is essential for any professional working in an increasingly computer-driven world. This course is required for future computing courses and for courses using Python as a supporting tool.
No prior programming experience is required. Further topics in algorithms and computing. The concept of an algorithm.
Debugging and good programming style. Vectors and arrays. Algorithms for searching and sorting vectors and arrays. Basic concepts of algorithm efficiency. Recursion and the divide-and-conquer paradigm. Emphasis is on data abstraction issues in the program development process, and on the design of efficient algorithms. Simple algorithmic paradigms such as greedy algorithms, divide-and-conquer algorithms and dynamic programming will be introduced. Elementary analyses of algorithmic complexities will also be taught.
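The divide-and-conquer paradigm mentioned above can be illustrated with merge sort, a standard example in elementary analyses of algorithmic complexity (O(n log n) comparisons). A minimal Python sketch:

```python
def merge_sort(xs):
    """Divide-and-conquer sorting: split, sort each half, merge."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    # Merge the two sorted halves.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```

The recurrence T(n) = 2 T(n/2) + O(n) for the split-and-merge structure is exactly the kind of recurrence relation covered in the discrete mathematics course above.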
Prerequisite: PS. First of two courses on calculus for students in the sciences. Applications and computer-based learning are included. Topics covered include: Functions and graphs, real numbers. Differentiation of functions of one variable, derivative as rate of change, chain rule, implicit functions, and inverse functions. Local maxima and minima. Indefinite and definite integrals, and applications of integration. Methods of integration. Fundamental theorem of calculus. Second of two courses on calculus for students in the sciences. Topics covered include: Differential equations; first-order and second-order linear differential equations.
Techniques of solving differential equations, and applications. Series and power series. Taylor's series. Fourier series. Prerequisite: MH or equivalent. MH - Calculus for the Sciences. Introductory course in calculus, for students majoring in the physical sciences. Differential calculus. Integral calculus. Differential equations. MH - Calculus for Physics. Additional topics in calculus, for students majoring in physics.
Vectors and multivariable calculus. Vector analysis. Partial differential equations. MH - Mathematics for Chemistry. Additional topics in calculus, for students majoring in chemistry. Cartesian and spherical coordinates. Complex numbers. Vectors; linear algebra and matrices. Summation, series, and expansions of functions. Intermediate course in calculus. Parametric equations; polar coordinates. Vector-valued functions, calculus of vector-valued functions, and analytic geometry. Functions of more than one variable, limits, continuity, partial derivatives, differentiability, total differentials, the chain rule, and the implicit function theorem.
Directional derivatives, gradients, and Lagrange multipliers. Double integrals; the area of a surface; triple integrals. Line integrals, Green's Theorem, surface integrals, the Gauss divergence theorem, and Stokes' Theorem. Introductory course on group theory, with emphasis on symmetry groups of geometric structures. Symmetries of 2D and 3D objects. Group axioms. Cyclic and dihedral groups. Permutation groups. Representation of rotations and reflections by matrices. Wallpaper groups. Puzzles, especially Rubik's cube. Application of computing skills and previously-learnt mathematical topics (Linear Algebra, Calculus, Discrete Mathematics, etc.).
This course emphasizes group project work, and assessments are based substantially on a term project. Introductory course on probability and statistics. Discrete distributions (binomial, hypergeometric, and Poisson). Continuous distributions (normal, exponential, and general densities). Random variables, expectation, independence, conditional probability. The law of large numbers and the central limit theorem.
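The law of large numbers lends itself to a quick simulation: the running average of fair-die rolls approaches the expectation 3.5 as the sample grows. A small sketch (seeded for reproducibility; the function name is illustrative):

```python
import random

def sample_mean(n, seed=0):
    """Average of n fair-die rolls; by the law of large numbers this
    approaches the expectation (1+2+...+6)/6 = 3.5 as n grows."""
    rng = random.Random(seed)
    return sum(rng.randint(1, 6) for _ in range(n)) / n
```

With n = 100000 the sample mean typically lands within a few hundredths of 3.5, while small samples fluctuate much more, which is the qualitative content of the theorem.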
Sampling distributions. Elementary statistical inference (confidence intervals and hypothesis tests). Techniques in linear algebra and multivariable calculus, and their applications. This course includes computer-based learning. Topics covered include: Systems of linear equations. Matrices and determinants. Vectors in 2- and 3-dimensional Euclidean spaces; vector spaces, linear independence, basis, and dimension. Linear transformations. Eigenvectors and eigenvalues. Calculus of functions of several variables; partial derivatives. Constrained and unconstrained optimization.
MH - Complex Methods for the Sciences. Introduction to complex numbers and their applications in physics and the other sciences. Complex numbers, the Argand diagram, modulus and argument. Complex representations of waves and oscillations. Functions of a complex variable, analyticity, and the Cauchy-Riemann equations. Contour integration, Cauchy's integral formula, and the residue theorem.
Fourier series and Fourier transformations, and their applications. Green's functions methods. MH - Linear Algebra for Scientists. Introduction to linear algebra and its applications in physics and the other sciences. Vector algebra and analytical geometry. Linear spaces. Linear transformations and matrices. Eigenvalues and eigenvectors. Applications of linear algebra to problems in physics and computing. MH - Real Analysis I. Basic properties of real numbers, supremum and infimum, completeness axiom, open and closed sets, compact sets, countable sets.
Limits and convergence of sequences, subsequences, Bolzano-Weierstrass theorem, Cauchy sequences, infinite series, double summations, products of infinite series. Limits of functions, continuity, uniform continuity, intermediate value theorem, extreme-value theorem. Differentiability, derivatives, intermediate value property, Cauchy mean value theorem, Taylor's theorem, Lagrange's form of the remainder. Sequence and series of functions, uniform convergence and differentiation.
Power series, radius of convergence, local uniform convergence of power series. MH - Complex Analysis. Analytic functions of one complex variable, Cauchy-Riemann equations. Contour integrals, Cauchy's theorem and Cauchy's integral formula, maximum modulus theorem, Liouville's theorem, fundamental theorem of algebra, Morera's theorem. Taylor series, Laurent series, singularities of analytic functions. Residue theorem, calculus of residues.
Fourier transforms, inversion formula, convolution, Parseval's formula. First order equations, exact equations, integrating factors, separable equations, linear homogeneous and non-homogeneous equations, variation of parameters, principle of superposition. Second order equations, Wronskian, Abel's formula, variation of parameters, exact equations, adjoint and self-adjoint equations, Lagrange and Green's identities, Sturm's comparison and separation theorems.
First order linear systems, Wronskian, Abel's formula, variation of parameters, systems with constant coefficients. First order nonlinear systems, initial value problem. Use of ODEs in simple modeling problems. MH - Abstract Algebra I. Introduction to modern algebra, including basic algebraic structures such as groups, rings and fields. Topics covered include: Groups, subgroups, cyclic groups, groups of permutations, cosets, Lagrange's Theorem, homomorphism, and factor groups. Rings and fields, ideals, integral domains, quotient fields, rings of polynomials, factorization of polynomials over a field.
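Lagrange's Theorem, listed among the group theory topics above, can be checked concretely for cyclic subgroups of Z_n under addition mod n. A small illustrative sketch (the function name is an assumption of this example):

```python
def cyclic_subgroup(g, n):
    """The subgroup of Z_n (addition mod n) generated by g:
    collect 0, g, 2g, ... until the orbit returns to 0."""
    elems, x = set(), 0
    while True:
        elems.add(x)
        x = (x + g) % n
        if x == 0:
            return elems
```

For example, 3 generates {0, 3, 6, 9} in Z_12, a subgroup of order 4; Lagrange's Theorem says every such subgroup order divides the group order 12.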
Introduction to basic number theory, including modern applications. Topics covered include: Review of modular arithmetic; the Chinese remainder theorem, Fermat's little theorem, and Wilson's theorem. Primitive roots and indices. Legendre's symbols; the quadratic reciprocity law. Continued fractions and Pell's equations. Primality tests and factorization of integers, and the RSA cryptosystem. Connectivity and matchings, Hall's theorem, Menger's theorem, network flows. Graph colouring and the four-colour theorem. Ramsey theory.
Probabilistic methods in graph theory. Use of software to solve graph-theoretic problems. Prerequisites: MH and MH. Games of normal form and extensive form, and their applications in economics; relations between game theory and decision making. Games of complete information: static games with finite or infinite strategy spaces, Nash equilibrium of pure and mixed strategy, dynamic games, backward induction solutions, information sets, subgame-perfect equilibrium, finitely and infinitely-repeated games.
Games of incomplete information: Bayesian equilibrium, first price sealed auction, second price sealed auction, and other auctions, dynamic Bayesian games, perfect Bayesian equilibrium, signaling games.
Cooperative games: bargaining theory, cores of n-person cooperative games, the Shapley value and its applications in voting, cost sharing, etc. MH - Algorithms for the Real World. Mathematical concepts for analysis of algorithms. Fundamental algorithm design techniques, with applications to various problems: network algorithms, matrix algorithms, optimization algorithms, and algorithms for data analysis and machine learning. Applications to problems in combinatorial optimization, networks, operations research, data analysis and machine learning.
MH - Statistics. Probability distributions of functions of random variables, the law of large numbers and the central limit theorem. Point and interval estimation, optimal estimation, maximum likelihood methods. More on tests of hypotheses; the Neyman-Pearson lemma, likelihood ratio tests, large sample theory, Chi-square tests and contingency tables. Introduction to regression analysis, one of the most widely-used statistical techniques. Simple and multiple linear regression, nonlinear regression, analysis of residuals and model selection.