This page contains a list of the accepted papers (sorted by first author) with abstracts, and links to PDFs of the papers, slides, and video recordings of the talks. The official EPTCS proceedings can be found here.

Long Presentations

  • Daniela Ashoush and Bob Coecke. Dual Density Operators and Natural Language Meaning (paper, bibtex, slides, video)
Density operators allow for representing ambiguity about a vector representation of systems, both in quantum theory and in distributional natural language meaning. Equivalently, they allow for discarding part of the description of a composite system, where we consider the discarded part to be the environment, or context. We introduce dual density operators, which allow for two independent notions of context, and two corresponding discarding operations. By means of a toy example, we show that dual density operators can be used to represent both ambiguity about word meanings and lexical entailment, within a grammatical-compositional distributional framework for natural language meaning. Based on the corresponding category-theoretic construction of dual density operators, we axiomatically describe the two independent notions of context.
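To see how an ordinary density operator encodes ambiguity about a word's vector representation, here is a minimal sketch (not the paper's dual construction): two hypothetical sense vectors for "bank" in a made-up 3-dimensional meaning space, mixed into a single operator.

```python
import numpy as np

# Toy illustration only: a standard (single) density operator encoding
# ambiguity between two hypothetical sense vectors of "bank".
finance_sense = np.array([1.0, 0.0, 0.0])  # assumed meaning-space basis
river_sense   = np.array([0.0, 1.0, 0.0])

# Convex mixture of the rank-1 projectors onto each sense.
rho = 0.7 * np.outer(finance_sense, finance_sense) \
    + 0.3 * np.outer(river_sense, river_sense)

print(np.trace(rho))            # density operators have unit trace -> 1.0
print(np.linalg.eigvalsh(rho))  # eigenvalues recover the sense weights
```

The eigendecomposition of such an operator recovers the candidate senses and their weights, which is what makes density operators a natural carrier for ambiguity.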
  • Josef Bolt, Bob Coecke, Fabrizio Genovese, Martha Lewis, Dan Marsden and Robin Piedeleu. Interacting Conceptual Spaces (paper, bibtex, slides, video)
We propose applying the categorical compositional scheme of Coecke, Sadrzadeh and Clark to conceptual space models of cognition. In order to do this we introduce the category of convex relations as a new setting for categorical compositional semantics, emphasizing the convex structure important to conceptual space applications. We show how conceptual spaces for composite types such as adjectives and verbs can be constructed. We illustrate this new model with detailed examples.
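The conceptual-space intuition behind this can be sketched with concepts as convex regions. The toy below uses axis-aligned boxes over hypothetical quality dimensions and region intersection as a crude stand-in for adjective-noun composition; the paper itself works with the more general convex relations.

```python
# A minimal sketch of the conceptual-space intuition: concepts as convex
# regions (here axis-aligned boxes over made-up quality dimensions), with
# intersection as a crude stand-in for adjective-noun composition.

def box_contains(box, point):
    """box: list of (lo, hi) intervals, one per quality dimension."""
    return all(lo <= x <= hi for (lo, hi), x in zip(box, point))

def box_intersect(a, b):
    return [(max(lo1, lo2), min(hi1, hi2))
            for (lo1, hi1), (lo2, hi2) in zip(a, b)]

# Hypothetical dimensions: (hue, size)
banana = [(0.1, 0.3), (0.2, 0.4)]  # yellow-ish hue, smallish size
yellow = [(0.1, 0.2), (0.0, 1.0)]  # constrains hue only

yellow_banana = box_intersect(yellow, banana)
print(box_contains(yellow_banana, (0.15, 0.3)))  # -> True
```

Boxes are a deliberately simple choice of convex region; the convexity is what the category of convex relations preserves under composition.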
  • Dimitri Kartsaklis. Coordination in Categorical Compositional Distributional Semantics (paper, bibtex, slides, video)
An open problem with categorical compositional distributional semantics is the representation of words that are considered semantically vacuous from a distributional perspective, such as determiners, prepositions, relative pronouns or coordinators. This paper deals with the topic of coordination between identical syntactic types, which accounts for the majority of coordination cases in language. By exploiting the compact closed structure of the underlying category and Frobenius operators canonically induced over the fixed basis of finite-dimensional vector spaces, we provide a morphism as representation of a coordinator tensor, and we show how it lifts from atomic types to compound types. Linguistic intuitions are provided, and the importance of the Frobenius operators as an addition to the compact closed setting with regard to language is discussed.
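In a finite-dimensional vector space with a fixed basis, the Frobenius multiplication is pointwise multiplication of coordinates. The sketch below shows only this atomic ingredient, merging two noun vectors as a coordinator would; the paper's coordinator tensor builds on such maps and lifts them to compound types.

```python
import numpy as np

# The Frobenius multiplication canonically induced by a fixed basis is the
# pointwise product of coordinates. Sketch: merging two noun vectors, as in
# coordinating "apples and oranges" (atomic noun case only).
def frobenius_mu(v, w):
    return v * w  # pointwise product in the fixed basis

apples  = np.array([0.9, 0.1, 0.5])  # hypothetical context counts
oranges = np.array([0.8, 0.3, 0.4])

print(frobenius_mu(apples, oranges))  # features shared by both survive
```

The pointwise product damps features on which the two conjuncts disagree, which is one linguistic intuition for using Frobenius merging in coordination.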
  • Mehrnoosh Sadrzadeh. Quantifier Scope in Categorical Compositional Distributional Semantics (paper, bibtex, slides, video)
In previous work, we modelled generalised quantifiers in categorical compositional distributional semantics with the help of bialgebras. In this paper, we show how quantifier scope ambiguity arises in that setting. We also briefly discuss how quantifier branching can be modelled.
  • William Zeng and Bob Coecke. Quantum Algorithms for Compositional Natural Language Processing (paper, bibtex, slides, video)
We propose a new application of quantum computing to the field of natural language processing. Ongoing work in this field attempts to incorporate grammatical structure into algorithms that compute meaning. In [1], Clark et al. introduce such a model (the CCS model) based on tensor product composition. While this algorithm has many advantages, its implementation is hampered by the large classical computational resources that it requires. In this work we show how computational shortcomings of the CCS approach could be resolved using quantum computation. We address the value of a qRAM [2] for this model and extend an algorithm from Wiebe et al. [3] into a quantum algorithm to categorize similar sentences in CCS. Our new algorithm demonstrates a quadratic speedup over classical methods under certain conditions.
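The classical resource problem can be seen in a toy tensor-contraction sketch of CCS-style composition: a transitive verb is an order-3 tensor, and the sentence meaning is its contraction with the subject and object vectors. Dimensions here are made up; for an N-dimensional noun space and S-dimensional sentence space the verb tensor alone needs N*S*N entries, which is the cost the quantum approach aims to sidestep.

```python
import numpy as np

# Classical CCS/DisCoCat-style composition: contract an order-3 verb tensor
# with subject and object vectors to get a sentence vector. Toy dimensions.
N, S = 4, 3                     # noun and sentence space dimensions (assumed)
rng = np.random.default_rng(0)
subj = rng.random(N)
obj  = rng.random(N)
verb = rng.random((N, S, N))    # hypothetical verb tensor: N*S*N entries

sentence = np.einsum('i,isj,j->s', subj, verb, obj)
print(sentence.shape)           # -> (3,): a vector in the sentence space
```

With realistic noun-space dimensions in the thousands, storing and contracting such tensors classically becomes expensive, which motivates the qRAM-based quantum formulation.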

Short Presentations

We construct an abstract categorical model for DisCoCat starting from a generic corpus annotated with constituent structure trees. We begin by dividing words in the corpus according to three semantic functions: (i) object words, directly modelled in the semantic space; (ii) modifier words, acting on individual object words; (iii) interaction words, connecting the meaning of distinct object words. We then consider the compact closed symmetric monoidal category of R-semimodules over an involutive commutative semiring R, and we model object words as vectors in a certain free R-semimodule H, constructed from the corpus. Based on the grammatical structure annotating the corpus, we use Frobenius algebras to model modifier words as unary operators on H, and interaction words as binary operators on H. We discuss some possible future extensions of this model.
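The three word classes can be sketched concretely over the Boolean semiring, one hypothetical choice of the commutative semiring R: object words as vectors, modifier words as unary operators (matrices), interaction words as binary operators. All data below is made up for illustration.

```python
import numpy as np

# Sketch over the Boolean semiring (one possible choice of R): addition is
# "or", multiplication is "and". Object words are Boolean vectors in H,
# modifier words unary operators on H, interaction words binary operators.
dog = np.array([1, 0, 1], dtype=bool)   # hypothetical object word
big = np.array([[1, 0, 0],              # hypothetical modifier word
                [0, 1, 0],
                [0, 0, 0]], dtype=bool)

def apply_modifier(m, v):
    # Boolean matrix-vector product: "or" of "and"s along each row.
    return np.array([np.any(row & v) for row in m])

def interact(v, w):
    # Interaction word as a binary operator: here simply pointwise "and".
    return v & w

print(apply_modifier(big, dog))  # -> [ True False False ]
```

The Boolean semiring records only whether a feature is present; other semirings (e.g. the natural numbers with counts) plug into the same shapes.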
  • Stephen McGregor, Matthew Purver and Geraint Wiggins. Words, Concepts, and the Geometry of Analogy (paper, bibtex, slides, video)
This paper presents a geometric approach to the problem of modelling the relationship between words and concepts, focusing in particular on analogical phenomena in language and cognition. Grounded in recent theories regarding geometric conceptual spaces, we begin with an analysis of existing static distributional semantic models and move on to an exploration of a dynamic approach to using high dimensional spaces of word meaning to project subspaces where analogies can potentially be solved in an online, contextualised way. The crucial element of this analysis is the positioning of statistics in a geometric environment replete with opportunities for interpretation.
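The static vector-offset treatment of analogy that the paper takes as its starting point can be sketched in a few lines; all vectors and the tiny vocabulary here are made-up toy data, not the paper's dynamic subspace-projection method.

```python
import numpy as np

# The classic static vector-offset view of analogy ("king - man + woman
# is near queen") that the paper analyses before moving to dynamic,
# contextualised subspaces. Toy vectors only.
vocab = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
}

target = vocab["king"] - vocab["man"] + vocab["woman"]

def nearest(v, exclude):
    # Nearest remaining vocabulary item by Euclidean distance.
    return min((w for w in vocab if w not in exclude),
               key=lambda w: np.linalg.norm(vocab[w] - v))

print(nearest(target, {"king", "man", "woman"}))  # -> queen
```

Excluding the query words is standard practice, since the offset vector typically stays closest to the inputs themselves.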
In this paper we give an overview of partial orders on the space of probability distributions that carry a notion of information content and serve as a generalisation of the Bayesian order given in [7]. We investigate what constraints are necessary in order to get a unique notion of information content. These partial orders can be used to give an ordering on words in vector space models of natural language meaning relating to the contexts in which words are used, which is useful for a notion of entailment and word disambiguation. The construction used also points towards a way to create orderings on the space of density operators which allow a more fine-grained study of entailment. The partial orders in this paper are directed complete and form domains in the sense of domain theory.
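On distributions over just two outcomes, the Bayesian order of [7] has a simple shape: the uniform distribution (1/2, 1/2) is least informative, and p is below q exactly when q lies at least as far from uniform on the same side. The check below sketches this binary case only, under that reading; the general order requires a sorting permutation.

```python
# Binary case of the Bayesian order of [7] on two-outcome distributions:
# uniform (0.5, 0.5) is bottom, and p is below q iff q is at least as far
# from uniform on the same side as p. Two-outcome sketch only.
def bayesian_leq_2(p, q):
    """p, q: pairs (p1, p2) with p1 + p2 == 1. True iff p is below q."""
    return (q[0] <= p[0] <= 0.5) or (0.5 <= p[0] <= q[0])

print(bayesian_leq_2((0.5, 0.5), (0.9, 0.1)))  # True: uniform is bottom
print(bayesian_leq_2((0.6, 0.4), (0.9, 0.1)))  # True: q sharpens p
print(bayesian_leq_2((0.6, 0.4), (0.2, 0.8)))  # False: opposite sides
```

Read as entailment between word-context distributions, "higher in the order" corresponds to a more specific, less ambiguous word.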