19th European Summer School in Logic, Language and Information

Evening Lectures

Week 1: Tuesday, 7 August

Speaker: Alexander Koller

Title: Some thoughts about a computational semantics for the 21st century

Abstract: The history of computational natural-language semantics over the past thirty years can be seen as a success story in implementing formal semantics on the computer. At the beginning of the 21st century, we have access to well-understood representation formalisms and to semantics construction algorithms based on large-scale grammars, we have methods for dealing very efficiently with certain types of ambiguity, and our Logic & Computation colleagues are continually improving the efficiency of the theorem provers that computational semanticists can use to compute inferences.
However, in actual real-life applications, the use of the standard formalisms and algorithms of computational semantics is rare. A concrete example is the recent PASCAL Textual Entailment challenge, in which a computer system must determine whether one sentence "follows" from another. This is precisely the kind of task that classical computational semantics should be good at; nonetheless, only a minority of systems even use classical semantic representations, and those mostly in nonstandard ways. In my opinion, this experience points to a number of critical shortcomings of current approaches: we still don't know how to derive satisfying, useful, wide-coverage semantic representations (or even what "useful" really means); classical logical entailment is not a perfect approximation of natural language inference; and the knowledge bases needed for computing such inferences do not have sufficient coverage.
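To make the classical picture concrete, here is the entailment task in schematic form; the sentence pair and the background axiom are illustrative examples of my own, not taken from the PASCAL data. The system translates text T and hypothesis H into logic and then asks a theorem prover whether H follows from T together with a knowledge base KB:

    T:  \exists x\,[\mathit{dog}(x) \land \mathit{sleep}(x)]          ("A dog is sleeping.")
    H:  \exists x\,[\mathit{animal}(x) \land \mathit{sleep}(x)]       ("An animal is sleeping.")
    KB: \forall x\,[\mathit{dog}(x) \rightarrow \mathit{animal}(x)]

    answer "entailed"  iff  KB \land T \models H

The coverage problem mentioned above is already visible in this toy case: drop the single KB axiom and the proof fails, even though every speaker accepts the inference.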
In this talk, I will outline the state of the art in computational semantics as I see it and point out several aspects that I believe work particularly well or not so well at this point. I will then speculate on some recent ideas -- such as learning world knowledge from corpora, grounding semantic information in the real world, and investigating alternative representation formalisms -- that might help alleviate these problems in the future. Rather than offering final answers to any of these questions, I hope to spark a debate that will bring computational semantics closer to applicability.

Alexander Koller

Alexander Koller received his PhD from Saarland University in Saarbruecken, Germany, in 2004, and is now a DFG Research Fellow at Columbia University. In recent years, his research has focused on a variety of problems in computational semantics. For example, he and his colleagues defined the formalism of dominance graphs for the underspecified processing of scope ambiguities, developed efficient solvers for underspecified representations, and clarified the formal relationship between different underspecification formalisms. Recently, his focus has widened to the development of efficient algorithms for natural language processing.


Week 1: Thursday, 9 August

Speaker: Michael Witbrock

Title: Logic, Knowledge and Intelligence

Abstract: One aspect of Cyc is a very large, logic-based knowledge base, but it is more than that; the Cyc project is an attempt to move us towards general Artificial Intelligence by supporting automated reasoning about a very wide variety of real-world concerns. To support that goal, Cyc also encompasses, obviously enough, an inference engine able to reason over a large, contextual knowledge base, but it also includes components for interpreting and producing natural language, acquiring knowledge and responding to user queries, and for interfacing with other software. In this talk, I'll talk about some of what we've done to apply logic to the representation of general knowledge, at scale, and to use it in the production of (somewhat) intelligent behaviours, and discuss some ways in which we might move closer to artificial intelligence.
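As a rough illustration of what reasoning over a "contextual" knowledge base involves (the constants and the microtheory name below are made up for exposition, not actual Cyc content): assertions in Cyc hold relative to contexts, or "microtheories", via an ist operator, and the inference engine chains taxonomic assertions such as isa (instance-of) and genls (subcollection-of) within and across them:

    \mathit{ist}(\mathit{BiologyMt},\ \mathit{isa}(\mathit{Rover}, \mathit{Dog}))
    \mathit{ist}(\mathit{BiologyMt},\ \mathit{genls}(\mathit{Dog}, \mathit{Mammal}))
    \therefore\ \mathit{ist}(\mathit{BiologyMt},\ \mathit{isa}(\mathit{Rover}, \mathit{Mammal}))

Localizing assertions to microtheories is what lets a very large knowledge base contain mutually inconsistent bodies of knowledge without global contradiction.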



Dr. Michael Witbrock

Dr. Michael Witbrock serves as the Vice President for Research at Cycorp, Inc. and as CEO of Cycorp Europe. At Cycorp, he has overall responsibility for corporate research, and is particularly interested in automating the process of knowledge acquisition and elaboration, extending the range of knowledge representation and reasoning to mixed logical and probabilistic representations, and in validating and elaborating knowledge in the context of task performance, particularly in tasks that involve understanding text and communicating with users. Michael received his PhD in Computer Science from Carnegie Mellon in 1996 and a BSc in Psychology from Otago University, in New Zealand, in 1985. Prior to joining Cycorp, he was Principal Scientist at Terra Lycos, working on integrating statistical and knowledge-based approaches to understanding web user behavior; a research scientist at Just Systems Pittsburgh Research Center, working on statistical text summarization; and a systems scientist at Carnegie Mellon, working on the Informedia spoken and video document information retrieval project, where he was also involved in the planning of the Experience on Demand Project. He is the author of numerous publications in areas ranging across knowledge representation and acquisition, neural networks, parallel computer architecture, multimedia information retrieval, web browser design, genetic design, computational linguistics and speech recognition, and is the holder of four US patents.


Week 2: Tuesday, 14 August

Speaker: Ede Zimmermann

Title: Painting and Opacity

Abstract: Referentially opaque verbs like "seek", "owe", and "resemble" are known to (a) allow for an ambiguity between an ordinary (specific) and a peculiar unspecific reading of one of their nominal arguments; (b) defy truth-preserving substitution of co-extensional terms; and (c) block existential inferences: (a) one may seek/owe someone/resemble a horse without necessarily seeking/owing/resembling a particular animal; (b) even if all and only arctic horses are striped, one may seek/owe someone/resemble a striped horse without seeking/owing/resembling an arctic horse; and (c) without there necessarily even being any striped (i.e. arctic) horses. Similarly, it would seem, one can paint, draw or imagine a striped horse without (a) portraying a particular animal, (b) painting an arctic horse, or (c) there being any striped, or arctic, horses. As a consequence of this analogy, verbs of depiction (like "paint", "draw", and "imagine") have often been taken to be opaque. In particular, it has been suggested that the same mechanisms that explain the anomalies (a)-(c) of opaque verbs carry over to them. In this talk I will show that none of the known semantic analyses of opacity extends to verbs of depiction, and I will present evidence that they are simply verbs of creation (as in "paint a picture") that invite a meaning shift of the nominal object, so that it denotes representations of its ordinary denotation (e.g. horse pictures instead of horses).
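For orientation, the ambiguity in (a) is standardly captured along Montague's lines, with the opaque verb taking an intensional (generalized-quantifier) argument; the following logical forms for "John seeks a horse" are a textbook illustration, not part of the abstract:

    specific (de re):      \exists x\,[\mathit{horse}(x) \land \mathit{seek}(j, {}^{\wedge}\lambda P\,{}^{\vee}P(x))]
    unspecific (de dicto): \mathit{seek}(j, {}^{\wedge}\lambda P\,\exists x\,[\mathit{horse}(x) \land {}^{\vee}P(x)])

On the unspecific reading the existential quantifier sits inside the argument of "seek", which is precisely what yields properties (a) and (c): no particular horse, indeed no horse at all, need exist.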

Thomas Ede Zimmermann

Thomas Ede Zimmermann has been a professor of linguistics at the University of Frankfurt since 1999. He has previously taught at the universities of Konstanz, Tuebingen, and Stuttgart and held visiting positions at the University of Massachusetts at Amherst and at Rutgers; he has also given courses at various summer schools, including ESSLLI and NASSLLI. His main area of expertise is the logical analysis of natural language meaning, about which he has written a number of articles published in major journals.


Week 2: Thursday, 16 August

Speaker: Ronan Reilly

Title: A model of grammar acquisition: the importance of complexity

Abstract: This talk will discuss some recent findings in the computer modelling of language acquisition. Contrary to the view enshrined in the poverty-of-the-stimulus argument, I will demonstrate that a grammar as complex as that found in child-directed speech can be learnt by tuning into its statistical properties using a simple recurrent network (SRN). The SRN succeeds in learning a complex grammar and exhibits behaviours comparable to those found in child language development. This demonstrates that statistical information is sufficient to learn the syntactic structures and categories underlying language and that statistical learning is a feasible mechanism for children to employ. Surprisingly, the complexity of the grammar does not hinder performance but rather enables the acquisition of abstract grammatical structures that enhance the network's generalisation abilities.
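For readers unfamiliar with the architecture, here is a minimal sketch of an Elman-style simple recurrent network trained to predict the next word; the toy corpus, layer sizes, learning rate, and one-step truncated backpropagation are illustrative choices of mine, not details from the talk:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for child-directed speech (hypothetical data).
    corpus = ("the dog sees the cat . the cat sees the dog . "
              "the dog runs . the cat runs .").split()
    vocab = sorted(set(corpus))
    idx = {w: i for i, w in enumerate(vocab)}
    V, H = len(vocab), 16              # vocabulary size, hidden units

    Wxh = rng.normal(0, 0.1, (H, V))   # input -> hidden
    Whh = rng.normal(0, 0.1, (H, H))   # context -> hidden (the recurrence)
    Why = rng.normal(0, 0.1, (V, H))   # hidden -> output
    bh, by = np.zeros(H), np.zeros(V)

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    lr = 0.1
    for epoch in range(500):
        h = np.zeros(H)                          # reset context layer
        for t in range(len(corpus) - 1):
            x = np.zeros(V)
            x[idx[corpus[t]]] = 1.0              # one-hot current word
            h_prev = h
            h = np.tanh(Wxh @ x + Whh @ h_prev + bh)
            p = softmax(Why @ h + by)            # next-word distribution
            dy = p.copy()
            dy[idx[corpus[t + 1]]] -= 1.0        # cross-entropy gradient
            dh = (Why.T @ dy) * (1.0 - h ** 2)   # backprop through tanh
            Why -= lr * np.outer(dy, h);  by -= lr * dy
            Wxh -= lr * np.outer(dh, x);  bh -= lr * dh
            Whh -= lr * np.outer(dh, h_prev)     # one-step truncation

    # The trained network predicts distributionally plausible continuations.
    h = np.zeros(H)
    for w in ["the", "dog"]:
        x = np.zeros(V); x[idx[w]] = 1.0
        h = np.tanh(Wxh @ x + Whh @ h + bh)
    p = softmax(Why @ h + by)
    best = np.argsort(p)[::-1][:3]
    print([(vocab[i], round(float(p[i]), 2)) for i in best])

Elman's original point, echoed in the abstract, is that the recurrent context layer lets purely statistical next-word prediction induce category-like and structure-like regularities without any grammar being built in.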

Ronan Reilly

Ronan Reilly is Professor of Computer Science at NUI Maynooth. He has a background in computer science and psychology and his research interests are in the areas of vision and language, with a specific interest in their intersection in reading (cf. http://cortex.cs.nuim.ie).