Alexander Koller
Fri 24 Aug 2018, 11:00 - 12:30
Informatics Forum (IF-4.31/4.33)

If you have a question about this talk, please contact: Diana Dalla Costa (ddallac)


Alexander Koller, Saarland University and Facebook AI Research. Joint work with Jonas Groschwitz, Meaghan Fowlie, Matthias Lindemann, and Mark Johnson.

Much recent research on semantic parsing has focused on learning to map natural-language sentences to graphs which represent the meaning of the sentence, such as Abstract Meaning Representations (AMRs) and MRS graphs. In this talk, I will discuss methods for semantic parsing into graphs which aim to make the compositional structure of the semantic representations explicit. This connects semantic parsing to a fundamental principle of linguistic semantics, and should improve generalization to unseen data and, with it, parsing accuracy.

I will first introduce two graph algebras, the HR algebra from the theoretical literature and our own apply-modify (AM) algebra, and show how to define symbolic grammars that map between strings and graphs using these algebras. Compared to the HR algebra, the AM algebra drastically reduces the number of possible compositional structures for a given graph, but it still permits linguistically plausible analyses for a variety of nontrivial semantic phenomena.
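To give a flavor of the kind of composition involved, here is a heavily simplified sketch of an apply-style operation on graph fragments with named open "source" slots. All class and function names are illustrative inventions for this sketch; the actual AM algebra is defined over typed s-graphs and also includes a modify operation, neither of which is captured faithfully here.

```python
class SGraph:
    """A toy graph fragment with a root and named open sources (e.g. 's', 'o')."""
    def __init__(self, nodes, edges, root, sources):
        self.nodes = set(nodes)       # node labels, assumed unique here
        self.edges = set(edges)       # (parent, edge_label, child) triples
        self.root = root
        self.sources = dict(sources)  # source name -> placeholder node

def apply(head, source, argument):
    """APP_source: fill the head's open source slot with the argument's root."""
    placeholder = head.sources[source]
    # rename the placeholder node to the argument's root, then merge the graphs
    edges = {(argument.root if p == placeholder else p, lbl,
              argument.root if c == placeholder else c)
             for (p, lbl, c) in head.edges}
    nodes = (head.nodes - {placeholder}) | argument.nodes
    sources = {n: v for n, v in head.sources.items() if n != source}
    sources.update(argument.sources)
    return SGraph(nodes, edges | argument.edges, head.root, sources)

# "sleep" with an open subject source; "cat" supplied as its argument
sleep = SGraph({"sleep", "<s>"}, {("sleep", "ARG0", "<s>")}, "sleep", {"s": "<s>"})
cat = SGraph({"cat"}, set(), "cat", {})
g = apply(sleep, "s", cat)
# g now contains the edge ("sleep", "ARG0", "cat") and has no open sources left
```

The point the sketch tries to convey is that each operation both combines graphs and discharges an open slot, which is what makes the set of possible compositional structures for a given graph so much smaller than under the HR algebra.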

I will then report on a neural semantic parser which learns to map sentences into terms over the AM algebra. This semantic parser combines a neural supertagger (which predicts elementary graphs for each word in the sentence) with a neural dependency parser (which predicts the structure of the AM terms). By constraining the search to AM terms which also satisfy certain simple type constraints, we achieve state-of-the-art (pre-ACL) accuracy in AMR parsing. One advantage of the model is that it generalizes neatly to other semantic parsing problems, such as semantic parsing into MRS or DRT.
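The division of labor between the two neural components can be caricatured as follows. This is a deliberately crude greedy sketch with made-up scores and a trivial type check; the real system uses trained neural scorers and a constrained decoder, and the operation names below (APP_s, MOD_m) merely echo the algebra informally.

```python
def parse(words, supertag_scores, edge_scores, compatible):
    """Pick the best supertag per word (supertagger), then keep the
    best-scoring operation edges whose types are compatible (dependency
    parser + type constraint). Purely illustrative, not the real decoder."""
    tags = {w: max(supertag_scores[w], key=supertag_scores[w].get)
            for w in words}
    term = []
    for (head, dep, op), _score in sorted(edge_scores.items(),
                                          key=lambda kv: -kv[1]):
        if compatible(tags[head], tags[dep], op):  # simple type constraint
            term.append((head, op, dep))
    return tags, term

# Toy input "the cat sleeps" with invented scores
words = ["the", "cat", "sleeps"]
supertag_scores = {"the": {"det": 0.9}, "cat": {"noun": 0.8},
                   "sleeps": {"verb[s]": 0.7}}
edge_scores = {("sleeps", "cat", "APP_s"): 0.9, ("cat", "the", "MOD_m"): 0.6}
# a verb may only take a subject argument if its type advertises an open 's'
ok = lambda h, d, op: not (op == "APP_s" and "[s]" not in h)

tags, term = parse(words, supertag_scores, edge_scores, ok)
# term == [("sleeps", "APP_s", "cat"), ("cat", "MOD_m", "the")]
```

Even in this caricature, the type check is doing the essential work: it prunes dependency structures that could never assemble into a well-formed term, which is how the constrained search stays tractable.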


Alexander Koller is a Professor of Computational Linguistics in the Department of Language Science and Technology at Saarland University. His research interests span a broad range of topics in computational linguistics, including semantics, parsing, and dialogue. After spending his academic youth with grammar-based approaches, he is currently using a sabbatical at Facebook AI Research in Paris to recalibrate his research methods towards deep learning. Alexander received his PhD from Saarland University and did postdocs at Columbia University and the University of Edinburgh, and he is looking forward to reconnecting with everyone at Edinburgh.