Percy Liang
Fri 13 May 2016, 11:30 - 13:00
Informatics Forum (IF-4.31/4.33)

If you have a question about this talk, please contact: Diana Dalla Costa (ddallac)

Abstract:

Can we learn if we start with zero examples, either labeled or unlabeled?  This scenario arises in new user-facing systems (such as virtual assistants for new domains), where inputs should come from users, but no users exist until we have a working system, which in turn depends on having training data.  I discuss recent work that circumvents this circular dependency by interleaving user interaction and learning.

Biography: 

Percy Liang is an Assistant Professor of Computer Science at Stanford University (B.S. from MIT, 2004; Ph.D. from UC Berkeley, 2011).  His research interests include modeling natural language semantics and developing machine learning methods that infer rich latent structures from limited supervision.  His awards include the IJCAI Computers and Thought Award (2016), an NSF CAREER Award (2016), a Sloan Research Fellowship (2015), a Microsoft Research Faculty Fellowship (2014), and a best student paper award at the International Conference on Machine Learning (2008).