Jonathan Schwarz

Tue 09 Apr 2019, 11:00 - 12:00

IF 4.31/4.33

If you have a question about this talk, please contact: Gareth Beedham (gbeedham)

Probabilistic approaches to Meta- and Continual-Learning

Jonathan Schwarz, Google DeepMind

https://jonathan-schwarz.github.io/

Much of the recent progress in machine learning has been fuelled by the growth in the amount and diversity of available data. This raises the question of whether machine learning systems necessarily need large amounts of data to solve a task well. In this talk, I will introduce an exciting recent line of work that aims to learn from a distribution over tasks, each with a comparatively small amount of data. By exploiting structure shared between such related problems, algorithms learn to explain the heterogeneity in this distribution from a small number of samples, allowing for data-efficient learning on a new problem. When such a learning algorithm is confronted with a new task, there is inherent uncertainty about the properties of the problem at hand. It is therefore imperative not to learn a single fixed-parameter model, but to model the full set of hypotheses that explain the observed data. As an example, in recommender systems each user can be viewed as a regression task with a small associated data set, and the goal is to learn a "prior" from similar users so that the system can rapidly recommend relevant items.
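To make the recommender-system example concrete, here is a minimal sketch (not taken from the talk itself) of the idea: each user is a small Bayesian linear-regression task, a shared prior over regression weights is estimated from many users, and that prior then lets us form a posterior for a brand-new user from a single observation. All names, dimensions, and variances below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each "user" is a small regression task
# y = X @ w_user + noise, with per-user weights drawn around a
# shared (unknown) prior mean that we want to meta-learn.
true_prior_mean = np.array([2.0, -1.0])
n_users, n_points, dim = 50, 5, 2

tasks = []
for _ in range(n_users):
    w = true_prior_mean + 0.1 * rng.standard_normal(dim)
    X = rng.standard_normal((n_points, dim))
    y = X @ w + 0.01 * rng.standard_normal(n_points)
    tasks.append((X, y))

# "Meta-learning" step: a simple empirical-Bayes estimate of the
# prior mean, averaging the per-user least-squares solutions.
w_hats = [np.linalg.lstsq(X, y, rcond=None)[0] for X, y in tasks]
prior_mean = np.mean(w_hats, axis=0)

# New user with only ONE rating: the Gaussian posterior mean combines
# the learned prior with the scarce data (a ridge-style update).
w_new = true_prior_mean + 0.1 * rng.standard_normal(dim)
X_new = rng.standard_normal((1, dim))
y_new = X_new @ w_new

tau2, sigma2 = 0.1 ** 2, 0.01 ** 2  # assumed prior / noise variances
A = X_new.T @ X_new / sigma2 + np.eye(dim) / tau2
b = X_new.T @ y_new / sigma2 + prior_mean / tau2
posterior_mean = np.linalg.solve(A, b)

print("learned prior mean:", np.round(prior_mean, 2))
print("posterior for new user:", np.round(posterior_mean, 2))
```

The point of the sketch is the second half: with a learned prior, a single observation already yields a sensible estimate for the new user, whereas fitting that user's two weights from one data point alone would be ill-posed.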

In this talk, I will describe a number of recent projects I have been involved in, each introducing probabilistic methods that allow us to leverage information from related problems. As this is a very general paradigm, we will cover diverse problems ranging from sequential classification to image inpainting, as well as decision-making settings such as Bayesian optimisation and contextual bandits.