Conor Durkan and Stanislaw Jastrzebski
Tue 23 Oct 2018, 11:00 - 12:00
IF 4.31/4.33

If you have a question about this talk, please contact: Gareth Beedham (gbeedham)

Conor Durkan

Title: Sequential Neural Methods for Likelihood-free Inference

Abstract: Likelihood-free inference arises when the likelihood function is intractable and cannot be evaluated explicitly, as is often the case with parametrized simulator models. The problem is typically approached using the sample-based methods of approximate Bayesian computation, but recent work has demonstrated the effectiveness of neural conditional density estimators that learn an approximate posterior or surrogate likelihood in a sequential manner, yielding state-of-the-art results. We discuss these advances and the machine learning tools they rely upon, and attempt to draw a fair comparison between the methods.
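To make the setting concrete, the classical sample-based approach mentioned above can be sketched as ABC rejection sampling: parameters are drawn from the prior, data are simulated (no likelihood evaluation needed), and parameters are kept only when the simulated data are close to the observation. This is an illustrative toy sketch, not code from the talk; the summary statistic (the sample mean) and the Gaussian toy model are assumptions chosen for simplicity.

```python
import numpy as np

def abc_rejection(simulator, prior_sample, observed, n_samples, epsilon):
    """Minimal ABC rejection sampler (illustrative sketch).

    Accepts a parameter draw whenever the simulated data's summary
    statistic falls within `epsilon` of the observed one.
    """
    accepted = []
    while len(accepted) < n_samples:
        theta = prior_sample()       # draw a parameter from the prior
        x = simulator(theta)         # simulate data; no likelihood needed
        # Compare summary statistics (here: the sample mean)
        if abs(np.mean(x) - np.mean(observed)) < epsilon:
            accepted.append(theta)
    return np.array(accepted)

# Toy example: infer the mean of a Gaussian with known unit variance.
rng = np.random.default_rng(0)
observed = rng.normal(2.0, 1.0, size=100)
posterior = abc_rejection(
    simulator=lambda t: rng.normal(t, 1.0, size=100),
    prior_sample=lambda: rng.uniform(-5, 5),
    observed=observed,
    n_samples=200,
    epsilon=0.2,
)
print(posterior.mean())  # concentrates near the observed sample mean
```

The inefficiency of this scheme (most prior draws are rejected, and the tolerance epsilon trades accuracy for cost) is precisely what motivates the sequential neural density-estimation methods the talk surveys.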


Stanislaw Jastrzębski

Title: Understanding the early phase of training of neural networks

Abstract: There is growing interest in understanding how and under what conditions neural networks generalize well. In this talk we will focus on an as-yet largely unexplored approach to answering this question: analysing the early phase of training of neural networks. The main goal of the talk is to provide a survey of recent work which points towards the critical importance of this early phase for final generalization performance.