George Papamakarios and Matthew Graham
Tue 31 Jan 2017, 11:00 - 12:00
IF 4.31/4.33

If you have a question about this talk, please contact: Gareth Beedham (gbeedham)

George Papamakarios

Title:

Inference as Learning

Abstract:

How can we do Bayesian inference when the likelihood is not available? This situation arises in simulator-based models, which are easy to simulate from but whose likelihood is intractable. One can nevertheless learn to perform inference in such models from simulation data alone, by casting inference as a learning problem. In this talk, I will describe a strategy for doing this efficiently using Bayesian conditional density estimation, and compare it with established likelihood-free inference techniques such as Approximate Bayesian Computation.
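
To make the abstract's idea concrete, the following is a minimal numerical sketch (not the Bayesian conditional density estimation method of the talk; the toy model and all names are illustrative). It simulates (theta, x) pairs from a prior and simulator whose true posterior is known, fits a simple conditional Gaussian q(theta | x) to the simulation data alone, evaluates it at the observed data, and runs rejection ABC on the same simulations for comparison.

    # Toy model with a known answer: prior theta ~ N(0, 1), simulator
    # x ~ N(theta, 1); the exact posterior given x_obs is N(x_obs / 2, 1/2).
    import numpy as np

    rng = np.random.default_rng(0)

    # 1. Simulate (theta, x) pairs from the prior and the simulator.
    n = 100_000
    theta = rng.normal(0.0, 1.0, size=n)  # prior draws
    x = rng.normal(theta, 1.0)            # one simulation per draw

    # 2. Learn a conditional density q(theta | x) from simulation data only.
    #    Here q is a conditional Gaussian N(a*x + b, s2) fitted by least
    #    squares; the talk uses a richer (Bayesian) conditional estimator.
    A = np.stack([x, np.ones(n)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, theta, rcond=None)
    s2 = np.var(theta - (a * x + b))

    # 3. Evaluate the learned conditional at the observed data: this is the
    #    approximate posterior, obtained without evaluating any likelihood.
    x_obs = 1.5
    print(f"learned posterior: N({a * x_obs + b:.3f}, {s2:.3f})")
    print(f"true posterior:    N({x_obs / 2:.3f}, 0.500)")

    # For comparison, rejection ABC: keep the simulated thetas whose x
    # landed within eps of x_obs. Accuracy requires a small eps, but then
    # most simulations are wasted, which is the inefficiency the learned
    # approach avoids.
    eps = 0.1
    kept = theta[np.abs(x - x_obs) < eps]
    print(f"rejection ABC:     mean {kept.mean():.3f}, var {kept.var():.3f} "
          f"({kept.size}/{n} simulations accepted)")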

Matthew Graham

Title:

Inference in differentiable generative models

Abstract:

Many generative models can be expressed as a differentiable function of random inputs drawn from some simple probability density. This framework includes both models specified by deep network architectures, such as the generators of Variational Autoencoders and Generative Adversarial Networks, and a large class of procedurally defined simulator models for which no closed-form likelihood is available. In this talk I will present a method for performing efficient MCMC inference in such models when conditioning on observations of the model output. For some models this offers an asymptotically exact inference method where Approximate Bayesian Computation might otherwise be employed. I will attempt to motivate the method visually and show how leveraging gradient information can give significantly improved efficiency over alternative methods.
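
As a rough companion sketch (again illustrative, not the method of the talk, which targets the harder case of conditioning on exact observations of the model output): express a toy generative model as a differentiable function g of a standard normal input u, condition on a noisy observation of g(u), and run Metropolis-adjusted Langevin MCMC on u, with the gradient of g written by hand in place of the automatic differentiation a deep-network generator would provide.

    # Gradient-based MCMC in the input space of a differentiable generator.
    # Model: u ~ N(0, 1), y = g(u) + N(0, sigma^2), with g(u) = u^3 + u.
    import numpy as np

    rng = np.random.default_rng(1)
    sigma, y_obs = 0.1, 2.0   # noise scale (assumed known) and observation

    def g(u):                 # the differentiable generator
        return u**3 + u

    def grad_g(u):            # its derivative (autodiff in practice)
        return 3 * u**2 + 1

    def log_post(u):          # log p(u | y_obs) up to an additive constant
        return -0.5 * u**2 - 0.5 * ((y_obs - g(u)) / sigma) ** 2

    def grad_log_post(u):
        return -u + grad_g(u) * (y_obs - g(u)) / sigma**2

    def mala(u, step, n_samples):
        # Metropolis-adjusted Langevin: gradient-informed proposals,
        # accepted or rejected so the chain targets the exact posterior.
        samples = []
        for _ in range(n_samples):
            fwd = u + step * grad_log_post(u)          # forward drift
            prop = fwd + np.sqrt(2 * step) * rng.normal()
            rev = prop + step * grad_log_post(prop)    # reverse drift
            log_alpha = (log_post(prop) - log_post(u)
                         - (u - rev) ** 2 / (4 * step)
                         + (prop - fwd) ** 2 / (4 * step))
            if np.log(rng.uniform()) < log_alpha:
                u = prop
            samples.append(u)
        return np.array(samples)

    samples = mala(u=1.0, step=1e-4, n_samples=20_000)[5_000:]
    print(f"posterior mean of u: {samples.mean():.3f}")
    print(f"g(posterior mean):   {g(samples.mean()):.3f} (y_obs = {y_obs})")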