Kody Law (Manchester) |

Wed 30 Jan 2019, 16:00 - 17:00 |

JCMB 5323 |

If you have a question about this talk, please contact: Kostas Zygalakis (kzygalak)

Bayesian inference provides a principled and well-defined approach to the integration of data into an a priori known (prior) distribution, resulting in a posterior distribution. This talk concerns the problem of inference when the posterior measure involves continuous models that require approximation before inference can be performed. Typically one cannot sample from the posterior distribution directly, but can at best only evaluate it up to a normalizing constant. One must therefore resort to computationally intensive inference algorithms in order to construct estimators. These algorithms are typically of Monte Carlo type, and include, for example, Markov chain Monte Carlo, importance sampling, and sequential Monte Carlo samplers. The multilevel Monte Carlo method provides a way of balancing discretization error and sampling error across a hierarchy of approximation levels, so that the total computational cost is minimized. Recently this method has been applied to computationally intensive inference. This non-trivial task can be achieved in a variety of ways. This talk will review three primary strategies which have been successfully employed to achieve optimal (or canonical) convergence rates, that is, faster convergence than i.i.d. sampling at the finest discretization level. Some of the specific resulting algorithms, and applications, will also be presented.
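The multilevel balancing mentioned above can be illustrated with a minimal sketch (not taken from the talk, and simpler than the inference setting it discusses): a plain multilevel Monte Carlo estimator of E[X_T] for geometric Brownian motion dX = μX dt + σX dW, discretized by Euler-Maruyama with 2^l time steps on level l. The telescoping sum E[P_L] = E[P_0] + Σ_l E[P_l − P_{l−1}] is estimated level by level, with fine and coarse paths coupled through shared Brownian increments so that the correction terms have small variance and need few samples. All parameter values and sample sizes below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, T, X0 = 0.05, 0.2, 1.0, 1.0  # illustrative GBM parameters

def level_estimator(l, N):
    """Monte Carlo estimate of E[P_l - P_{l-1}], P_l = Euler-Maruyama X_T
    with 2**l steps, using the same Brownian increments on both levels."""
    nf = 2 ** l                      # number of fine time steps
    hf = T / nf
    dW = rng.normal(0.0, np.sqrt(hf), size=(N, nf))
    Xf = np.full(N, X0)
    for n in range(nf):              # fine Euler-Maruyama path
        Xf = Xf + mu * Xf * hf + sigma * Xf * dW[:, n]
    if l == 0:
        return Xf.mean()             # base level: plain Monte Carlo
    nc, hc = nf // 2, 2 * hf
    Xc = np.full(N, X0)
    for n in range(nc):              # coarse path reuses summed fine increments
        Xc = Xc + mu * Xc * hc + sigma * Xc * (dW[:, 2 * n] + dW[:, 2 * n + 1])
    return (Xf - Xc).mean()          # coupled correction term

# Telescoping sum over levels; sample sizes shrink as the level variance decays,
# which is where the cost saving over single-level i.i.d. sampling comes from.
L = 6
estimate = sum(level_estimator(l, N=10000 // 2 ** l + 100) for l in range(L + 1))
print(estimate)                      # should lie near the exact value X0 * exp(mu * T)
```

In the Bayesian setting of the talk, the quantity P_l would instead be a posterior expectation under a level-l approximation of the forward model, and the samples come from MCMC, importance, or sequential Monte Carlo samplers rather than i.i.d. simulation; the cost-balancing principle is the same.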