Alain Durmus (ENS Paris-Saclay)
Wed 16 Jan 2019, 16:00 - 17:00
JCMB 5323

If you have a question about this talk, please contact: Kostas Zygalakis (kzygalak)

We provide new insights into the Unadjusted Langevin Algorithm (ULA). We show that this method can be formulated as a first-order optimization algorithm for an objective functional defined on the Wasserstein space of order 2. Using this interpretation and techniques borrowed from convex optimization, we give a non-asymptotic analysis of the method for sampling from a log-concave, smooth target distribution. Our proofs extend easily to Stochastic Gradient Langevin Dynamics, a popular extension of the Unadjusted Langevin Algorithm. Finally, this interpretation leads to a new methodology for sampling from a non-smooth target distribution, for which a similar analysis is carried out.
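For readers unfamiliar with the method the abstract analyzes, the standard ULA recursion is x_{k+1} = x_k - γ∇U(x_k) + √(2γ) ξ_{k+1}, where π ∝ exp(-U) is the target and ξ_{k+1} is standard Gaussian noise. Below is a minimal illustrative sketch (not code from the talk); the target, step size, and function names are chosen here for illustration only.

```python
import numpy as np

def ula_sample(grad_U, x0, step, n_iters, rng):
    """Unadjusted Langevin Algorithm:
    x_{k+1} = x_k - step * grad_U(x_k) + sqrt(2 * step) * xi_{k+1},
    with xi_{k+1} ~ N(0, I). For SGLD, grad_U would be replaced by an
    unbiased stochastic estimate of the gradient (e.g. from a mini-batch)."""
    x = np.array(x0, dtype=float)
    samples = np.empty((n_iters, x.size))
    for k in range(n_iters):
        noise = rng.standard_normal(x.size)
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * noise
        samples[k] = x
    return samples

# Example target: standard Gaussian in 2D, i.e. U(x) = ||x||^2 / 2,
# so grad_U(x) = x. This U is smooth and (strongly) log-concave,
# matching the setting of the non-asymptotic analysis.
rng = np.random.default_rng(0)
samples = ula_sample(lambda x: x, x0=np.zeros(2), step=0.05,
                     n_iters=20000, rng=rng)

# Discard a burn-in period; empirical moments should be close to N(0, I),
# up to a bias of order `step` inherent to the unadjusted scheme.
post = samples[5000:]
print("mean:", post.mean(axis=0), "var:", post.var(axis=0))
```

Note that ULA applies no Metropolis correction, so its stationary distribution differs from the target by a step-size-dependent bias; the Wasserstein viewpoint in the talk quantifies this non-asymptotically.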