Presenter: Sam Livingstone
Recent advances in gradient-based Markov chain Monte Carlo
Markov chain Monte Carlo (MCMC) is a powerful tool for computing
expectations with respect to unnormalized probability measures, and is
widely used in fields such as Bayesian statistics and statistical physics.
When the measure of interest has a smooth density, state-of-the-art
MCMC algorithms rely on gradient information. I will review some
popular gradient-based algorithms for MCMC such as MALA and
Hamiltonian Monte Carlo, discuss what is known theoretically about the
mixing times of the resulting Markov chains, and then introduce some
recent extensions, along with alternative gradient-based schemes with
enhanced robustness properties according to various criteria.
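As a point of reference for the algorithms the talk covers, here is a minimal sketch of the Metropolis-adjusted Langevin algorithm (MALA) for a one-dimensional target; the target, step size, and function names are illustrative choices, not taken from the talk:

```python
import numpy as np

def mala(log_pi, grad_log_pi, x0, step, n_samples, rng):
    """Metropolis-adjusted Langevin algorithm for a 1-D target.

    Proposal: y = x + (step**2 / 2) * grad_log_pi(x) + step * N(0, 1),
    accepted with the usual Metropolis-Hastings correction.
    """
    def log_q(y, x):
        # Log-density (up to a constant) of the Gaussian proposal y | x.
        mean = x + 0.5 * step**2 * grad_log_pi(x)
        return -((y - mean) ** 2) / (2 * step**2)

    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        # Langevin proposal: gradient drift plus Gaussian noise.
        y = x + 0.5 * step**2 * grad_log_pi(x) + step * rng.standard_normal()
        # Metropolis-Hastings acceptance ratio on the log scale.
        log_alpha = log_pi(y) - log_pi(x) + log_q(x, y) - log_q(y, x)
        if np.log(rng.uniform()) < log_alpha:
            x = y
        samples[i] = x
    return samples

# Standard normal target: log pi(x) = -x^2 / 2 up to a constant.
rng = np.random.default_rng(0)
draws = mala(lambda x: -0.5 * x**2, lambda x: -x,
             x0=0.0, step=1.0, n_samples=20000, rng=rng)
print(draws.mean(), draws.var())
```

The sample mean and variance should be close to 0 and 1, the moments of the standard normal target. Dropping the accept/reject step recovers the unadjusted Langevin algorithm, which is biased; the Metropolis correction is what makes the chain exactly invariant for the target.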
Samuel Livingstone is an associate professor in the Department of
Statistical Science at University College London. His main research
focus is to develop and study principled Markov chain Monte Carlo
algorithms for Bayesian inference. He has dedicated much effort to
developing a better understanding of gradient-based algorithms in
continuous spaces, such as Langevin dynamics and Hamiltonian Monte
Carlo, and has also worked on well-known problems in discrete state
spaces such as Bayesian variable selection. He also has an interest
in applications, in particular in health data science.
If you would like to attend, please email email@example.com for the Zoom details.