On stochastic gradient Langevin dynamics with dependent data streams: the fully non-convex case

Abstract

We consider the problem of sampling from a target distribution that is not necessarily log-concave. Non-asymptotic convergence bounds for the Stochastic Gradient Langevin Dynamics (SGLD) algorithm are established in a suitable Wasserstein-type distance, even when the gradient estimates are driven by dependent data streams. Our estimates are sharper and uniform in the number of iterations, in contrast to those of previous studies.
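For readers unfamiliar with the algorithm, the sketch below shows a generic SGLD iteration fed by a (possibly dependent) data stream. It is illustrative only and not the paper's precise setting: the step size, inverse temperature, AR(1) data stream, and toy double-well potential are all assumptions made for the example.

```python
import numpy as np

def sgld(theta0, grad_U, data_stream, step_size=1e-3, beta=1.0, n_iter=1000, rng=None):
    """Generic SGLD recursion:
    theta_{k+1} = theta_k - step_size * grad_U(theta_k, x_k)
                  + sqrt(2 * step_size / beta) * xi_k,
    where x_k is the next element of a (possibly dependent) data stream
    and xi_k is standard Gaussian noise."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_iter):
        x = next(data_stream)                      # one data point from the stream
        noise = rng.standard_normal(theta.shape)   # injected Gaussian noise
        theta = theta - step_size * grad_U(theta, x) \
                + np.sqrt(2.0 * step_size / beta) * noise
    return theta

# Illustrative usage: a dependent (AR(1)) data stream and a non-convex potential.
def ar1_stream(phi=0.5, sigma=1.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    x = 0.0
    while True:
        x = phi * x + sigma * rng.standard_normal()
        yield x

def grad_U(theta, x):
    # Gradient of a toy double-well potential U(theta) = theta^4/4 - theta^2/2,
    # perturbed by the current data point x (hence non-convex in theta).
    return theta**3 - theta + 0.1 * x

sample = sgld(theta0=np.array([0.0]), grad_U=grad_U, data_stream=ar1_stream(),
              step_size=1e-2, beta=1.0, n_iter=5000)
```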

Citation information

Chau, N. H., Moulines, É., Rásonyi, M., Sabanis, S., and Zhang, Y. (2018). On stochastic gradient Langevin dynamics with dependent data streams: the fully non-convex case. In press, SIAM Journal on Mathematics of Data Science. arXiv:1905.13142 [math.ST].

Turing affiliated authors