Stochastic Modified Flows for Riemannian Stochastic Gradient Descent

Benjamin Gess, Sebastian Kassing, Nimit Rana

Research output: Contribution to journal › Article › peer-review

Abstract

We give quantitative estimates for the rate of convergence of Riemannian stochastic gradient descent (RSGD) to the Riemannian gradient flow and to a diffusion process, the so-called Riemannian stochastic modified flow (RSMF). Using tools from stochastic differential geometry, we show that, in the small learning rate regime, RSGD can be approximated by the solution of the RSMF driven by an infinite-dimensional Wiener process. The RSMF accounts for the random fluctuations of RSGD and thereby achieves a higher order of approximation than the deterministic Riemannian gradient flow. RSGD is built using the concept of a retraction map, that is, a cost-efficient approximation of the exponential map, and we prove quantitative bounds for the weak error of the diffusion approximation under assumptions on the retraction map, the geometry of the manifold, and the random estimators of the gradient.
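
For readers unfamiliar with the scheme, the update rule described in the abstract can be written out as follows. This is a minimal notational sketch, not the paper's own statement: the symbols M, R_x, h, and V_n are our choices of notation, and the drift correction that a modified flow typically carries is suppressed for brevity.

% Minimal sketch of retraction-based RSGD (our notation, not the paper's):
% M: a Riemannian manifold, f: M -> R the objective function,
% R_x: T_x M -> M a retraction at x (a cheap first-order surrogate
%      of the exponential map exp_x),
% h > 0: the learning rate,
% V_n: an unbiased random estimator of the Riemannian gradient.
\[
  X_{n+1} \;=\; R_{X_n}\!\bigl(-h\,V_n(X_n)\bigr),
  \qquad
  \mathbb{E}\bigl[V_n(x)\bigr] \;=\; \operatorname{grad} f(x).
\]
% As h -> 0, the first-order deterministic limit is the Riemannian
% gradient flow dY_t = -grad f(Y_t) dt; schematically, the stochastic
% modified flow in addition retains the O(sqrt(h)) fluctuations of the
% gradient noise (sigma here is a schematic diffusion coefficient):
\[
  \mathrm{d}Y_t \;=\; -\operatorname{grad} f(Y_t)\,\mathrm{d}t
  \;+\; \sqrt{h}\,\sigma(Y_t)\,\mathrm{d}W_t.
\]

On the unit sphere, for instance, the normalization map R_x(v) = (x + v)/||x + v|| is a standard retraction: it returns the iterate to the manifold without computing geodesics, which is what makes retraction-based updates cost-efficient in practice.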
Original language: English
Pages (from-to): 3288-3314
Journal: SIAM Journal on Control and Optimization
Volume: 62
Issue number: 6
Publication status: Published - 13 Dec 2024

Bibliographical note

This is an author-produced version of the published paper. Uploaded in accordance with the University’s Research Publications and Open Access policy.
