
CMX Lunch Seminar

Wednesday, April 26, 2023
12:00pm to 1:00pm
Annenberg 213
Regularized SVGD, Gaussian Variational Inference and Wasserstein Gradient Flows
Krishnakumar Balasubramanian, Assistant Professor, Department of Statistics, University of California, Davis

Stein Variational Gradient Descent (SVGD) and Gaussian Variational Inference (GVI) are two optimization-inspired, deterministic, particle-based algorithms for sampling from a given target density. They are considered alternatives to the more classical randomized MCMC methods. Both approaches can be thought of as approximate but practically implementable discretizations of the Wasserstein Gradient Flow (WGF) corresponding to KL-divergence minimization. However, the theoretical properties of the SVGD and GVI methods, and their comparison with more standard MCMC methods, are largely unknown.
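As background (standard material, not specific to this talk): for a target density π, the WGF of the KL divergence KL(ρ‖π) is the Fokker-Planck-type PDE

\[
\partial_t \rho_t \;=\; \nabla \cdot \left( \rho_t \, \nabla \log \frac{\rho_t}{\pi} \right),
\]

and SVGD and GVI can be read as two different tractable surrogates for this flow: SVGD smooths the velocity field ∇ log(ρ_t/π) through a kernel, while GVI constrains ρ_t to be Gaussian.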

Mean-field analysis of SVGD reveals that the gradient flow corresponding to the SVGD algorithm (i.e., the Stein Variational Gradient Flow (SVGF)) only provides a constant-order approximation to the WGF. In the first part of the talk, I will introduce a Regularized SVGF (R-SVGF) which interpolates between the SVGF and the WGF. I will discuss various theoretical properties of the R-SVGF and its time discretization, including convergence to equilibrium, existence and uniqueness of weak solutions, and stability of the solutions under various assumptions on the target density. I will also present preliminary numerical evidence of the improved performance offered by the regularization.
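As a concrete reference point, the particle algorithm whose mean-field limit is the SVGF is the standard SVGD update of Liu and Wang (2016). Below is a minimal NumPy sketch of that update with an RBF kernel; it is background only (the talk's regularized R-SVGF is not reproduced here), and the bandwidth h and step size are illustrative choices.

```python
import numpy as np

def rbf_kernel(X, h):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradients w.r.t. x_j."""
    diffs = X[:, None, :] - X[None, :, :]        # (n, n, d): diffs[j, i] = x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)       # (n, n)
    K = np.exp(-sq_dists / (2 * h ** 2))         # (n, n)
    grad_K = -diffs * K[:, :, None] / h ** 2     # (n, n, d): grad_K[j, i] = grad_{x_j} k(x_j, x_i)
    return K, grad_K

def svgd_step(X, grad_log_pi, step=1e-1, h=1.0):
    """One SVGD update:
    x_i += step * (1/n) * sum_j [k(x_j, x_i) grad log pi(x_j) + grad_{x_j} k(x_j, x_i)]."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    scores = grad_log_pi(X)                      # (n, d): row j is grad log pi(x_j)
    phi = (K.T @ scores + grad_K.sum(axis=0)) / n
    return X + step * phi

# Example: push 100 particles toward a standard Gaussian target, grad log pi(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) + 5.0
for _ in range(500):
    X = svgd_step(X, lambda X: -X)
```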

GVI provides parametric approximations to the WGF by restricting the flow to the Bures–Wasserstein (BW) space of Gaussians endowed with the Wasserstein distance. In the second part of the talk, I will introduce the (Stochastic) Forward-Backward GVI (FB–GVI) algorithm for implementing GVI. This algorithm exploits the composite structure of the KL divergence, which can be written as the sum of a smooth term (the potential) and a non-smooth term (the entropy) over the BW space. For the FB–GVI algorithm, I will discuss state-of-the-art convergence guarantees when the target density is log-smooth and log-concave, as well as the first convergence guarantees to first-order stationary solutions when the target density is only log-smooth.
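For intuition only, here is a minimal sketch of Gaussian VI as gradient descent over the BW space, using Monte Carlo estimates of the expected gradient and Hessian of the potential V (where π ∝ exp(−V)). One caveat: this is a fully explicit scheme in the style of earlier BW gradient-descent methods, not the speaker's FB–GVI, which treats the non-smooth entropy term with a backward (proximal) step; the sample size n_mc and step size are illustrative.

```python
import numpy as np

def bw_gvi_step(m, Sigma, grad_V, hess_V, step=0.05, n_mc=64, rng=None):
    """One explicit Bures-Wasserstein gradient step on KL(N(m, Sigma) || pi),
    where pi is proportional to exp(-V). Both the potential and the entropy
    terms are handled with a forward step (unlike FB-GVI's splitting)."""
    rng = np.random.default_rng() if rng is None else rng
    d = m.shape[0]
    X = rng.multivariate_normal(m, Sigma, size=n_mc)  # samples from the current Gaussian
    g = np.mean([grad_V(x) for x in X], axis=0)       # estimate of E[grad V]
    H = np.mean([hess_V(x) for x in X], axis=0)       # estimate of E[hess V]
    M = H - np.linalg.inv(Sigma)                      # covariance direction: potential + entropy
    m_new = m - step * g
    A = np.eye(d) - step * M
    Sigma_new = A @ Sigma @ A.T                       # congruence keeps Sigma symmetric PSD
    return m_new, Sigma_new
```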

For more information, please contact Jolene Brink by phone at (626) 395-2813 or by email at [email protected], or visit the CMX website.