Improved Stein Variational Gradient Descent with Importance Weights

10/02/2022
by Lukang Sun, et al.

Stein Variational Gradient Descent (SVGD) is a popular sampling algorithm used in various machine learning tasks. It is well known that SVGD arises from a discretization of the kernelized gradient flow of the Kullback-Leibler divergence D_KL(·|π), where π is the target distribution. In this work, we propose to enhance SVGD via the introduction of importance weights, which leads to a new method that we call β-SVGD. In the continuous-time and infinite-particle regime, the time for this flow to converge to the equilibrium distribution π, quantified by the Stein Fisher information, depends only very weakly on the initial distribution ρ_0 and on π. This is very different from the kernelized gradient flow of the Kullback-Leibler divergence, whose time complexity depends on D_KL(ρ_0|π). Under certain assumptions, we provide a descent lemma for the population-limit β-SVGD, which covers the descent lemma for the population-limit SVGD when β → 0. We also illustrate the advantages of β-SVGD over SVGD through simple experiments.
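For context on the kernelized gradient-flow discretization mentioned above, the following is a minimal sketch of one plain SVGD particle update in Python/NumPy. The function names (rbf_kernel, svgd_step), the RBF bandwidth choice, and the toy Gaussian target are illustrative assumptions; the abstract does not spell out the β-SVGD importance-weighted update, so only the standard SVGD step is sketched, not the paper's method.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    # Pairwise RBF kernel K[i, j] = exp(-||x_i - x_j||^2 / (2 h^2))
    diff = X[:, None, :] - X[None, :, :]                 # shape (n, n, d)
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))
    # grad_K[i, j] = d/dx_i k(x_i, x_j) = -(x_i - x_j) / h^2 * K[i, j]
    grad_K = -diff / h ** 2 * K[:, :, None]
    return K, grad_K

def svgd_step(X, grad_log_pi, step=0.1, h=1.0):
    """One standard SVGD update (illustrative sketch, not the paper's beta-SVGD):
    x_i <- x_i + step * (1/n) * sum_j [k(x_j, x_i) grad log pi(x_j) + grad_{x_j} k(x_j, x_i)]."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    drift = K.T @ grad_log_pi(X)      # sum_j k(x_j, x_i) grad log pi(x_j): pulls particles toward high density
    repulsion = grad_K.sum(axis=0)    # sum_j grad_{x_j} k(x_j, x_i): pushes particles apart
    return X + step * (drift + repulsion) / n

if __name__ == "__main__":
    # Toy target: standard 2D Gaussian, so grad log pi(x) = -x.
    rng = np.random.default_rng(0)
    X = rng.normal(loc=3.0, scale=1.0, size=(200, 2))    # particles start far from the target
    for _ in range(500):
        X = svgd_step(X, lambda x: -x, step=0.05, h=1.0)
    print("particle mean:", X.mean(axis=0))               # should drift toward the origin
```

The paper's β-SVGD reweights the particle contributions with importance weights built from ρ and π; since the exact weighting is defined in the full text rather than the abstract, it is not reproduced in this sketch.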
