Rate of convergence of the smoothed empirical Wasserstein distance

05/04/2022 Ā· by Adam Block, et al.
Consider an empirical measure ā„™_n induced by n iid samples from a d-dimensional K-subgaussian distribution ā„™, and let γ = š’©(0, σ^2 I_d) be the isotropic Gaussian measure. We study the speed of convergence of the smoothed Wasserstein distance W_2(ā„™_n * γ, ā„™ * γ) = n^{-α + o(1)}, with * denoting the convolution of measures. For K < σ and in any dimension d ≄ 1 we show that α = 1/2. For K > σ in dimension d = 1 we show that the rate is slower and is given by α = (σ^2 + K^2)^2 / (4(σ^4 + K^4)) < 1/2. This resolves several open problems in <cit.>, and in particular precisely identifies the amount of smoothing σ needed to obtain a parametric rate. In addition, we also establish that D_KL(ā„™_n * γ ā€– ā„™ * γ) has rate O(1/n) for K < σ, but slows down to O((log n)^{d+1}/n) for K > σ. The surprising difference in the behavior of W_2^2 and KL implies the failure of the T_2-transportation inequality when σ < K. Consequently, the requirement K < σ is necessary for the validity of the log-Sobolev inequality (LSI) for the Gaussian mixture ā„™ * š’©(0, σ^2 I_d), closing an open problem in <cit.>, who established the LSI under precisely this condition.
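To make the d = 1 rates concrete, the following is a minimal Monte Carlo sketch (not from the paper) that estimates the exponent α by regressing log W_2(ā„™_n * γ, ā„™ * γ) on log n. It assumes ā„™ = š’©(0, K^2) as a convenient K-subgaussian example, so that ā„™ * γ is exactly š’©(0, K^2 + σ^2); the helper names `w2_1d`, `smoothed_w2`, and `estimate_alpha`, as well as the sample-size grid and replication counts, are illustrative choices, not anything prescribed by the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def w2_1d(x, y):
    # Exact W_2 between two equal-size 1-d empirical measures:
    # the optimal coupling matches sorted samples (quantile coupling).
    xs, ys = np.sort(x), np.sort(y)
    return np.sqrt(np.mean((xs - ys) ** 2))

def smoothed_w2(n, K, sigma, m=200_000):
    # Estimate W_2(P_n * gamma, P * gamma) for P = N(0, K^2), gamma = N(0, sigma^2).
    # Samples from P_n * gamma: resample the n data points with replacement
    # and add independent N(0, sigma^2) noise. P * gamma is exactly
    # N(0, K^2 + sigma^2) here; both are approximated by m-point samples.
    data = rng.normal(0.0, K, size=n)                                 # n iid draws from P
    mix = rng.choice(data, size=m) + rng.normal(0.0, sigma, size=m)   # ~ P_n * gamma
    ref = rng.normal(0.0, np.sqrt(K**2 + sigma**2), size=m)           # ~ P * gamma
    return w2_1d(mix, ref)

def estimate_alpha(K, sigma, ns=(250, 500, 1000, 2000, 4000), reps=20):
    # Fit W_2 ~ n^{-alpha}: alpha is minus the slope of log W_2 vs log n.
    logw = [np.log(np.mean([smoothed_w2(n, K, sigma) for _ in range(reps)]))
            for n in ns]
    slope = np.polyfit(np.log(ns), logw, 1)[0]
    return -slope

# Predicted exponents from the abstract: alpha = 1/2 for K < sigma, and
# alpha = (sigma^2 + K^2)^2 / (4 (sigma^4 + K^4)) < 1/2 for K > sigma.
# Note the formula equals 1/2 exactly at K = sigma, matching the two regimes.
for K, sigma in [(0.5, 1.0), (2.0, 1.0)]:
    pred = 0.5 if K < sigma else (sigma**2 + K**2) ** 2 / (4 * (sigma**4 + K**4))
    print(f"K={K}, sigma={sigma}: predicted alpha={pred:.3f}, "
          f"simulated alpha ~ {estimate_alpha(K, sigma):.3f}")
```

The quantile coupling used in `w2_1d` is the exact optimal transport plan between two equal-weight 1-d samples, which is what makes this cheap to simulate in dimension one; the finite reference sample size m introduces a small bias floor, so the fitted slope is only a rough check of the predicted exponents.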
