Second order Poincaré inequalities and de-biasing arbitrary convex regularizers when p/n → γ

12/26/2019
by Pierre C. Bellec, et al.

A new Central Limit Theorem (CLT) is developed for random variables of the form ξ = z^⊤f(z) − div f(z) where z ∼ N(0, I_n). The normal approximation is proved to hold when the squared norm ‖f(z)‖^2 dominates the squared Frobenius norm ‖∇f(z)‖_F^2 in expectation. Applications of this CLT are given for the asymptotic normality of de-biased estimators in linear regression with correlated design and convex penalty in the regime p/n → γ ∈ (0, ∞). For the estimation of linear functions 〈a_0, β〉 of the unknown coefficient vector β, this analysis leads to asymptotic normality of the de-biased estimate for most normalized directions a_0, where "most" is quantified in a precise sense. This asymptotic normality holds for any coercive convex penalty if γ < 1 and for any strongly convex penalty if γ ≥ 1. In particular, the penalty need not be separable or permutation invariant. For the group Lasso, a simple condition is given that implies asymptotic normality for a fixed direction a_0. For the Lasso, this condition reduces to λ^2‖Σ^{-1}a_0‖_1^2 / R̅ → 0, where R̅ is the noiseless prediction risk.
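To make the CLT statement concrete, here is a minimal Monte Carlo sketch (an illustration, not from the paper). The map f(z) = (1 + tanh(z_1/√n))·u with u = (1, …, 1) is an assumed toy choice for which E‖f(z)‖^2 grows like n while E‖∇f(z)‖_F^2 stays bounded, so the domination condition of the theorem holds and the standardized samples of ξ should be close to N(0, 1).

```python
# Minimal Monte Carlo sketch (illustrative, not from the paper) of the CLT for
# xi = z^T f(z) - div f(z) with z ~ N(0, I_n).
# Toy map: f(z) = (1 + tanh(z_1 / sqrt(n))) * u with u = (1, ..., 1), so that
# E||f(z)||^2 is of order n while E||grad f(z)||_F^2 = E[h'(z_1/sqrt(n))^2] = O(1),
# i.e. the squared norm of f dominates the squared Frobenius norm of its Jacobian.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, reps = 500, 20_000

def xi(z: np.ndarray) -> float:
    s = z[0] / np.sqrt(n)
    h = 1.0 + np.tanh(s)          # common value of every coordinate of f(z)
    # z^T f(z) = h * sum_i z_i; only the dependence on z_1 contributes to the
    # divergence: div f(z) = d f_1 / d z_1 = h'(s) / sqrt(n), h'(s) = 1 - tanh(s)^2.
    div_f = (1.0 - np.tanh(s) ** 2) / np.sqrt(n)
    return h * z.sum() - div_f

samples = np.array([xi(rng.standard_normal(n)) for _ in range(reps)])
print("sample mean (Stein's identity gives E[xi] = 0):", samples.mean())

# Rough normality diagnostic: Kolmogorov-Smirnov distance of the standardized
# samples to N(0, 1); conservative here since mean/std are estimated from the data.
standardized = (samples - samples.mean()) / samples.std()
print(stats.kstest(standardized, "norm"))
```

The domination condition is sufficient rather than necessary. For contrast, if f(z) = h(z_1)·u with u ⊥ e_1, then ξ = h(z_1)·u^⊤z is a Gaussian scale mixture rather than Gaussian, and indeed E‖∇f(z)‖_F^2 is then of the same order as E‖f(z)‖^2.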
