Minimax rate of estimation for the stationary distribution of jump-processes over anisotropic Hölder classes

11/24/2020
by Chiara Amorino, et al.

We study the problem of the nonparametric estimation of the density of the stationary distribution of a multivariate stochastic differential equation with jumps (X_t), when the dimension d is larger than 3. From continuous observation of the sample path on [0, T], we show that, under anisotropic Hölder smoothness constraints, kernel-based estimators can achieve fast convergence rates. In particular, they are as fast as the rates found by Dalalyan and Reiss [9] for the estimation of the invariant density in the case without jumps, under isotropic Hölder smoothness constraints. Moreover, they are faster than the rates found by Strauch [29] for the invariant density estimation of continuous stochastic differential equations, under anisotropic Hölder smoothness constraints. Furthermore, we obtain a minimax lower bound on the L^2-risk for pointwise estimation, with the same rate up to a log(T) term. This implies that, on a class of diffusions whose invariant density belongs to the anisotropic Hölder class we consider, it is impossible to find an estimator with a rate of estimation faster than the one we propose.
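For context, the following is a minimal sketch of the type of anisotropic kernel estimator referred to above, built from the continuously observed path; the kernel K and the coordinate-wise bandwidths h_1, ..., h_d are generic placeholders, and the paper's exact construction and bandwidth choices may differ:

\hat{\pi}_{h,T}(x) \;=\; \frac{1}{T}\int_0^T \prod_{i=1}^{d} \frac{1}{h_i}\, K\!\left(\frac{x_i - X_s^{(i)}}{h_i}\right) \mathrm{d}s .

Using a separate bandwidth h_i in each coordinate is what lets the estimator adapt to anisotropic Hölder smoothness, i.e. different regularity of the invariant density in different directions.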
