Reproducing Kernel Hilbert Spaces Approximation Bounds

03/28/2020
by Ata Deniz Aydin, et al.

We find probability error bounds for approximations of functions $f$ in a separable reproducing kernel Hilbert space $H$ with reproducing kernel $K$ on a base space $X$, firstly in terms of finite linear combinations of functions of type $K_{x_i}$, and then in terms of the projection $\pi^n_x$ onto $\mathrm{Span}\{K_{x_i}\}_{i=1}^n$, for random sequences of points $x=(x_i)_i$ in the base space $X$. Previous results demonstrate that, for sequences of points $(x_i)_{i=1}^{\infty}$ constituting a so-called uniqueness set, the orthogonal projections $\pi^n_x$ onto $\mathrm{Span}\{K_{x_i}\}_{i=1}^n$ converge in the strong operator topology to the identity operator. The main result shows the following: for a given probability measure $P$, let $P_K$ be the measure defined by $dP_K(x) = K(x,x)\,dP(x)$, $x \in X$, and let $H_P$ denote the reproducing kernel Hilbert space that is the operator range of the nonexpansive operator $L^2(X;P_K) \ni \lambda \mapsto L_{P,K}\lambda := \int_X \lambda(x)\,K_x\,dP(x) \in H$, where the integral exists in the Bochner sense; then, under the assumption that $H_P$ is dense in $H$, any sequence of points sampled independently from $P$ yields a uniqueness set with probability $1$. This result improves on previous error bounds in weaker norms, such as uniform or $L^p$ norms, which yield only convergence in probability and not almost sure convergence. Two examples that show the applicability of this result, to a uniform distribution on a compact interval and to the Hardy space $H^2(\mathbb{D})$, are presented as well.
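A minimal numerical sketch of the object the abstract studies: the orthogonal projection $\pi^n_x f$ onto $\mathrm{Span}\{K_{x_i}\}_{i=1}^n$ for points $x_i$ drawn independently from $P$, here taken to be the uniform distribution on a compact interval as in the paper's first example. The Gaussian kernel, the specific target $f$, and all helper names are assumptions made for illustration only, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(s, t, gamma=10.0):
    # K(s, t) = exp(-gamma * (s - t)^2), a positive-definite kernel on R
    # (an assumption of this sketch; the paper treats a general reproducing kernel K).
    return np.exp(-gamma * (s - t) ** 2)

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(0.0, 1.0, size=n)   # x_i ~ P = Uniform[0, 1], sampled independently

# Target f chosen as a finite kernel combination, so f lies in H and its inner
# products with the K_{x_i} follow from the reproducing property: <f, K_{x_i}>_H = f(x_i).
centers = np.array([0.2, 0.5, 0.8])
coeffs = np.array([1.0, -0.5, 0.7])          # f = sum_j coeffs[j] * K_{centers[j]}

# Gram matrix G_{ij} = <K_{x_i}, K_{x_j}>_H = K(x_i, x_j)
G = gaussian_kernel(x[:, None], x[None, :])
b = gaussian_kernel(x[:, None], centers[None, :]) @ coeffs   # b_i = f(x_i)

# pi^n_x f = sum_i c_i K_{x_i}, where the coefficients solve G c = b
c = np.linalg.solve(G + 1e-10 * np.eye(n), b)   # small jitter for numerical stability

# ||f - pi^n_x f||_H^2 = ||f||_H^2 - c^T b, since ||pi^n_x f||_H^2 = c^T G c = c^T b
f_norm_sq = coeffs @ gaussian_kernel(centers[:, None], centers[None, :]) @ coeffs
residual_sq = max(f_norm_sq - c @ b, 0.0)
print(f"n = {n}, ||f - pi^n_x f||_H^2 ≈ {residual_sq:.3e}")
```

Rerunning the sketch with larger $n$ shows the squared residual shrinking toward zero, which is the behaviour the paper's almost-sure uniqueness-set result guarantees for sequences sampled from $P$.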
