Profile least squares estimators in the monotone single index model

01/15/2020
by Fadoua Balabdaoui, et al.

We consider least squares estimators of the finite-dimensional regression parameter α in the single index regression model Y = ψ(α^T X) + ϵ, where X is a d-dimensional random vector, E(Y|X) = ψ(α^T X), and ψ is monotone. It has been suggested to estimate α by a profile least squares estimator, minimizing ∑_{i=1}^n (Y_i − ψ(α^T X_i))^2 over monotone ψ and over α on the boundary S_{d-1} of the unit ball. Although this suggestion has been around for a long time, it is still unknown whether the resulting estimator is √(n)-convergent. We show that a profile least squares estimator which uses the same pointwise least squares estimator for fixed α, but a different global sum of squares, is √(n)-convergent and asymptotically normal. The difference between the corresponding loss functions is studied, and a comparison with other methods is given. An augmented Lagrange method, embedded in the Hooke-Jeeves pattern search algorithm, is implemented in R to compute the profile least squares estimators.
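To make the profiling idea concrete, below is a minimal sketch in Python (the paper's own implementation is in R and is not reproduced here). For a fixed direction α, the monotone link ψ is estimated by isotonic regression of Y on α^T X; the resulting residual sum of squares is then minimized over directions α on the unit sphere. This sketch uses the plain profile sum of squares and a generic Nelder-Mead search, not the authors' modified global criterion or their augmented Lagrange / Hooke-Jeeves scheme; the function names and the simulated data are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize
from sklearn.isotonic import IsotonicRegression

def profile_rss(alpha, X, y):
    """Residual sum of squares after profiling out the monotone link (sketch)."""
    alpha = alpha / np.linalg.norm(alpha)    # project onto the unit sphere S_{d-1}
    index = X @ alpha                        # single index values alpha^T X_i
    iso = IsotonicRegression(increasing=True)
    fitted = iso.fit_transform(index, y)     # pointwise LSE of monotone psi for this alpha
    return np.sum((y - fitted) ** 2)

def fit_single_index(X, y, alpha0=None):
    """Minimize the profiled criterion over directions alpha (illustrative only)."""
    d = X.shape[1]
    if alpha0 is None:
        alpha0 = np.ones(d) / np.sqrt(d)
    res = minimize(profile_rss, alpha0, args=(X, y), method="Nelder-Mead")
    return res.x / np.linalg.norm(res.x)

# Simulated example (assumed setup): d = 3, psi(t) = t^3, n = 500 observations.
rng = np.random.default_rng(0)
alpha_true = np.array([2.0, 1.0, -1.0]) / np.sqrt(6.0)
X = rng.normal(size=(500, 3))
y = (X @ alpha_true) ** 3 + 0.3 * rng.normal(size=500)
print(fit_single_index(X, y))

Normalizing α inside the criterion keeps the search on the unit sphere without an explicit constraint; the paper instead handles the constraint with an augmented Lagrange method inside a Hooke-Jeeves pattern search, which this sketch does not attempt to reproduce.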
