Householder Meets Student

06/11/2022
by John H. Elton, et al.

The Householder algorithm for the QR factorization of a tall, thin n x p full-rank matrix X has the added bonus of producing a matrix M with orthonormal columns that form a basis for the orthocomplement of the column space of X. We give a simple formula for M^T x when x is in that orthocomplement. The formula does not require computing M; it only requires the R factor of a QR factorization. This is used to obtain a remarkably simple, computable, concrete representation of independent "residuals" in classical linear regression. For Student's problem, when p = 1, if R(j) = Y(j) - Ybar are the usual (non-independent) residuals, then W(j) = R(j+1) - R(1)/(sqrt(n) + 1), for j = 1, ..., n-1, gives n-1 i.i.d. mean-zero normal variables whose sum of squares is the same as that of the n residuals. These properties of the formula can (in hindsight) easily be verified directly, yielding a new, simple, and concrete proof of Student's theorem. It also gives a simple way of generating n-1 exactly mean-zero i.i.d. samples from n samples with unknown mean. Yiping Cheng exhibited concrete linear combinations of the Y(j) with these properties, in the context of a constructive proof of Student's theorem, but that representation is not so simple. Analogous simple results are obtained for regression with more predictors, giving a very simple, computable, concrete formula for n-p i.i.d. residuals with the same sum of squares as the usual n non-independent residuals. A connection with Cochran's theorem is discussed.
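
As a quick illustration of the p = 1 formula above, the following minimal sketch (assuming NumPy; the sample size, distribution parameters, and seed are illustrative choices, not from the paper) builds the W(j) from the ordinary residuals and checks the sum-of-squares property numerically.

```python
# Minimal numerical sketch (not code from the paper) of the p = 1 construction:
# from n samples with unknown mean, form n-1 variables W(j) whose sum of squares
# equals that of the n ordinary residuals. Parameters below are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 10
Y = rng.normal(loc=3.0, scale=2.0, size=n)   # n i.i.d. normal samples; the mean is "unknown" to the procedure

R = Y - Y.mean()                             # usual residuals R(j) = Y(j) - Ybar (not independent)

# W(j) = R(j+1) - R(1)/(sqrt(n) + 1), j = 1, ..., n-1  (0-based indexing below)
W = R[1:] - R[0] / (np.sqrt(n) + 1.0)

# The n-1 variables W(j) have the same sum of squares as the n residuals
print(np.sum(R**2), np.sum(W**2))
```

The mean-zero and independence claims can likewise be checked empirically by repeating the draw many times and inspecting the sample mean and covariance of W.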
