
regression - How to calculate variance of least squares estimator using QR decomposition in R?

I'm trying to learn QR decomposition, but can't figure out how to get the variance of beta_hat without resorting to traditional matrix calculations. I'm practising with the iris data set, and here's what I have so far:

y<-(iris$Sepal.Length)
x<-(iris$Sepal.Width)
X<-cbind(1,x)
n<-nrow(X)
p<-ncol(X)
qr.X<-qr(X)
b<-(t(qr.Q(qr.X)) %*% y)[1:p]
R<-qr.R(qr.X)
beta<-as.vector(backsolve(R,b))
res<-as.vector(y-X %*% beta)

Thanks for your help!


1 Reply


setup (copying in your code)

y <- iris$Sepal.Length
x <- iris$Sepal.Width
X <- cbind(1,x)
n <- nrow(X)
p <- ncol(X)
qr.X <- qr(X)
b <- (t(qr.Q(qr.X)) %*% y)[1:p]  ## can be optimized; see Remark 1 below
R <- qr.R(qr.X)  ## can be optimized; see Remark 2 below
beta <- as.vector(backsolve(R, b))
res <- as.vector(y - X %*% beta)

math

With the thin QR factorization X = QR, where Q has orthonormal columns (so Q'Q = I) and R is upper triangular, we have

X'X = R'Q'QR = R'R

Hence the variance-covariance matrix of the least squares estimator is

Var(beta_hat) = sigma^2 * (X'X)^(-1) = sigma^2 * (R'R)^(-1)

where sigma^2 is estimated from the residual sum of squares.
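
As a quick numerical sanity check of the identity X'X = R'R (a minimal sketch using the X and R defined in the setup above):

## difference should be at the level of floating-point rounding error
max(abs(crossprod(X) - crossprod(R)))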

computation

The residual degrees of freedom are n - p, so the estimated error variance is

se2 <- sum(res ^ 2) / (n - p)

Thus, the variance-covariance matrix of the estimated coefficients is

V <- chol2inv(R) * se2

#           [,1]         [,2]
#[1,]  0.22934170 -0.07352916
#[2,] -0.07352916  0.02405009
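
From V you can also read off standard errors and t statistics for the coefficients (a small sketch, reusing the objects defined above):

se <- sqrt(diag(V))   ## standard errors of the estimated coefficients
tval <- beta / se     ## t statistics, with n - p residual degrees of freedom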

validation

Let's check the correctness by comparing with lm:

fit <- lm(Sepal.Length ~ Sepal.Width, iris)

vcov(fit)

#            (Intercept) Sepal.Width
#(Intercept)  0.22934170 -0.07352916
#Sepal.Width -0.07352916  0.02405009

Identical result!
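
For a programmatic check instead of comparing printouts by eye (a sketch; V carries no dimnames, so drop them from vcov(fit) before comparing):

all.equal(V, unname(vcov(fit)))  ## should return TRUE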


Remark 1 (skip forming the 'Q' factor)

Instead of b <- (t(qr.Q(qr.X)) %*% y)[1:p], you can use the function qr.qty, which applies Q' to y without ever forming the 'Q' matrix explicitly:

b <- qr.qty(qr.X, y)[1:p]
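
A quick check (sketch) that the two formulations agree on the first p elements:

all.equal(qr.qty(qr.X, y)[1:p], (t(qr.Q(qr.X)) %*% y)[1:p])  ## should be TRUE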

Remark 2 (skip forming the 'R' factor)

You don't have to extract R <- qr.R(qr.X) for backsolve; passing qr.X$qr directly is sufficient, because backsolve only reads the upper triangle:

beta <- as.vector(backsolve(qr.X$qr, b))
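
The two calls give the same solution, which you can confirm with a one-liner (sketch):

all.equal(backsolve(qr.X$qr, b), backsolve(R, b))  ## should be TRUE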

Appendix: A function for estimation

The above is the simplest demonstration. In practice, column pivoting and rank deficiency need to be dealt with. The following is an implementation: X is a model matrix and y is the response. Results should be compared with lm(y ~ X + 0); a usage example follows the function below.

qr_estimation <- function (X, y) {
  ## QR factorization
  QR <- qr(X)
  r <- QR$rank
  piv <- QR$pivot[1:r]
  ## estimate identifiable coefficients
  b <- qr.qty(QR, y)[1:r]
  beta <- backsolve(QR$qr, b, r)
  ## fitted values
  yhat <- base::c(X[, piv] %*% beta)
  ## residuals
  resi <- y - yhat
  ## error variance
  se2 <- base::c(crossprod(resi)) / (nrow(X) - r)
  ## variance-covariance for coefficients
  V <- chol2inv(QR$qr, r) * se2
  ## post-processing on pivoting and rank-deficiency
  p <- ncol(X)
  beta_full <- rep.int(NA_real_, p)
  beta_full[piv] <- beta
  V_full <- matrix(NA_real_, p, p)
  V_full[piv, piv] <- V
  ## return
  list(coefficients = beta_full, vcov = V_full,
       fitted.values = yhat, residuals = resi, sig = sqrt(se2))
}
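
A usage example on a deliberately rank-deficient design (a sketch; the duplicated column and object names below are made up for illustration):

## rank-deficient design: the third column duplicates the second
y <- iris$Sepal.Length
X_rd <- cbind(1, iris$Sepal.Width, iris$Sepal.Width)
ans <- qr_estimation(X_rd, y)
fit <- lm(y ~ X_rd + 0)
ans$coefficients                        ## third coefficient is NA (not identifiable)
coef(fit)                               ## lm also reports NA for the aliased column
all.equal(ans$sig, summary(fit)$sigma)  ## residual standard errors should agree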
