I know there is a small difference between sigma (the residual standard error reported by summary.lm) and the concept of root mean squared error. So, I am wondering: what is the easiest way to obtain the RMSE from the lm function in R?
res <- lm(price ~ carat + cut + color + clarity + depth + table + x + y + z,
          data = randomData)
length(coefficients(res))
shows that the model has 24 coefficients, so I can no longer write it out manually.
So, how can I evaluate the RMSE based on the coefficients derived from lm?
Residual sum of squares:
RSS <- c(crossprod(res$residuals))
Mean squared error:
MSE <- RSS / length(res$residuals)
Root MSE:
RMSE <- sqrt(MSE)
Pearson estimated residual variance (as returned by summary.lm):
sig2 <- RSS / res$df.residual
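Putting these steps together, here is a minimal, self-contained sketch; the built-in mtcars data and the formula mpg ~ wt + hp are only illustrative assumptions, and any fitted lm object works the same way.

fit  <- lm(mpg ~ wt + hp, data = mtcars)  # any lm fit will do (illustrative)
RSS  <- c(crossprod(residuals(fit)))      # residual sum of squares
MSE  <- RSS / length(residuals(fit))      # divide by n
RMSE <- sqrt(MSE)                         # root mean squared error
sig2 <- RSS / fit$df.residual             # divide by n - p instead of n
sqrt(sig2)                                # equals summary(fit)$sigma
summary(fit)$sigma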
Statistically, the MSE is the maximum likelihood estimator of the residual variance, but it is biased (downward). The Pearson estimator is the restricted maximum likelihood (REML) estimator of the residual variance, which is unbiased.
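You can see the downward bias with a small simulation; this is only a sketch, and the sample size of 20, the 5000 replications, and the unit error variance are arbitrary choices.

set.seed(1)
n <- 20                                    # small n makes the bias visible
est <- replicate(5000, {
  x <- rnorm(n)
  y <- 1 + 2 * x + rnorm(n)                # true residual variance is 1
  fit <- lm(y ~ x)
  rss <- c(crossprod(residuals(fit)))
  c(mle = rss / n, pearson = rss / fit$df.residual)
})
rowMeans(est)                              # the MLE averages below 1, the Pearson estimate near 1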
Remark
For two vectors x and y, c(crossprod(x, y)) is equivalent to sum(x * y) but much faster. c(crossprod(x)) is likewise faster than sum(x ^ 2). sum(x) / length(x) is also faster than mean(x).
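To check the remark for yourself (the vector length of one million and the 200 repetitions are arbitrary choices):

x <- rnorm(1e6)
y <- rnorm(1e6)
all.equal(c(crossprod(x, y)), sum(x * y))      # TRUE: identical results
all.equal(c(crossprod(x)), sum(x ^ 2))         # TRUE
system.time(for (i in 1:200) c(crossprod(x)))  # compare elapsed times
system.time(for (i in 1:200) sum(x ^ 2))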