How to calculate the Standard error from a Variance-covariance matrix?

Laurence_jj · Sep 11, 2018 · Viewed 11.5k times

I am calculating a variance-covariance matrix and I see two different ways of calculating the standard errors:

  • sqrt(diagonal values/number of observations)

i.e. standard deviation / sqrt(number of observations)

(as given in the Wikipedia article on how to calculate the standard error: https://en.wikipedia.org/wiki/Standard_error)

or some people say it is simply

  • sqrt(diagonal values)

I had previously thought that the diagonal values in the variance-covariance matrix were the variances, and hence their square roots would be the standard deviations (not the SE). However, the more I read, the more I think I may be wrong and that it is the SE, but I am unsure why this is the case.

Can anyone help? Many thanks!!

Answer

Tobi · Sep 11, 2018

Yes, the diagonal elements of the covariance matrix are the variances. The square roots of these variances are the standard deviations. If you need the standard error, you first have to clarify the question "the standard error of what?" (see also the Wikipedia article linked in your post). If you mean the standard error of the mean, then yes, "standard deviation / sqrt(number of observations)" is what you are looking for.
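A short sketch of the distinction, using NumPy on some made-up data (the data and variable names here are just for illustration):

```python
import numpy as np

# Hypothetical data: 100 observations (rows) of 3 variables (columns)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
n = X.shape[0]

# Variance-covariance matrix (rowvar=False: columns are variables)
cov = np.cov(X, rowvar=False)

# The diagonal elements are the variances of each variable
variances = np.diag(cov)

# Their square roots are the standard deviations -- NOT standard errors
std_devs = np.sqrt(variances)

# The standard error OF THE MEAN is std dev / sqrt(n)
se_mean = std_devs / np.sqrt(n)

# Sanity check against NumPy's per-column statistics (ddof=1 matches np.cov)
assert np.allclose(std_devs, X.std(axis=0, ddof=1))
assert np.allclose(se_mean, X.std(axis=0, ddof=1) / np.sqrt(n))
```

So both formulas in the question are "right" — they just estimate different things: sqrt(diagonal) is the standard deviation of the variable, while sqrt(diagonal / n) is the standard error of its sample mean.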