Regression Coefficient Standard Error
"Charley Tichenor" wrote...
>Could someone please provide the formula for how Excel
>calculates the regression coefficient standard error, or a
>similar formula for this calculation? Your time on this
>would be appreciated.
The model equation is Y = X b + e where Y is a 1-column N-row array of
observed values of the dependent variable, X is a k-column (possibly including a
column of ones for a constant term) N-row array of observations of multiple
independent/explanatory variables, b is a 1-column k-row array containing the
regression coefficients (b has as many rows as X has columns), and e is a
1-column N-row vector containing the random noise associated with each observation.
Straight from the textbook, the estimator for the regression coefficients is
b_hat = (X' X)^-1 X' Y
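As a quick numerical check (not from the original post), the estimator can be sketched in Python with NumPy; the data here are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N, k = 50, 3                        # 50 observations, 3 coefficients
X = np.column_stack([np.ones(N),    # column of ones for the constant term
                     rng.normal(size=(N, 2))])
Y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.1, size=N)

# b_hat = (X'X)^-1 X'Y -- solve() is preferred over forming the inverse
b_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# The same answer comes from NumPy's least-squares routine
b_ls, *_ = np.linalg.lstsq(X, Y, rcond=None)
```

Both give the least-squares coefficients; solving the normal equations directly mirrors the textbook formula, while lstsq uses a more numerically stable decomposition.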
The estimator for the variance of the noise (e) is
s^2 = (Y' Y - b_hat' X' Y) / (N - k)
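That form of s^2 is algebraically the same as the residual sum of squares divided by the degrees of freedom; a small NumPy check with synthetic data (not from the original post):

```python
import numpy as np

rng = np.random.default_rng(1)
N, k = 40, 3
X = np.column_stack([np.ones(N), rng.normal(size=(N, 2))])
Y = X @ np.array([0.5, -1.0, 2.0]) + rng.normal(scale=0.2, size=N)

b_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# s^2 = (Y'Y - b_hat' X' Y) / (N - k), as stated above
s2 = (Y @ Y - b_hat @ (X.T @ Y)) / (N - k)

# Equivalent form: residual sum of squares over the degrees of freedom
resid = Y - X @ b_hat
s2_rss = (resid @ resid) / (N - k)
```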
The estimator for the variance-covariance matrix for the estimator of the
regression coefficients is
Var(b_hat) = s^2 (X' X)^-1
And the square roots of the diagonal entries in Var(b_hat) are the standard
errors of the respective regression coefficient estimators in b_hat.
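Putting the three formulas together, here is a hedged NumPy sketch of the full standard-error calculation on made-up data (not from the original post); the cross-check via the pseudoinverse is only there to confirm the arithmetic:

```python
import numpy as np

rng = np.random.default_rng(2)
N, k = 60, 3
X = np.column_stack([np.ones(N), rng.normal(size=(N, 2))])
Y = X @ np.array([1.0, 0.3, -0.7]) + rng.normal(scale=0.5, size=N)

XtX_inv = np.linalg.inv(X.T @ X)
b_hat = XtX_inv @ X.T @ Y
s2 = (Y @ Y - b_hat @ (X.T @ Y)) / (N - k)

# Var(b_hat) = s^2 (X'X)^-1; the standard errors are the square
# roots of its diagonal entries
se = np.sqrt(s2 * np.diag(XtX_inv))
```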
There are many subtle ways to rearrange these calculations, and some involved
ways to minimize floating point rounding error. However, the formulas above
reproduce what Excel's LINEST function gives. If the Regression tool in the
Analysis ToolPak gives something different, I don't know how it's calculated.
So, with data in ranges named X and Y, and with b_hat entered in a range named B
as the array formula
=MMULT(MINVERSE(MMULT(TRANSPOSE(X),X)),MMULT(TRANSPOSE(X),Y))
the standard error of the j-th coefficient is given by the array formula
=SQRT((SUMPRODUCT(Y,Y)-SUMPRODUCT(MMULT(X,B),Y))/(ROWS(Y)-COLUMNS(X))*INDEX(MINVERSE(MMULT(TRANSPOSE(X),X)),j,j))
which gives the same result as the standard errors in the second row of
=LINEST(Y,X2,TRUE,TRUE)
where the first column of the X array is all ones for the constant term, X
contains 2 other column variables for a total of 3 columns, and X2 is X without
its column of ones. (Note that LINEST returns the coefficients, with their
standard errors below them, in reverse column order with the constant last.)
Public Service Announcement
Don't attach files to postings in nonbinary newsgroups like this one.