Matrix Formulation of Multiple Regression
Y: Vector of Criterion Scores
Dimensions of Y -> (n,1)
X: Augmented Raw Score Matrix
Dimensions of X -> (n,k+1)
b: Vector of Regression Coefficients
Dimensions of b -> (k+1,1)
e: Vector of Residuals
Dimensions of e -> (n,1)
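The dimensions above can be made concrete with a small NumPy sketch. The data here are hypothetical, invented only to show the shapes; any n-by-k raw score matrix works the same way.

```python
import numpy as np

# Hypothetical data: n = 5 cases, k = 2 predictors.
raw_X = np.array([[2.0, 1.0],
                  [3.0, 5.0],
                  [5.0, 3.0],
                  [7.0, 6.0],
                  [8.0, 7.0]])
Y = np.array([[3.0], [8.0], [9.0], [13.0], [15.0]])  # criterion vector, (n, 1)

n, k = raw_X.shape
# Augment the raw score matrix with a leading column of 1s for the intercept.
X = np.hstack([np.ones((n, 1)), raw_X])              # dimensions (n, k+1)
```

The column of 1s is what makes X "augmented": it lets the intercept ride along as the first entry of b.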
Computing b
The vector b is straightforward to derive:
Y = Xb + e [The matrix form of the regression equation]
X'Y = X'(Xb + e) [Premultiply both sides by X']
X'Y = X'Xb + X'e [Distribute X']
X'e is always 0 because the least-squares residuals are orthogonal to every column of X (the predictors and the unit vector), thus:
X'Y = X'Xb
(X'X)^-1X'Y = (X'X)^-1X'Xb [Premultiply both sides by (X'X)^-1]
(X'X)^-1X'Y = Ib [Since (X'X)^-1X'X = I]
(X'X)^-1X'Y = b [Simplify, since Ib = b]
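The derivation above translates directly into NumPy. The data are hypothetical (the augmented X already includes the column of 1s), and the orthogonality claim X'e = 0 can be checked numerically:

```python
import numpy as np

# Hypothetical data: n = 5 cases, k = 2 predictors, X already augmented.
X = np.array([[1.0, 2.0, 1.0],
              [1.0, 3.0, 5.0],
              [1.0, 5.0, 3.0],
              [1.0, 7.0, 6.0],
              [1.0, 8.0, 7.0]])
Y = np.array([[3.0], [8.0], [9.0], [13.0], [15.0]])

# b = (X'X)^-1 X'Y
b = np.linalg.inv(X.T @ X) @ X.T @ Y   # shape (k+1, 1)

# The residuals are orthogonal to the columns of X, so X'e = 0.
e = Y - X @ b
print(np.allclose(X.T @ e, 0))         # True
```

In practice `np.linalg.lstsq` or `np.linalg.solve(X.T @ X, X.T @ Y)` is preferred over forming the explicit inverse, but the inverse form mirrors the derivation.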
Computing Predicted Scores
Y' = Xb [Use the augmented X]
Regression Sum of Squares
SSreg = b'x'y
Note: Do not use the augmented X here; the x's and y's must be in deviation score form, and b contains the slopes only.
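A sketch of both computations, using the same hypothetical data as above. Predicted scores use the augmented X; the regression sum of squares SSreg = b'x'y uses deviation scores and the slope coefficients only. As a check, SSreg should equal the sum of squared deviations of the predicted scores from the mean of Y:

```python
import numpy as np

# Hypothetical data, X already augmented with a column of 1s.
X = np.array([[1.0, 2.0, 1.0],
              [1.0, 3.0, 5.0],
              [1.0, 5.0, 3.0],
              [1.0, 7.0, 6.0],
              [1.0, 8.0, 7.0]])
Y = np.array([[3.0], [8.0], [9.0], [13.0], [15.0]])
b = np.linalg.inv(X.T @ X) @ X.T @ Y

Y_hat = X @ b                              # predicted scores: Y' = Xb

# Deviation-score form: drop the intercept column, center everything.
x = X[:, 1:] - X[:, 1:].mean(axis=0)       # predictors as deviation scores
y = Y - Y.mean()                           # criterion as deviation scores
b_slopes = b[1:, :]                        # slopes only, no intercept

SS_reg = float(b_slopes.T @ x.T @ y)       # SSreg = b'x'y
print(np.isclose(SS_reg, np.sum((Y_hat - Y.mean())**2)))  # True
```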
Covariance Matrix of Regression Standard Errors
C = MSres(X'X)^-1, where MSres = SSres/(n - k - 1) and X is the augmented matrix.
The square roots of the diagonal elements of C are the standard errors of the regression coefficients.
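A sketch of C and the standard errors, again on the hypothetical data used above:

```python
import numpy as np

# Hypothetical data, X already augmented.
X = np.array([[1.0, 2.0, 1.0],
              [1.0, 3.0, 5.0],
              [1.0, 5.0, 3.0],
              [1.0, 7.0, 6.0],
              [1.0, 8.0, 7.0]])
Y = np.array([[3.0], [8.0], [9.0], [13.0], [15.0]])
n, kp1 = X.shape                           # kp1 = k + 1
b = np.linalg.inv(X.T @ X) @ X.T @ Y
e = Y - X @ b

MS_res = float(e.T @ e) / (n - kp1)        # MSres = SSres / (n - k - 1)
C = MS_res * np.linalg.inv(X.T @ X)        # covariance matrix of b
se = np.sqrt(np.diag(C))                   # standard errors of the coefficients
```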
Standardized Regression Coefficients
b*_j = b_j(s_xj/s_y), where s_xj is the standard deviation of predictor j and s_y is the standard deviation of the criterion.
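The standardized (beta) weights rescale each raw slope by the ratio of standard deviations. On the same hypothetical data, this can be verified against the slopes obtained by regressing z-scored Y on z-scored predictors:

```python
import numpy as np

# Hypothetical data, X already augmented.
X = np.array([[1.0, 2.0, 1.0],
              [1.0, 3.0, 5.0],
              [1.0, 5.0, 3.0],
              [1.0, 7.0, 6.0],
              [1.0, 8.0, 7.0]])
Y = np.array([[3.0], [8.0], [9.0], [13.0], [15.0]])
b = np.linalg.inv(X.T @ X) @ X.T @ Y

s_x = X[:, 1:].std(axis=0, ddof=1)         # predictor standard deviations
s_y = Y.std(ddof=1)                        # criterion standard deviation
beta = b[1:, 0] * s_x / s_y                # b*_j = b_j (s_xj / s_y)

# Check: regressing z-scored Y on z-scored predictors gives the same weights.
Zx = (X[:, 1:] - X[:, 1:].mean(axis=0)) / s_x
Zy = (Y - Y.mean()) / s_y
Zx_aug = np.hstack([np.ones((len(Zy), 1)), Zx])
beta_direct = (np.linalg.inv(Zx_aug.T @ Zx_aug) @ Zx_aug.T @ Zy)[1:, 0]
print(np.allclose(beta, beta_direct))      # True
```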
Multivariate Course Page
Phil Ender, 3Oct07, 30Jun98