# Multiple Regression

**Topics:** Regression analysis, Least squares, Linear least squares


**Published:** December 12, 2011

(Mostly from Maddala)

The Ordinary Least Squares method of estimation can easily be extended to models involving two or more explanatory variables, though the algebra becomes progressively more complex. In fact, when dealing with the general regression problem with a large number of variables, we use matrix algebra, but that is beyond the scope of this course.

We illustrate the case of two explanatory variables, $X_1$ and $X_2$, with $Y$ the dependent variable. We therefore have the model

$$Y_i = \alpha + \beta_1 X_{1i} + \beta_2 X_{2i} + u_i,$$

where $u_i \sim N(0, \sigma^2)$.
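As a concrete illustration, we can simulate data from this model. This is a sketch only: the parameter values and sample size below are my own illustrative choices, not taken from the text.

```python
import numpy as np

# Simulate Y_i = alpha + b1*X1_i + b2*X2_i + u_i with u_i ~ N(0, sigma^2).
# All numeric values here are hypothetical, chosen for illustration.
rng = np.random.default_rng(0)
n = 200
alpha, b1, b2, sigma = 2.0, 1.5, -0.7, 1.0

X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
u = rng.normal(scale=sigma, size=n)  # the disturbance term
Y = alpha + b1 * X1 + b2 * X2 + u    # Y has shape (200,)
```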

We look for estimators so as to minimise the sum of squared errors,

$$S = \sum_{i=1}^{n} \left( Y_i - \alpha - \beta_1 X_{1i} - \beta_2 X_{2i} \right)^2.$$

Differentiating, and setting the partial derivatives to zero, we get

$$\frac{\partial S}{\partial \alpha} = -2 \sum_i \left( Y_i - \hat\alpha - \hat\beta_1 X_{1i} - \hat\beta_2 X_{2i} \right) = 0 \quad (1)$$

$$\frac{\partial S}{\partial \beta_1} = -2 \sum_i X_{1i} \left( Y_i - \hat\alpha - \hat\beta_1 X_{1i} - \hat\beta_2 X_{2i} \right) = 0 \quad (2)$$

$$\frac{\partial S}{\partial \beta_2} = -2 \sum_i X_{2i} \left( Y_i - \hat\alpha - \hat\beta_1 X_{1i} - \hat\beta_2 X_{2i} \right) = 0 \quad (3)$$

These three equations are called the “normal equations”. They can be simplified as follows. Equation (1) can be written as

$$\sum_i Y_i = n\hat\alpha + \hat\beta_1 \sum_i X_{1i} + \hat\beta_2 \sum_i X_{2i}$$

or

$$\hat\alpha = \bar{Y} - \hat\beta_1 \bar{X}_1 - \hat\beta_2 \bar{X}_2, \quad (4)$$

where the bar over $Y$, $X_1$ and $X_2$ indicates the sample mean. Equation (2) can be written as

$$\sum_i X_{1i} Y_i = \hat\alpha \sum_i X_{1i} + \hat\beta_1 \sum_i X_{1i}^2 + \hat\beta_2 \sum_i X_{1i} X_{2i}.$$

Substituting in the value of $\hat\alpha$ from (4), we get

$$\sum_i (X_{1i} - \bar{X}_1)(Y_i - \bar{Y}) = \hat\beta_1 \sum_i (X_{1i} - \bar{X}_1)^2 + \hat\beta_2 \sum_i (X_{1i} - \bar{X}_1)(X_{2i} - \bar{X}_2). \quad (5)$$

A similar equation results from (3) and (4). We can simplify this equation using the following notation. Let us define:

$$S_{11} = \sum_i (X_{1i} - \bar{X}_1)^2, \quad S_{22} = \sum_i (X_{2i} - \bar{X}_2)^2, \quad S_{12} = \sum_i (X_{1i} - \bar{X}_1)(X_{2i} - \bar{X}_2),$$

$$S_{1Y} = \sum_i (X_{1i} - \bar{X}_1)(Y_i - \bar{Y}), \quad S_{2Y} = \sum_i (X_{2i} - \bar{X}_2)(Y_i - \bar{Y}).$$

Equation (5) can then be written

$$S_{1Y} = \hat\beta_1 S_{11} + \hat\beta_2 S_{12}. \quad (6)$$

Similarly, equation (3) becomes

$$S_{2Y} = \hat\beta_1 S_{12} + \hat\beta_2 S_{22}. \quad (7)$$

We can solve these two equations to get

$$\hat\beta_1 = \frac{S_{22} S_{1Y} - S_{12} S_{2Y}}{\Delta}$$

and

$$\hat\beta_2 = \frac{S_{11} S_{2Y} - S_{12} S_{1Y}}{\Delta},$$

where $\Delta = S_{11} S_{22} - S_{12}^2$. We may then obtain $\hat\alpha$ from equation (4).
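These closed-form expressions can be sketched directly in code. The data below are synthetic and the variable names (`b1_hat`, `delta`, etc.) are my own; the cross-check against NumPy's general least-squares solver confirms the algebra.

```python
import numpy as np

# Illustrative data (hypothetical parameter values).
rng = np.random.default_rng(1)
n = 500
X1 = rng.normal(size=n)
X2 = 0.5 * X1 + rng.normal(size=n)          # correlated with X1
Y = 2.0 + 1.5 * X1 - 0.7 * X2 + rng.normal(size=n)

# Deviations from sample means, then the S-notation sums.
x1, x2, y = X1 - X1.mean(), X2 - X2.mean(), Y - Y.mean()
S11, S22, S12 = x1 @ x1, x2 @ x2, x1 @ x2
S1Y, S2Y = x1 @ y, x2 @ y
delta = S11 * S22 - S12**2                  # Delta = S11*S22 - S12^2

b1_hat = (S22 * S1Y - S12 * S2Y) / delta    # beta_1 estimator
b2_hat = (S11 * S2Y - S12 * S1Y) / delta    # beta_2 estimator
a_hat = Y.mean() - b1_hat * X1.mean() - b2_hat * X2.mean()  # equation (4)

# Cross-check against the general least-squares solver.
A = np.column_stack([np.ones(n), X1, X2])
coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
print(np.allclose([a_hat, b1_hat, b2_hat], coef))  # True
```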

We can calculate the RSS, ESS and TSS from these estimators in the same way as for simple regression, that is,

$$\mathrm{ESS} = \hat\beta_1 S_{1Y} + \hat\beta_2 S_{2Y},$$

$$\mathrm{TSS} = \sum_i (Y_i - \bar{Y})^2,$$

and $\mathrm{RSS} = \mathrm{TSS} - \mathrm{ESS}$.

The coefficient of multiple determination is

$$R^2 = \frac{\mathrm{ESS}}{\mathrm{TSS}}.$$

That is, $R^2$ is the proportion of the variation in $Y$ explained by the regression.
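The decomposition can be checked numerically. A sketch, again on synthetic data of my own choosing, showing that $\mathrm{ESS} = \hat\beta_1 S_{1Y} + \hat\beta_2 S_{2Y}$ and that $R^2$ lies between 0 and 1:

```python
import numpy as np

# Hypothetical data for the two-regressor model.
rng = np.random.default_rng(2)
n = 300
X1, X2 = rng.normal(size=n), rng.normal(size=n)
Y = 1.0 + 2.0 * X1 + 0.5 * X2 + rng.normal(size=n)

x1, x2, y = X1 - X1.mean(), X2 - X2.mean(), Y - Y.mean()
S11, S22, S12 = x1 @ x1, x2 @ x2, x1 @ x2
S1Y, S2Y = x1 @ y, x2 @ y
delta = S11 * S22 - S12**2
b1 = (S22 * S1Y - S12 * S2Y) / delta
b2 = (S11 * S2Y - S12 * S1Y) / delta

TSS = y @ y                      # total sum of squares
ESS = b1 * S1Y + b2 * S2Y        # explained sum of squares
RSS = TSS - ESS                  # residual sum of squares
R2 = ESS / TSS
print(0.0 <= R2 <= 1.0)          # True
```

The identity $\mathrm{RSS} = \mathrm{TSS} - \mathrm{ESS}$ holds exactly here because the normal equations (6) and (7) make the residuals orthogonal to the regressors.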

The variances of our estimators are given by

$$\operatorname{Var}(\hat\beta_1) = \frac{\sigma^2 S_{22}}{\Delta} = \frac{\sigma^2}{S_{11}\left(1 - r_{12}^2\right)}$$

and

$$\operatorname{Var}(\hat\beta_2) = \frac{\sigma^2 S_{11}}{\Delta} = \frac{\sigma^2}{S_{22}\left(1 - r_{12}^2\right)},$$

where $r_{12}^2 = S_{12}^2 / (S_{11} S_{22})$ is the squared correlation coefficient between $X_1$ and $X_2$. Thus, the greater the correlation between the two explanatory variables, the greater the variance of the estimators, i.e. the less precisely we can estimate the individual coefficients.
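This variance inflation is easy to see numerically. A sketch, with hypothetical data and a helper `var_b1` of my own devising, comparing the variance formula at two correlation levels:

```python
import numpy as np

rng = np.random.default_rng(3)
n, sigma2 = 1000, 1.0  # illustrative sample size and error variance

def var_b1(rho):
    """Var(b1_hat) = sigma^2 / (S11 * (1 - r12^2)) for regressors
    generated with (population) correlation rho."""
    X1 = rng.normal(size=n)
    X2 = rho * X1 + np.sqrt(1.0 - rho**2) * rng.normal(size=n)
    x1, x2 = X1 - X1.mean(), X2 - X2.mean()
    S11, S22, S12 = x1 @ x1, x2 @ x2, x1 @ x2
    r12_sq = S12**2 / (S11 * S22)  # squared sample correlation
    return sigma2 / (S11 * (1.0 - r12_sq))

low, high = var_b1(0.0), var_b1(0.95)
print(high > low)  # True: stronger correlation inflates the variance
```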
