THE TWO-VARIABLE REGRESSION MODEL:
(a) In the regression context, the method of least squares estimates the regression parameters in such a way that the sum of the squared differences between the actual Y values (i.e., the values of the dependent variable) and the estimated Y values is as small as possible. (b) The estimators of the regression parameters obtained by the method of least squares. (c) Since an estimator is a random variable, its variance, like the variance of any random variable, measures the spread of the estimated values around the mean value of the estimator. (d) The (positive) square root of the variance of an estimator. (e) Equal variance.
(f) Unequal variance.
(g) Correlation between successive values of a random variable. (h) In the regression context, TSS is the sum of the squared differences between the individual values and the mean value of the dependent variable Y, namely, $\sum (Y_i - \bar{Y})^2$. (i) ESS is the part of the TSS that is explained by the explanatory variable(s). (j) RSS is the part of the TSS that is not explained by the explanatory variable(s), the X variable(s). (k) It measures the proportion of the total variation in Y explained by the explanatory variables; in short, it is the ratio of ESS to TSS. (l) It is the standard deviation of the Y values about the estimated regression line. (m) BLUE means best linear unbiased estimator, that is, a linear estimator that is unbiased and has the least variance in the class of all such linear unbiased estimators. (n) A statistical procedure for testing statistical hypotheses. (o) A test of significance based on the t distribution. (p) In a one-tailed test, the alternative hypothesis is one-sided, for example $H_0\colon \mu = \mu_0$ against $H_1\colon \mu > \mu_0$ or $H_1\colon \mu < \mu_0$, where $\mu$ is the mean value. (q) In a two-tailed test, the alternative hypothesis is two-sided. (r) It is shorthand for the statement: reject the null hypothesis.

7.2.
(a) False. It minimizes the sum of the squared residuals, that is, it minimizes $\sum e_i^2$. (b) True.
(d) False. OLS does not require any probabilistic assumption about the error term in estimating the parameters. (e) True. The OLS estimators are linear functions of $u_i$ and will follow the normal distribution if it is assumed that the $u_i$ are normally distributed. Recall that any linear function of a normally distributed variable is itself normally distributed. (f) False. It is ESS / TSS.
(g) False. We should reject the null hypothesis.
(h) True. The numerator of both coefficients involves the covariance between Y and X, which can be positive or negative. (i) Uncertain. The p value is the exact level of significance of a computed test statistic, which may differ from an arbitrarily chosen level of significance, α.

7.3.
(c) 0 and 1
(d) -1 and +1
(g) the standard error of the estimate
(h) [pic]
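The definitions in 7.1 (TSS, ESS, RSS, r²) and the bounds in (c)–(d) above can be checked numerically. A minimal pure-Python sketch with made-up data (the X and Y values below are hypothetical, not from the text):

```python
import math

# Hypothetical data, for illustration only
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(X)

x_bar = sum(X) / n
y_bar = sum(Y) / n

# OLS slope and intercept for the two-variable model
b2 = (sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y))
      / sum((x - x_bar) ** 2 for x in X))
b1 = y_bar - b2 * x_bar

Y_hat = [b1 + b2 * x for x in X]
e = [y - yh for y, yh in zip(Y, Y_hat)]        # residuals

TSS = sum((y - y_bar) ** 2 for y in Y)         # total sum of squares
ESS = sum((yh - y_bar) ** 2 for yh in Y_hat)   # explained sum of squares
RSS = sum(ei ** 2 for ei in e)                 # residual sum of squares

r2 = ESS / TSS                                 # coefficient of determination
r = math.copysign(math.sqrt(r2), b2)           # correlation coefficient

# With an intercept, OLS residuals sum to (essentially) zero,
# TSS = ESS + RSS, 0 <= r^2 <= 1, and -1 <= r <= +1.
print(abs(sum(e)) < 1e-9, abs(TSS - (ESS + RSS)) < 1e-9)
print(0.0 <= r2 <= 1.0, -1.0 <= r <= 1.0)
```

The decomposition TSS = ESS + RSS holds exactly only when the regression includes an intercept, which is why minimizing $\sum e_i^2$ (rather than $\sum e_i$, which is identically zero here) is the sensible criterion.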
The answers to the missing numbers are in boxes:

$\hat{Y}_i$ = -66.1058 + 0.0650 $X_i$      $r^2$ = 0.9460
se = (10.7509)   (0.0035)                  n = 20
t  = (-6.1489)   (18.73)
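The t ratios in the last row can be reproduced from the reported coefficients and standard errors; a quick pure-Python check (numbers taken from the output above):

```python
# Reported OLS output: intercept and slope with their standard errors
b1, se_b1 = -66.1058, 10.7509
b2, se_b2 = 0.0650, 0.0035

t_b1 = b1 / se_b1   # reported as -6.1489
t_b2 = b2 / se_b2   # about 18.57 here; the reported 18.73 likely reflects
                    # a less-rounded standard error than the printed 0.0035

print(round(t_b1, 4), round(t_b2, 2))
```

Either way, the slope's t ratio is far above any conventional critical value, so the substantive conclusion below is unaffected by the rounding.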
The critical t value at the 5% level for 18 d.f. is 2.101 (two-tailed) and 1.734 (one-tailed). Since the estimated t value of 18.73 far exceeds either of these critical values, we reject the null hypothesis. A two-tailed test is appropriate because no a priori theoretical considerations are known regarding the sign of the coefficient.

7.5.
[pic], following Equations (7.34) and (7.35)
In proving the last equality, note that [pic]. Then the result follows by substitution.

7.6. [pic]. See also Problem 6.22.