1) The last lessons have spent a lot of time describing the slope and intercept terms (and their variances) of the one-variable sample regression function. We also know that, for any particular value of the independent variable (call it $X_0$), the predicted value of $Y_0$ is $\hat{Y}_0 = \hat{\beta}_0 + \hat{\beta}_1 X_0$. (This is sometimes called a "point prediction.")

a) Prove that $\hat{Y}_0$ is an unbiased estimator of $E[Y_0 \mid X_0]$.

b) Derive the formula for the variance of $\hat{Y}_0$. Show at least two steps in this derivation.

Hint 1: You are looking for $\mathrm{Var}(\hat{Y}_0) = \mathrm{Var}(\hat{\beta}_0 + \hat{\beta}_1 X_0)$. This is the variance of a sum of two random variables. What is the general formula for such a sum? (Go back to the week 2 lectures if you need a reminder.) Use that formula now.

Hint 2: If you applied Hint 1 correctly, you will see that you need the formula for $\mathrm{Cov}(\hat{\beta}_0, \hat{\beta}_1)$. Take it on faith that this can be shown to be $\mathrm{Cov}(\hat{\beta}_0, \hat{\beta}_1) = -\bar{X}\,\mathrm{Var}(\hat{\beta}_1)$. (You might find it interesting that the two estimators have a negative correlation: a steeper slope tends to imply a lower intercept, and vice versa. A sketch of where the two hints lead is given below.)
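
For anyone checking their own derivation, here is a minimal LaTeX sketch of where the two hints lead. The symbols $\sigma^2$ (error variance), $\bar{X}$ (sample mean of $X$), $n$ (sample size), and the standard textbook result for $\mathrm{Var}(\hat{\beta}_0)$ used in the last line are assumptions drawn from the usual OLS setup, not stated in the problem itself.

% Sketch under the usual OLS assumptions: E[\hat\beta_0] = \beta_0, E[\hat\beta_1] = \beta_1,
% Var(\hat\beta_1) = \sigma^2 / \sum_i (X_i - \bar{X})^2, Var(\hat\beta_0) = \sigma^2 [1/n + \bar{X}^2 / \sum_i (X_i - \bar{X})^2],
% and the covariance given in Hint 2.

% (a) Unbiasedness of the point prediction:
\begin{align*}
E[\hat{Y}_0] = E[\hat{\beta}_0 + \hat{\beta}_1 X_0]
             = E[\hat{\beta}_0] + X_0\, E[\hat{\beta}_1]
             = \beta_0 + \beta_1 X_0
             = E[Y_0 \mid X_0].
\end{align*}

% (b) Variance of a sum of two correlated random variables, then substitute Hint 2:
\begin{align*}
\mathrm{Var}(\hat{Y}_0)
  &= \mathrm{Var}(\hat{\beta}_0) + X_0^2\,\mathrm{Var}(\hat{\beta}_1)
     + 2 X_0\,\mathrm{Cov}(\hat{\beta}_0, \hat{\beta}_1) \\
  &= \mathrm{Var}(\hat{\beta}_0) + X_0^2\,\mathrm{Var}(\hat{\beta}_1)
     - 2 X_0 \bar{X}\,\mathrm{Var}(\hat{\beta}_1) \\
  &= \sigma^2\!\left[\frac{1}{n} + \frac{(X_0 - \bar{X})^2}{\sum_i (X_i - \bar{X})^2}\right].
\end{align*}

The final line is the familiar result that the point prediction is most precise at $X_0 = \bar{X}$ and grows less precise as $X_0$ moves away from the sample mean.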

2) Compare the following two regressions:

i. $Y_i = \hat{\beta}_0 + \hat{\beta}_1 X_i + e_i$
ii. $Y_i = \hat{\beta}_0 + \hat{\beta}_1 (2X_i) + e_i$

Equation i. is exactly the regression we've been working with thus far, so all of the formulas we've derived apply. In equation ii. the independent variable has been multiplied by 2. How does this change, if at all, the values of $\hat{\beta}_0$, $\hat{\beta}_1$, $s^2_{\hat{\beta}_1}$, $R^2$, and SSE?
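
As a way to check an answer to question 2 empirically, here is a minimal Python sketch (not part of the original problem) that fits both regressions on simulated data and compares the quantities named in the question. The sample size, coefficients, and noise level are arbitrary assumptions chosen only for illustration, and the ols helper is a hypothetical name written directly from the textbook formulas rather than a library call.

# Minimal simulation sketch: compare the regression of Y on X with the
# regression of Y on 2X.  All numbers below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.normal(5.0, 2.0, size=n)            # arbitrary regressor
Y = 1.0 + 0.5 * X + rng.normal(0, 1, n)     # arbitrary "true" line plus noise

def ols(x, y):
    """Return (intercept, slope, SSE, R^2, est. Var(slope)) for a one-variable OLS fit."""
    xbar, ybar = x.mean(), y.mean()
    sxx = ((x - xbar) ** 2).sum()
    b1 = ((x - xbar) * (y - ybar)).sum() / sxx
    b0 = ybar - b1 * xbar
    resid = y - (b0 + b1 * x)
    sse = (resid ** 2).sum()
    r2 = 1 - sse / ((y - ybar) ** 2).sum()
    s2 = sse / (len(x) - 2)                  # estimate of the error variance
    return b0, b1, sse, r2, s2 / sxx         # last entry: estimated Var(b1)

print("Y on X  :", ols(X, Y))
print("Y on 2X :", ols(2 * X, Y))
# Expected pattern: the intercept, SSE, and R^2 are unchanged, the slope is
# halved, and the estimated variance of the slope is quartered.

Because the helper is written from the same formulas derived in the lessons, the quantities it prints are exactly the ones the question asks about, so the printed comparison can be read off directly against your algebraic answer.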
