Find MSE(σ~c²) and argue that its minimizer over all c ≥ 0 will not depend on σ²


Instructions: Answer each question on your own choice of paper, but be sure to staple (not paper clip) the sheets together. Point totals are in parentheses. Within a question, each part receives equal weight. Be sure to show how you arrive at any answer, or appeal to the appropriate results in the slides or lecture notes.

1. Let {Xi : i = 1, 2, ..., n} be IID draws from the Normal(μ, σ²) distribution, and let S² = (n - 1)⁻¹ Σni=1 (Xi - X‾)².
(i) Define Zi = (Xi - μ)/σ and show that

(n - 1)S²/σ² = Z'QnZ,

where

Qn = In - jn(j'njn)⁻¹j'n, j'n = (1, 1, ..., 1), and Z' = (Z1, Z2, ..., Zn).

[Hint: Write Z = σ⁻¹(X - μjn).]

(ii) Conclude that (n - 1)S²/σ² ~ χ²n-1.
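A quick Monte Carlo sketch of the part (ii) conclusion (not part of the exam; the values of μ, σ, n, and the replication count are arbitrary choices): the mean and variance of (n - 1)S²/σ² should be close to those of a χ²n-1 variable, namely n - 1 and 2(n - 1).

```python
# Simulation check that (n - 1)*S^2 / sigma^2 behaves like chi-square(n - 1).
# All parameter values below (mu, sigma, n, reps) are illustrative choices.
import random
import statistics

random.seed(0)
mu, sigma, n, reps = 3.0, 2.0, 10, 20000

draws = []
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    s2 = statistics.variance(x)            # (n - 1)-denominator sample variance
    draws.append((n - 1) * s2 / sigma**2)

mean_q = sum(draws) / reps
var_q = statistics.variance(draws)
print(mean_q, var_q)    # should land near n - 1 = 9 and 2(n - 1) = 18
```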

(iii) Now consider a class of estimators indexed by c ≥ 0:

σ~c² = cS².

Find the bias in σ~c² as a function of c and σ². Does it depend on μ?

(iv) Find Var(σ~c²). Does it depend on μ? [Hint: Use part (ii) to find Var(S²).]

(v) Find MSE(σ~c²), and argue that the value of c minimizing it over all c ≥ 0 will not depend on σ² (or μ).

(vi) Show that MSE(σ~c²) is minimized at c* = (n - 1)/(n + 1), so the minimum MSE estimator shrinks S² toward zero.
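A numerical sketch of parts (v)-(vi) (n and the σ² values are arbitrary choices): using Var(S²) = 2σ⁴/(n - 1) for normal data, MSE(c) = c² · 2σ⁴/(n - 1) + (c - 1)²σ⁴, and a grid search puts the minimizer at c* = (n - 1)/(n + 1) for every σ².

```python
# Grid search over c for MSE(c) = Var(c*S^2) + bias^2, using the normal-sample
# result Var(S^2) = 2*sigma^4/(n - 1).  n and the sigma^2 grid are illustrative.
n = 10

def mse(c, sigma2):
    # variance of c*S^2 plus squared bias (c - 1)^2 * sigma^4
    return c**2 * 2 * sigma2**2 / (n - 1) + (c - 1) ** 2 * sigma2**2

grid = [i / 10000 for i in range(20001)]          # c in [0, 2], step 1e-4
mins = [min(grid, key=lambda c: mse(c, s2)) for s2 in (0.5, 1.0, 4.0)]
print(mins, (n - 1) / (n + 1))    # each minimizer should sit near 9/11
```

The minimizer does not move as σ² changes, which is the point of part (v); only the minimized MSE value scales with σ⁴.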

(vii) Agree or disagree with the following statement: "In cases where the MLE is not unbiased, at least it has the smallest mean squared error."

2. Let X > 0 represent an individual's willingness to pay for, say, a new public park. In the population, assume X has CDF
F(x; θ) = 1 - exp[-√(2x/θ)], x > 0
         = 0, x ≤ 0
where θ > 0. This is a special case of the Weibull distribution, and it can be shown that

E(X) = θ, Var(X) = 5θ².
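These two moments can be checked by simulation (θ and the replication count below are arbitrary choices), using the fact that if W ~ Exponential(1) then X = θW²/2 has the CDF above, since P(X > x) = P(W > √(2x/θ)) = exp[-√(2x/θ)].

```python
# Monte Carlo check of E(X) = theta and Var(X) = 5*theta^2 for this Weibull
# special case.  theta and reps are illustrative values.
import math
import random
import statistics

random.seed(1)
theta, reps = 2.0, 400000
# Draw W ~ Exp(1) by inversion, then set X = theta * W^2 / 2.
xs = [theta * (-math.log(1.0 - random.random())) ** 2 / 2 for _ in range(reps)]

mean_x = sum(xs) / reps
var_x = statistics.variance(xs)
print(mean_x, var_x)    # expect values near theta = 2 and 5*theta^2 = 20
```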

(i) Show that the PDF is

f(x; θ) = [1/√(2θx)] exp[-√(2x/θ)], x > 0.
(ii) Write down the log likelihood for a random draw i, ℓi(θ), and find its derivative.
(iii) Show that E[√(2X)] = √θ by showing

E[√(2X)] = (1/√θ) ∫₀^∞ [1 - F(x; θ)] dx,

where 1 - F(x; θ) is the survival function. [Review how to compute E(X) from the survival function.] Then use this to show

Eθ[∂ℓi(θ)/∂θ] = 0.

(iv) Show that the MLE of θ is

θ^ = [n⁻¹ Σni=1 √(2Xi)]².
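As a sanity check on part (iv) (the data values below are made up), the closed form can be plugged back into the score implied by part (ii), ∂ℓ/∂θ = Σi [-1/(2θ) + √(2xi)/(2θ^(3/2))]; it should zero out the first-order condition.

```python
# Verify numerically that theta_hat = [n^(-1) * sum(sqrt(2*x_i))]^2 solves the
# first-order condition for this likelihood.  The data are hypothetical.
import math

data = [0.7, 1.3, 2.2, 0.4, 3.1, 1.8]    # made-up positive observations
n = len(data)
theta_hat = (sum(math.sqrt(2 * x) for x in data) / n) ** 2

# Score of the log likelihood evaluated at theta_hat.
score = sum(-1 / (2 * theta_hat) + math.sqrt(2 * x) / (2 * theta_hat**1.5)
            for x in data)
print(theta_hat, score)    # the score should be numerically zero
```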

(v) Conclude that θ^ has an upward bias. Is θ^ consistent?

(vi) Show that AVar[√n(θ^ - θ)] = 4θ². [Hint: Find the expected value of the second derivative of the log likelihood and use the general result for the MLE.]

(vii) We can also use X‾, the sample average, to estimate θ. What is AVar[√n(X‾ - θ)]? How did you know it would be larger than the asymptotic variance of the MLE?
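A Monte Carlo sketch of parts (vi)-(vii) (θ, n, and the replication count are arbitrary choices): the simulated variances of √n(θ^ - θ) and √n(X‾ - θ) should land near 4θ² and Var(X) = 5θ², respectively.

```python
# Compare the finite-sample spread of the MLE and the sample mean as
# estimators of theta.  theta, n, and reps are illustrative values.
import math
import random
import statistics

random.seed(2)
theta, n, reps = 1.0, 200, 5000
mle_dev, mean_dev = [], []
for _ in range(reps):
    # If W ~ Exp(1), then X = theta * W^2 / 2 has the stated CDF.
    xs = [theta * (-math.log(1.0 - random.random())) ** 2 / 2 for _ in range(n)]
    theta_hat = (sum(math.sqrt(2 * x) for x in xs) / n) ** 2
    mle_dev.append(math.sqrt(n) * (theta_hat - theta))
    mean_dev.append(math.sqrt(n) * (sum(xs) / n - theta))

v_mle = statistics.variance(mle_dev)
v_mean = statistics.variance(mean_dev)
print(v_mle, v_mean)    # expect values near 4*theta^2 = 4 and 5*theta^2 = 5
```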

3. Suppose X ~ Lognormal(μ, σ²), so that μ = E[log(X)] and σ² = Var[log(X)].

(i) Given a random sample of size n, explain why the MLEs are

μ^ = n⁻¹ Σni=1 Yi,

σ^2 = n⁻¹ Σni=1 (Yi - μ^)²,

where Yi = log(Xi).

(ii) Show that

[The result to be shown appeared as an image in the original (2079_figure.png) and is not recoverable here.]

(iii) Consider estimating γ = E(X). Let γ~ = X‾ be the sample average of the Xi. What is AVar[√n(γ~ - γ)]?

[Hint: You might want to look at the variance of the lognormal distribution.]

(iv) Define γ^ = exp(μ^ + σ^2/2). How do you know AVar[√n(γ^ - γ)] is smaller than AVar[√n(γ~ - γ)] without doing any algebra?

(v) Use the delta method to find AVar[√n(γ^ - γ)] and show that it is in fact smaller than AVar[√n(γ~ - γ)].

[Hint: It will be useful to remember that, by the infinite series representation of exp(θ), exp(θ) > 1 + θ + θ²/2 for θ > 0.]
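One way to check the comparison in part (v) numerically, assuming the usual MLE asymptotics for μ^ and σ^2 (asymptotic variances σ² and 2σ⁴): the delta method gives AVar[√n(γ^ - γ)] = γ²(σ² + σ⁴/2), while AVar[√n(γ~ - γ)] = Var(X) = γ²[exp(σ²) - 1], so the comparison reduces to the hint's series bound applied to the two bracketed factors.

```python
# Compare the two asymptotic-variance factors over an arbitrary grid of
# sigma^2 values: sigma^2 + sigma^4/2 for the delta-method (MLE-based)
# estimator versus exp(sigma^2) - 1 for the sample mean (both times gamma^2).
import math

pairs = [(s2 + s2**2 / 2, math.exp(s2) - 1)
         for s2 in (0.01, 0.1, 0.5, 1.0, 2.0, 5.0)]
for mle_factor, mean_factor in pairs:
    print(mle_factor, mean_factor)    # the MLE factor should always be smaller
```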

(vi) In defining γ^, does it matter for the previous conclusions whether σ^2 is the MLE or the unbiased sample variance? Explain.
