
Suppose that a single observation X is taken from the normal distribution with mean 0 and unknown standard deviation σ > 0. Find an unbiased estimator of σ, determine its variance, and show that this variance is greater than 1/I(σ) for every value of σ > 0. Note that the value of I(σ) was found in Exercise 4.

Short Answer


The estimator \(\delta(X) = \sqrt{\pi/2}\,\lvert X\rvert\) is unbiased for σ. Its variance is \(\left(\frac{\pi}{2} - 1\right)\sigma^{2}\), which is greater than \(1/I(\sigma) = \sigma^{2}/2\) for every σ > 0.

Step by step solution

01

Given the information

It is given that X is a single observation from the normal distribution with mean 0 and unknown standard deviation σ > 0; that is, X ~ Normal(µ = 0, σ²). The general Fisher-information results below are stated for i.i.d. observations X₁, …, Xₙ and then specialized to n = 1.

02

Define the p.d.f.

\(f(x \mid \mu = 0, \sigma) = \frac{1}{\sqrt{2\pi\sigma^{2}}}\exp\!\left(-\frac{x^{2}}{2\sigma^{2}}\right)\)
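As a quick numerical sanity check (not part of the textbook solution; the value sigma = 1.7 and the use of scipy.stats.norm are purely illustrative), the formula above can be compared with a library implementation of the normal density:

```python
# Illustrative check that the formula above is the N(0, sigma^2) density.
import numpy as np
from scipy.stats import norm

sigma = 1.7                       # arbitrary sigma > 0 for the check
x = np.linspace(-4.0, 4.0, 9)     # a few evaluation points

# Density written exactly as in the step above.
f_manual = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# Library density with mean 0 and standard deviation sigma.
f_library = norm.pdf(x, loc=0.0, scale=sigma)

print(np.allclose(f_manual, f_library))   # expected: True
```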

03

Define Fisher information

Assume \(X \sim f(x \mid \theta)\) (p.d.f. or p.m.f.) with \(\theta \in \Theta \subseteq \mathbb{R}\).

Then the Fisher information is defined by

\(I_X(\theta) = E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X \mid \theta)\right)^{2}\right] = E_\theta\!\left[-\frac{\partial^{2}}{\partial\theta^{2}}\log f(X \mid \theta)\right]\)

and, for a random sample \(X_1, \ldots, X_n\),

\(I_n(\theta) = n\,I_{X_1}(\theta)\)
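As an illustration of this definition (not part of the textbook solution), a small Monte Carlo sketch can approximate \(E_\sigma\!\left[\left(\frac{\partial}{\partial\sigma}\log f(X \mid \sigma)\right)^{2}\right]\) for the normal family of this problem. The helper log_f, the seed, and sigma = 1.5 are arbitrary choices; the exact value derived in the next step is \(2/\sigma^{2}\).

```python
# Monte Carlo estimate of the Fisher information I(sigma) for X ~ N(0, sigma^2),
# using the squared score with a central-difference derivative in sigma.
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.5
x = rng.normal(0.0, sigma, size=200_000)   # simulated observations

def log_f(x, s):
    """Log-density of N(0, s^2) evaluated at x."""
    return -0.5 * np.log(2 * np.pi * s**2) - x**2 / (2 * s**2)

h = 1e-5                                    # step for numerical differentiation
score = (log_f(x, sigma + h) - log_f(x, sigma - h)) / (2 * h)

print(np.mean(score**2))    # Monte Carlo estimate of I(sigma)
print(2 / sigma**2)         # exact value 2 / sigma^2 ≈ 0.889
```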

04

Calculating the Fisher information for the normal distribution

Let \(X = X_1\). The log-density is

\(\log f(x \mid \sigma) = -\tfrac{1}{2}\log(2\pi) - \log\sigma - \frac{x^{2}}{2\sigma^{2}}\)

so that

\(\frac{\partial^{2}}{\partial\sigma^{2}}\log f(x \mid \sigma) = \frac{1}{\sigma^{2}} - \frac{3x^{2}}{\sigma^{4}}\)

From the definition,

\(I_X(\sigma) = E_\sigma\!\left[-\frac{\partial^{2}}{\partial\sigma^{2}}\log f(X \mid \sigma)\right] = E_\sigma\!\left[\frac{3X^{2}}{\sigma^{4}} - \frac{1}{\sigma^{2}}\right] = \frac{3\sigma^{2}}{\sigma^{4}} - \frac{1}{\sigma^{2}} = \frac{2}{\sigma^{2}}\)

using \(E_\sigma[X^{2}] = \sigma^{2}\).

And we know that

\(I_n(\sigma) = n\,I_{X_1}(\sigma) = \frac{2n}{\sigma^{2}}\)

for a sample of size n. In this exercise there is a single observation, so n = 1 and the Fisher information is

\(I(\sigma) = \frac{2}{\sigma^{2}}\)

Therefore \(\dfrac{1}{I(\sigma)} = \dfrac{\sigma^{2}}{2}\).
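For reference, the form of the Cramér–Rao (information) inequality used in the next step: if δ(X) is an estimator with mean \(m(\sigma) = E_\sigma[\delta(X)]\) and finite variance, then

\(\operatorname{Var}_\sigma(\delta(X)) \ge \frac{[m'(\sigma)]^{2}}{I(\sigma)}\)

so for an unbiased estimator of σ (where \(m(\sigma) = \sigma\) and \(m'(\sigma) = 1\)) the bound is \(1/I(\sigma) = \sigma^{2}/2\).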

05

Unbiased estimator and comparison with the bound

An unbiased estimator of σ is \(\delta(X) = \sqrt{\pi/2}\,\lvert X\rvert\): since \(E_\sigma\lvert X\rvert = \sigma\sqrt{2/\pi}\), it follows that \(E_\sigma[\delta(X)] = \sigma\). Its variance is

\(\operatorname{Var}_\sigma(\delta(X)) = \frac{\pi}{2}E_\sigma[X^{2}] - \sigma^{2} = \left(\frac{\pi}{2} - 1\right)\sigma^{2} \approx 0.571\,\sigma^{2}\)

By the Cramér–Rao lower bound, every unbiased estimator of σ has variance at least \(1/I(\sigma) = \sigma^{2}/2\). Since \(\left(\frac{\pi}{2} - 1\right)\sigma^{2} > \frac{\sigma^{2}}{2}\) for every σ > 0, the variance of this unbiased estimator is strictly greater than the bound, as required.
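A short simulation sketch of this conclusion (illustrative only; the seed and sigma = 2.0 are arbitrary): drawing many independent copies of the single observation X, the estimator \(\sqrt{\pi/2}\,\lvert X\rvert\) should average to σ and have sample variance near \(\left(\frac{\pi}{2} - 1\right)\sigma^{2}\), which stays above the bound \(\sigma^{2}/2\).

```python
# Monte Carlo check of the unbiased estimator delta(X) = sqrt(pi/2) * |X|
# for X ~ N(0, sigma^2): mean ≈ sigma, variance ≈ (pi/2 - 1) * sigma^2 > sigma^2 / 2.
import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0
x = rng.normal(0.0, sigma, size=500_000)    # many replications of the single observation
delta = np.sqrt(np.pi / 2) * np.abs(x)

print(delta.mean())                # ≈ 2.0  (unbiased for sigma)
print(delta.var())                 # ≈ (pi/2 - 1) * sigma^2 ≈ 2.283
print(sigma**2 / 2)                # CRLB = 2.0, strictly smaller
```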


Most popular questions from this chapter

Question: Suppose that a random variable X has the geometric distribution with an unknown parameter p. (See Sec. 5.5.) Find a statistic \(\delta(X)\) that will be an unbiased estimator of \(\frac{1}{p}\).

In Example 8.2.3, suppose that we will observe n = 20 cheese chunks with lactic acid concentrations \(X_1, \ldots, X_{20}\). Find a number c so that \(P\left(\bar X_{20} \le \mu + c\sigma'\right) = 0.95\).

Question: Suppose that a random variable X can take only the five values \(x = 1, 2, 3, 4, 5\) with the following probabilities:

\(\begin{aligned} f(1 \mid \theta) &= \theta^{3}, & f(2 \mid \theta) &= \theta^{2}(1-\theta),\\ f(3 \mid \theta) &= 2\theta(1-\theta), & f(4 \mid \theta) &= \theta(1-\theta)^{2},\\ f(5 \mid \theta) &= (1-\theta)^{3}. \end{aligned}\)

Here, the value of the parameter θ is unknown (0 ≤ θ ≤ 1).

a. Verify that the sum of the five given probabilities is 1 for every value of θ.

b. Consider an estimator δc(X) that has the following form:

\(\begin{aligned} \delta_c(1) &= 1, & \delta_c(2) &= 2 - 2c, & \delta_c(3) &= c,\\ \delta_c(4) &= 1 - 2c, & \delta_c(5) &= 0. \end{aligned}\)

Show that for each constant c, \(\delta_c(X)\) is an unbiased estimator of θ.

c. Let \(\theta_0\) be a number such that \(0 < \theta_0 < 1\). Determine a constant \(c_0\) such that when \(\theta = \theta_0\), the variance of \(\delta_{c_0}(X)\) is smaller than the variance of \(\delta_c(X)\) for every other value of c.

Complete the proof of Theorem 8.5.3 by dealing with the case in which r(v, x) is strictly decreasing in v for each x.

Suppose that \(X_1, \ldots, X_n\) form a random sample from the normal distribution with unknown mean μ and known variance \(\sigma^2\). Let \(\Phi\) stand for the c.d.f. of the standard normal distribution, and let \(\Phi^{-1}\) be its inverse. Show that the following interval is a coefficient \(\gamma\) confidence interval for μ if \(\bar X_n\) is the observed average of the data values:

\(\left(\bar X_n - \Phi^{-1}\!\left(\frac{1+\gamma}{2}\right)\frac{\sigma}{n^{1/2}},\ \ \bar X_n + \Phi^{-1}\!\left(\frac{1+\gamma}{2}\right)\frac{\sigma}{n^{1/2}}\right)\)
