
Suppose that \(X_1, \ldots, X_n\) form a random sample from the Bernoulli distribution with unknown parameter p. Show that the variance of every unbiased estimator of \((1-p)^2\) must be at least \(\frac{4p(1-p)^3}{n}\).

Short Answer

Expert verified

The variance of every unbiased estimator must be at least

\(\frac{4p(1-p)^3}{n}\)

Step by step solution

01

Given the information

Suppose that \(X_1, \ldots, X_n\) form a random sample from the Bernoulli distribution with an unknown parameter p.

02

Showing the variance

Here the quantity to be estimated is \((1-p)^2\); we bound the variance of any unbiased estimator of it.

So,

\(m(p) = (1-p)^2\)

Then,

\(m'(p) = -2(1-p)\)

\([m'(p)]^2 = [-2(1-p)]^2 = 4(1-p)^2\)

The Fisher information for the Bernoulli distribution is

\(I(p) = \frac{1}{p(1-p)}\)
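This value of the Fisher information can be checked directly from the Bernoulli p.f. \(f(x \mid p) = p^x(1-p)^{1-x}\):

```latex
\begin{aligned}
\log f(x \mid p) &= x \log p + (1-x)\log(1-p) \\[4pt]
\frac{\partial}{\partial p}\log f(x \mid p)
  &= \frac{x}{p} - \frac{1-x}{1-p} = \frac{x-p}{p(1-p)} \\[4pt]
I(p) = \operatorname{E}\!\left[\left(\frac{X-p}{p(1-p)}\right)^{2}\right]
  &= \frac{\operatorname{Var}(X)}{\left[p(1-p)\right]^{2}}
   = \frac{p(1-p)}{\left[p(1-p)\right]^{2}}
   = \frac{1}{p(1-p)}
\end{aligned}
```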

Therefore, if T is an unbiased estimator of \(m(p)\), the Cramér–Rao inequality gives

\(\mathrm{Var}_p(T) \ge \frac{[m'(p)]^2}{n\,I(p)}\)

So,

\(\mathrm{Var}_p(T) \ge \frac{4(1-p)^2}{n \cdot \frac{1}{p(1-p)}} = \frac{4p(1-p)^3}{n}\)

That is,

\(\mathrm{Var}_p(T) \ge \frac{4p(1-p)^3}{n}\)

Hence proved.
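As a numerical sanity check (not part of the textbook solution), the bound can be compared against the exact variance of one particular unbiased estimator. With \(S = \sum X_i\), the statistic \(\delta(S) = \frac{(n-S)(n-S-1)}{n(n-1)}\) is unbiased for \((1-p)^2\), since it is the fraction of ordered pairs of observations that are both zero. The sketch below computes its mean and variance exactly from the binomial p.f. (the function names are my own):

```python
from math import comb

def binom_pmf(n, p, s):
    """Binomial(n, p) probability of exactly s successes."""
    return comb(n, s) * p**s * (1 - p)**(n - s)

def delta(n, s):
    """Unbiased estimator of (1-p)^2 based on S = sum of the sample:
    the fraction of ordered pairs of observations that are both 0."""
    return (n - s) * (n - s - 1) / (n * (n - 1))

def exact_mean_var(n, p):
    """Exact mean and variance of delta(S) for S ~ Binomial(n, p)."""
    mean = sum(delta(n, s) * binom_pmf(n, p, s) for s in range(n + 1))
    second = sum(delta(n, s) ** 2 * binom_pmf(n, p, s) for s in range(n + 1))
    return mean, second - mean**2

n, p = 10, 0.3
mean, var = exact_mean_var(n, p)
bound = 4 * p * (1 - p) ** 3 / n   # Cramer-Rao lower bound from the proof
print(mean, (1 - p) ** 2)          # mean equals (1-p)^2, confirming unbiasedness
print(var, bound)                  # variance sits at or above the bound
```

The variance exceeds the bound strictly here, as expected: equality in the Cramér–Rao inequality requires the estimator to be an affine function of the score, and \(\delta\) is quadratic in S.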


