
Question: Suppose that a random variable X has the geometric distribution with an unknown parameter p (see Sec. 5.5). Find a statistic \(\delta \left( X \right)\) that will be an unbiased estimator of \(\frac{1}{p}\).

Short Answer


\(\delta \left( X \right) = X\) is an unbiased estimator of \(\frac{1}{p}\); for a random sample, the sample mean \(\overline X \) is likewise unbiased.

Step by step solution

01

Given information

It is given that the random variable X has the geometric distribution with an unknown parameter p.

02

Define the p.f. and find the expectation

The p.f. (probability function) of X is \(P\left( {X = x} \right) = p{\left( {1 - p} \right)^{x - 1}}\) for \(x = 1,2,3, \ldots \)

The expectation is:

\(\begin{aligned}E\left( X \right) &= \sum\limits_{x = 1}^\infty  {x\,P\left( {X = x} \right)}  = \sum\limits_{x = 1}^\infty  {x\,p{{\left( {1 - p} \right)}^{x - 1}}} \\ &= p\sum\limits_{x = 1}^\infty  {x{{\left( {1 - p} \right)}^{x - 1}}}  = p \cdot \frac{1}{{{p^2}}}\\ &= \frac{1}{p}\end{aligned}\)

Therefore, the expectation of X is \(\frac{1}{p}\).
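The series in the second line can be verified by differentiating the geometric series term by term; a brief sketch of that identity:

\(\begin{aligned}\sum\limits_{x = 1}^\infty  {x{q^{x - 1}}}  = \frac{d}{{dq}}\sum\limits_{x = 0}^\infty  {{q^x}}  = \frac{d}{{dq}}\left( {\frac{1}{{1 - q}}} \right) = \frac{1}{{{{\left( {1 - q} \right)}^2}}},\qquad 0 < q < 1\end{aligned}\)

With \(q = 1 - p\) the sum equals \(\frac{1}{{{p^2}}}\), and multiplying by p gives \(\frac{1}{p}\).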

03

Define the unbiased estimator

An estimator \(\delta \left( X \right)\) of \(g\left( \theta  \right)\) is unbiased if \(E\left( {\delta \left( X \right)} \right) = g\left( \theta  \right)\) for all possible values of \(\theta \).

We also know that the sample mean is an unbiased estimator of the population mean, since \(E\left( {\overline X } \right) = \frac{1}{n}\sum\limits_{i = 1}^n {E\left( {{X_i}} \right)}  = E\left( X \right)\); for a single observation, \(\overline X  = X\).

Therefore, taking \(\delta \left( X \right) = \overline X \) (which is simply X here) and \(g\left( p \right) = \frac{1}{p}\),

\(E\left( {\delta \left( X \right)} \right) = E\left( {\overline X } \right) = E\left( X \right) = \frac{1}{p} = g\left( p \right)\)

for every value of p.

Therefore, \(\delta \left( X \right) = X\) (equivalently, the sample mean \(\overline X \)) is an unbiased estimator of \(\frac{1}{p}\).
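As an informal numerical check (not part of the textbook solution), here is a minimal Monte Carlo sketch in Python. It assumes NumPy, whose geometric sampler uses the same trial-counting parametrization \(P\left( {X = x} \right) = p{\left( {1 - p} \right)^{x - 1}}\), \(x = 1,2, \ldots \); the value of p, the number of draws, and the seed are arbitrary choices for illustration.

```python
import numpy as np

# Informal check that E(X) = 1/p for the geometric distribution with
# p.f. P(X = x) = p(1-p)^(x-1), x = 1, 2, 3, ...
# NumPy's geometric sampler uses this same trial-counting convention.
rng = np.random.default_rng(0)   # seed chosen arbitrarily
p = 0.3                          # illustrative value of the parameter
n_draws = 1_000_000

x = rng.geometric(p, size=n_draws)        # draws take values 1, 2, 3, ...
print("average of the draws:", x.mean())  # should be close to 1/p
print("1/p:", 1 / p)                      # = 3.333...
```

By the law of large numbers the printed average should come out close to \(1/p \approx 3.33\), consistent with X (and hence \(\overline X \)) being unbiased for \(\frac{1}{p}\).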


Most popular questions from this chapter

Question: Reconsider the conditions of Exercise 3. Suppose that n = 2, and we observe \({X_1} = 2\) and \({X_2} =  - 1\). Compute the value of the unbiased estimator of \({\left[ {E\left( X \right)} \right]^2}\) found in Exercise 3. Describe a flaw that you have discovered in the estimator.

Suppose that two random variables \(\mu \) and \(\tau \) have the joint normal-gamma distribution such that \(E\left( \mu  \right) =  - 5\), \(\mathrm{Var}\left( \mu  \right) = 1\), \(E\left( \tau  \right) = \frac{1}{2}\), and \(\mathrm{Var}\left( \tau  \right) = \frac{1}{8}\). Find the prior hyperparameters \({\mu _0},{\lambda _0},{\alpha _0},{\beta _0}\) that specify the normal-gamma distribution.

Suppose that \({X_1},{X_2}, \ldots ,{X_n}\) form a random sample from the uniform distribution on the interval \(\left( {0,1} \right)\), and let \(W\) denote the range of the sample, as defined in Example 3.9.7. Also, let \({g_n}\left( x \right)\) denote the p.d.f. of the random variable \(2n\left( {1 - W} \right)\), and let \(g\left( x \right)\) denote the p.d.f. of the \({\chi ^2}\) distribution with four degrees of freedom. Show that \(\mathop {\lim }\limits_{n \to \infty } {g_n}\left( x \right) = g\left( x \right)\) for \(x > 0\).

Question: For the conditions of Exercise 2, find an unbiased estimator of \({\left( {E\left( X \right)} \right)^2}\). Hint: \({\left( {E\left( X \right)} \right)^2} = E\left( {{X^2}} \right) - \mathrm{Var}\left( X \right)\)

At the end of Example 8.5.11, compute the probability that \(\left| {{{\bar X}_2} - \theta } \right| < 0.3\) given Z = 0.9. Why is it so large?
