
Suppose that \({X_1}, \ldots ,{X_n}\) form a random sample from the Poisson distribution with unknown mean \(\theta \), and let

\(Y = \sum\nolimits_{i = 1}^n {{X_i}} \).

a. Determine the value of a constant c such that the estimator \({e^{ - cY}}\) is an unbiased estimator of \({e^{ - \theta }}\).

b. Use the information inequality to obtain a lower bound for the variance of the unbiased estimator found in part (a).

Short Answer

  1. The value of constant c is \(\log \left( {\frac{n}{{n - 1}}} \right)\).
  2. \(Var\left( {\exp \left( { - cY} \right)} \right) \ge \frac{{\theta \exp \left( { - 2\theta } \right)}}{n}\).

Step by step solution

Step 1: Given information

Suppose \({X_1}, \ldots ,{X_n}\) form a random sample from the Poisson distribution with unknown mean \(\theta \), and let

\(Y = \sum\nolimits_{i = 1}^n {{X_i}} \).

Step 2: Finding the value of the constant c

a.

Since Y is the sum of n independent Poisson variables, Y has a Poisson distribution with mean \(n\theta \). It then follows that

\(\begin{align}E\left( {\exp \left( { - cY} \right)} \right) &= \sum\limits_{y = 0}^\infty {\frac{{\exp \left( { - cy} \right)\exp \left( { - n\theta } \right){{\left( {n\theta } \right)}^y}}}{{y!}}} \\ &= \exp \left( { - n\theta } \right)\sum\limits_{y = 0}^\infty {\frac{{{{\left( {n\theta \exp \left( { - c} \right)} \right)}^y}}}{{y!}}} \\ &= \exp \left( { - n\theta } \right)\exp \left( {n\theta \exp \left( { - c} \right)} \right)\\ &= \exp \left( {n\theta \left( {\exp \left( { - c} \right) - 1} \right)} \right)\end{align}\)

Since this expectation must be \(\exp \left( { - \theta } \right)\), it follows that

\(\begin{align}n\left( {\exp \left( { - c} \right) - 1} \right) &= - 1\\\left( {\exp \left( { - c} \right) - 1} \right) &= - \frac{1}{n}\\\exp \left( { - c} \right) &= - \frac{1}{n} + 1\\\exp \left( { - c} \right) &= \frac{{ - 1 + n}}{n}\\ - c &= \log \left( {\frac{{n - 1}}{n}} \right)\\c &= \log \left( {\frac{n}{{n - 1}}} \right)\end{align}\)

Therefore, the value of constant c is \(\log \left( {\frac{n}{{n - 1}}} \right)\).
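As a quick sanity check (not part of the textbook solution), the identity \(E\left( {\exp \left( { - cY} \right)} \right) = \exp \left( { - \theta } \right)\) can be verified numerically by summing the Poisson pmf directly. The sample values n = 10 and θ = 2 below are arbitrary choices for illustration.

```python
import math

# Verify numerically that E[exp(-c*Y)] = exp(-theta) when
# c = log(n/(n-1)) and Y ~ Poisson(n*theta).
def expected_exp_neg_cY(n, theta, terms=200):
    """Sum exp(-c*y) * P(Y = y) over the Poisson(n*theta) pmf,
    working in logs to avoid overflow in theta**y / y!."""
    c = math.log(n / (n - 1))
    mean = n * theta
    return sum(
        math.exp(-c * y - mean + y * math.log(mean) - math.lgamma(y + 1))
        for y in range(terms)
    )

n, theta = 10, 2.0
print(expected_exp_neg_cY(n, theta))  # ≈ exp(-2) ≈ 0.1353
```

The truncation at 200 terms is harmless here because the Poisson(20) tail beyond y = 200 is negligible.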

Step 3: Finding the lower bound of the variance

b.

Here,

\(f\left( {x\left| \theta \right.} \right) = \frac{{\exp \left( { - \theta } \right){\theta ^x}}}{{x!}}\)

Taking the logarithm of both sides,

\(\begin{align}\log f\left( {x\left| \theta \right.} \right) &= \lambda \left( {x\left| \theta \right.} \right)\\ &= - \theta + x\log \theta - \log \left( {x!} \right)\end{align}\)

Then,

\(\begin{align}\lambda '\left( {x\left| \theta \right.} \right) &= - 1 + \frac{x}{\theta }\\\lambda ''\left( {x\left| \theta \right.} \right) &= - \frac{x}{{{\theta ^2}}}\end{align}\)

Then,

\(\begin{align}I\left( \theta \right) &= - {E_\theta }\left( {\lambda ''\left( {X\left| \theta \right.} \right)} \right)\\ &= \frac{{E\left( X \right)}}{{{\theta ^2}}}\\ &= \frac{1}{\theta }\end{align}\)
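The Fisher information can also be checked numerically (an illustrative check, not part of the original solution), using the equivalent form \(I\left( \theta \right) = {E_\theta }\left( {\lambda '{{\left( {X\left| \theta \right.} \right)}^2}} \right)\); for θ = 2 the result should be 1/θ = 0.5.

```python
import math

# Numeric check: Fisher information of one Poisson(theta) observation is 1/theta,
# computed as E[(lambda'(X|theta))^2] with lambda'(x|theta) = -1 + x/theta.
def fisher_information(theta, terms=200):
    total = 0.0
    for x in range(terms):
        # Poisson pmf computed in logs for numerical stability
        pmf = math.exp(-theta + x * math.log(theta) - math.lgamma(x + 1))
        score = x / theta - 1.0
        total += score**2 * pmf
    return total

print(fisher_information(2.0))  # ≈ 0.5 = 1/theta
```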

Since \(m\left( \theta \right) = \exp \left( { - \theta } \right)\), we have \(m'\left( \theta \right) = - \exp \left( { - \theta } \right)\). The information inequality (Cramér–Rao bound) for an unbiased estimator of \(m\left( \theta \right)\) based on n observations then gives

\(Var\left( {\exp \left( { - cY} \right)} \right) \ge \frac{{{{\left( {m'\left( \theta \right)} \right)}^2}}}{{nI\left( \theta \right)}} = \frac{{\exp \left( { - 2\theta } \right)}}{{n/\theta }} = \frac{{\theta \exp \left( { - 2\theta } \right)}}{n}\)
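Since the estimator is a function of the Poisson sum Y, its exact variance is available in closed form via \(E\left( {\exp \left( { - aY} \right)} \right) = \exp \left( {n\theta \left( {{e^{ - a}} - 1} \right)} \right)\), so we can confirm (as an illustrative check, not part of the original solution) that the exact variance indeed dominates the Cramér–Rao bound. The values n = 10, θ = 2 below are arbitrary.

```python
import math

# Exact variance of exp(-c*Y): E[exp(-2cY)] - (E[exp(-cY)])^2,
# using E[exp(-aY)] = exp(n*theta*(exp(-a) - 1)) for Y ~ Poisson(n*theta).
def exact_variance(n, theta):
    c = math.log(n / (n - 1))
    second_moment = math.exp(n * theta * (math.exp(-2 * c) - 1))
    return second_moment - math.exp(-2 * theta)  # first moment is exp(-theta)

def cramer_rao_bound(n, theta):
    return theta * math.exp(-2 * theta) / n

n, theta = 10, 2.0
print(exact_variance(n, theta))     # ≈ 0.00406
print(cramer_rao_bound(n, theta))   # ≈ 0.00366
```

The exact variance sits slightly above the bound, as expected: the estimator is unbiased but not efficient for finite n.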


