
Suppose that \(X\) is a random variable for which the p.d.f. or the p.f. is \(f\left( {x|\theta } \right)\), where the value of the parameter \(\theta \) is unknown but must lie in an open interval \(\Omega \). Let \({I_0}\left( \theta \right)\) denote the Fisher information in \(X\). Suppose now that the parameter \(\theta \) is replaced by a new parameter \(\mu \), where \(\theta = \psi \left( \mu \right)\) and \(\psi \) is a differentiable function. Let \({I_1}\left( \mu \right)\) denote the Fisher information in \(X\) when the parameter is regarded as \(\mu \). Show that \({I_1}\left( \mu \right) = {\left( {{\psi ^{'}}\left( \mu \right)} \right)^{2}}{I_0}\left( {\psi \left( \mu \right)} \right)\).

Short Answer

Expert verified

\({I_1}\left( \mu \right) = {\left({{\psi^{'}}\left( \mu \right)} \right)^{2}}{I_0} \left( {\psi \left( \mu \right)} \right)\)

Step by step solution

01

Given information

Let X be the random variable for which the p.d.f. is \(f\left( {x|\theta } \right)\), where \(\theta \) is unknown but must lie in an open interval \(\Omega \).

Let \({I_0}\left( \theta \right)\) be the Fisher information in X.

The parameter \(\theta \) is replaced by a new parameter \(\mu \), where \(\theta = \psi \left( \mu \right)\) and \(\psi \) is a differentiable function.

Let \({I_1}\left( \mu \right)\) denote the Fisher information in X when the parameter is regarded as \(\mu \).

02

Fisher information

Writing \(\lambda \left( {x|\theta } \right) = \log f\left( {x|\theta } \right)\), the Fisher information \(I\left( \theta \right)\) in the random variable X is defined as

\(I\left( \theta \right) = {E_\theta }\left\{ {{{\left( {{\lambda ^{'}}\left( {x|\theta } \right)} \right)}^{2}}} \right\}\)

Or

\(I\left( \theta \right) = - {E_\theta }\left\{ {\left( {{\lambda ^{''}}\left( {x|\theta } \right)} \right)} \right\}\)
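The equivalence of these two expressions can be illustrated numerically. The sketch below uses the exponential distribution with rate \(\theta\) (an illustrative choice, not part of the original exercise), for which \(\lambda \left( {x|\theta } \right) = \log \theta - \theta x\), so the score is \(1/\theta - x\), the second derivative is the constant \(-1/\theta^2\), and both formulas give \(I\left( \theta \right) = 1/\theta^2\):

```python
import numpy as np

# Monte Carlo check that the two Fisher information formulas agree,
# using the exponential distribution f(x|theta) = theta * exp(-theta x).
rng = np.random.default_rng(0)
theta = 2.0
x = rng.exponential(scale=1.0 / theta, size=1_000_000)

# lambda(x|theta) = log(theta) - theta * x
score = 1.0 / theta - x          # lambda'(x|theta)
second = -1.0 / theta**2         # lambda''(x|theta), constant in x

I_from_score = np.mean(score**2)   # E[(lambda')^2], Monte Carlo estimate
I_from_curvature = -second         # -E[lambda''], exact here
exact = 1.0 / theta**2             # known closed form, 0.25 for theta = 2

print(I_from_score, I_from_curvature, exact)
```

Both estimates should land near 0.25 for \(\theta = 2\), matching the closed form.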

03

Verifying \({I_1}\left( \mu  \right) = {\left( {{\psi ^{'}}\left( \mu  \right)} \right)^{2}}{I_0}\left( {\psi \left( \mu  \right)} \right)\)

Let \(g\left( {x|\mu } \right)\) be the pdf of X when \(\mu \) is parameter.

Then \(g\left( {x|\mu } \right) = f\left( {x|\psi \left( \mu \right)} \right)\)

Hence taking log on both the sides

\(\begin{align}\log g\left( {x|\mu } \right) &= \log f\left( {x|\psi \left( \mu \right)} \right)\\ &= \lambda \left( {x|\psi \left( \mu \right)} \right)\end{align}\)

Differentiate \(\lambda \left( {x|\psi \left( \mu \right)} \right)\) with respect to \(\mu \). By the chain rule,

\(\frac{\partial }{{\partial \mu }}\lambda \left( {x|\psi \left( \mu \right)} \right) = {\lambda ^{'}}\left( {x|\psi \left( \mu \right)} \right){\psi ^{'}}\left( \mu \right)\)

where \({\lambda ^{'}}\) denotes the derivative of \(\lambda \) with respect to its parameter argument, evaluated at \(\psi \left( \mu \right)\).

Then the Fisher information in X when the parameter is regarded as \(\mu \) is

\(\begin{align}{I_1}\left( \mu \right) &= {E_\mu }\left\{ {{{\left( {\frac{\partial }{{\partial \mu }}\lambda \left( {x|\psi \left( \mu \right)} \right)} \right)}^{2}}} \right\}\\ &= {E_\mu }\left\{ {{{\left( {{\lambda ^{'}}\left( {x|\psi \left( \mu \right)} \right){\psi ^{'}}\left( \mu \right)} \right)}^{2}}} \right\}\\ &= {\left( {{\psi ^{'}}\left( \mu \right)} \right)^{2}}{E_\mu }\left\{ {{{\left( {{\lambda ^{'}}\left( {x|\psi \left( \mu \right)} \right)} \right)}^{2}}} \right\}\end{align}\)

Since \({E_\mu }\left\{ {{{\left( {{\lambda ^{'}}\left( {x|\psi \left( \mu \right)} \right)} \right)}^{2}}} \right\} = {I_0}\left( {\psi \left( \mu \right)} \right)\), it follows that

\({I_1}\left( \mu \right) = {\left( {{\psi ^{'}}\left( \mu \right)} \right)^{2}}{I_0}\left( {\psi \left( \mu \right)} \right)\)

Hence proved.
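The identity can also be checked numerically. The sketch below again uses the exponential distribution (an illustrative choice, not from the original exercise) with the reparametrization \(\theta = \psi \left( \mu \right) = e^\mu\), so \({I_0}\left( \theta \right) = 1/\theta^2\) and the right-hand side \({\left( {{\psi ^{'}}\left( \mu \right)} \right)^2}{I_0}\left( {\psi \left( \mu \right)} \right) = e^{2\mu} \cdot e^{-2\mu} = 1\) for every \(\mu\):

```python
import numpy as np

# Monte Carlo check of I1(mu) = (psi'(mu))^2 * I0(psi(mu)) for the
# exponential distribution with theta = psi(mu) = exp(mu).
rng = np.random.default_rng(1)
mu = 0.7
theta = np.exp(mu)               # psi(mu)
psi_prime = np.exp(mu)           # psi'(mu)

x = rng.exponential(scale=1.0 / theta, size=1_000_000)

# log g(x|mu) = mu - exp(mu) * x, so the score with respect to mu is:
score_mu = 1.0 - theta * x
I1_mc = np.mean(score_mu**2)     # Monte Carlo estimate of I1(mu)

I0 = 1.0 / theta**2              # Fisher information of the exponential
I1_formula = psi_prime**2 * I0   # right-hand side of the identity (= 1 here)

print(I1_mc, I1_formula)
```

The Monte Carlo estimate of \({I_1}\left( \mu \right)\) should agree with the closed-form right-hand side, which equals 1 for this particular reparametrization.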

