
Consider again the conditions of Exercise 19, and let \({{\bf{\hat \beta }}_{\bf{n}}}\) denote the M.L.E. of β.

a. Use the delta method to determine the asymptotic distribution of \(\frac{{\bf{1}}}{{{{{\bf{\hat \beta }}}_{\bf{n}}}}}\).

b. Show that \(\frac{{\bf{1}}}{{{{{\bf{\hat \beta }}}_{\bf{n}}}}}{\bf{ = }}{{\bf{\bar X}}_{\bf{n}}}\), and use the central limit theorem to determine the asymptotic distribution of \(\frac{{\bf{1}}}{{{{{\bf{\hat \beta }}}_{\bf{n}}}}}\).

Short Answer

Expert verified
  1. The asymptotic distribution of \({\left( {n{\beta ^2}} \right)^{\frac{1}{2}}}\left( {\frac{1}{{{{\hat \beta }_n}}} - \frac{1}{\beta }} \right)\) is the standard normal distribution.
  2. Proved.

Step by step solution

01

Given information

Referring to Exercise 19, suppose that \({X_1},...,{X_n}\) form a random sample from the exponential distribution with unknown parameter \(\beta \).

The p.d.f. of an exponential distribution is,

\(f\left( {x\left| \beta \right.} \right) = \beta \exp \left( { - \beta x} \right)\)

02

Finding the asymptotic distribution

a.

The p.d.f. of an exponential distribution is,

\(f\left( {x\left| \beta \right.} \right) = \beta \exp \left( { - \beta x} \right)\)

The mean of the exponential distribution is,

\(E\left( X \right) = \frac{1}{\beta }\)

Let,

\(\alpha \left( \beta \right) = \frac{1}{\beta }\)

Then,

\(\alpha '\left( \beta \right) = - \frac{1}{{{\beta ^2}}}\)

It is known that \({\hat \beta _n}\) is approximately normal with mean \(\beta \) and variance \(\frac{{{\beta ^2}}}{n}\).

The delta method is a general technique for finding the asymptotic distribution of a smooth function of an asymptotically normal random variable: if \({\hat \beta _n}\) is approximately normal with mean \(\beta \) and variance \(\frac{{{\beta ^2}}}{n}\), and \(\alpha \) is differentiable with \(\alpha '\left( \beta \right) \ne 0\), then \(\alpha \left( {{{\hat \beta }_n}} \right)\) is approximately normal with mean \(\alpha \left( \beta \right)\) and variance \({\left( {\alpha '\left( \beta \right)} \right)^2}\frac{{{\beta ^2}}}{n}\).

Therefore, \(\frac{1}{{{{\hat \beta }_n}}}\) will be approximately normal with mean \(\frac{1}{\beta }\) and variance \({\left( {\alpha '\left( \beta \right)} \right)^2}\left( {\frac{{{\beta ^2}}}{n}} \right) = \frac{1}{{n{\beta ^2}}}\).

Equivalently, the asymptotic distribution of \({\left( {n{\beta ^2}} \right)^{\frac{1}{2}}}\left( {\frac{1}{{{{\hat \beta }_n}}} - \frac{1}{\beta }} \right)\) is the standard normal distribution.
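This conclusion can be sanity-checked by simulation: a minimal Python sketch in which the true rate \(\beta = 2\), the sample size, and the number of replications are illustrative choices, not values from the exercise. It draws repeated exponential samples, computes the M.L.E. \({\hat \beta _n} = 1/{\bar X_n}\), and standardizes \(1/{\hat \beta _n}\) with the delta-method variance \(1/(n{\beta ^2})\); the resulting values should have mean near 0 and variance near 1.

```python
import random
import statistics

random.seed(0)
beta = 2.0    # true rate parameter (illustrative choice)
n = 500       # sample size per replication
reps = 2000   # number of replications

z = []
for _ in range(reps):
    xs = [random.expovariate(beta) for _ in range(n)]
    beta_hat = n / sum(xs)  # M.L.E. of beta: reciprocal of the sample mean
    # standardize 1/beta_hat using the delta-method variance 1/(n * beta^2)
    z.append((n * beta**2) ** 0.5 * (1 / beta_hat - 1 / beta))

print(round(statistics.mean(z), 2))      # close to 0
print(round(statistics.variance(z), 2))  # close to 1
```

The empirical mean and variance of the standardized values match the standard normal limit, as the delta method predicts.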

03

Proving part

b.

From Exercise 19, the M.L.E. of \(\beta \) is \({\hat \beta _n} = \frac{1}{{{{\bar X}_n}}}\), since the log-likelihood \(n\log \beta - \beta \sum\limits_{i = 1}^n {{X_i}} \) is maximized at \(\beta = \frac{n}{{\sum\limits_{i = 1}^n {{X_i}} }}\). Taking reciprocals gives \(\frac{1}{{{{\hat \beta }_n}}} = {\bar X_n}\).

The mean of the exponential distribution is \(\frac{1}{\beta }\) and its variance is \(\frac{1}{{{\beta ^2}}}\).

By the central limit theorem, \({\bar X_n}\) is approximately normal with mean \(\frac{1}{\beta }\) and variance \(\frac{1}{{n{\beta ^2}}}\), so the asymptotic distribution of \(\frac{1}{{{{\hat \beta }_n}}} = {\bar X_n}\) is exactly the one found in part (a).

Hence, proved.
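The identity \(1/{\hat \beta _n} = {\bar X_n}\) can also be checked numerically. The following minimal Python sketch (assumed values: true rate 1.5, sample size 1000, and a crude grid search in place of calculus, all illustrative choices) maximizes the exponential log-likelihood over a grid and compares the reciprocal of the maximizer to the sample mean.

```python
import math
import random

random.seed(1)
beta_true = 1.5  # true rate parameter (illustrative choice)
xs = [random.expovariate(beta_true) for _ in range(1000)]
s = sum(xs)
n = len(xs)

def loglik(b):
    # log-likelihood of an exponential sample: n*log(b) - b * sum(x_i)
    return n * math.log(b) - b * s

# crude grid search for the maximizing value of beta
grid = [0.001 * k for k in range(1, 5000)]
beta_hat = max(grid, key=loglik)

xbar = s / n
# 1 / beta_hat coincides with the sample mean (up to grid resolution)
print(abs(1 / beta_hat - xbar) < 1e-2)
```

Up to the grid resolution, the numerical maximizer satisfies \({\hat \beta _n} = 1/{\bar X_n}\), matching the closed-form M.L.E.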


