
Prove that the distribution of \({\hat \sigma _0}^2\) in Examples 8.2.1 and 8.2.2 is the gamma distribution with parameters \(\frac{n}{2}\) and \(\frac{n}{2\sigma^2}\).

Short Answer


\({\hat \sigma _0}^2\) follows the gamma distribution with parameters \(\frac{n}{2}\) and \(\frac{n}{2\sigma^2}\).

Step by step solution

01

Given information

\(X_1, X_2, \ldots, X_n\) form a random sample from the normal distribution with known mean \(\mu\) and unknown variance \(\sigma^2\), and \({\hat \sigma _0}^2 = \frac{1}{n}\sum_{i = 1}^{n} {\left( X_i - \mu \right)}^2\) is the M.L.E. of \(\sigma^2\) from Examples 8.2.1 and 8.2.2.

02

Derive the distribution of \({\hat \sigma _0}^2\)

Since the mean \(\mu\) is known, the variables \(\frac{X_i - \mu}{\sigma}\) are independent standard normal, so \(Z = \frac{n{{\hat \sigma }_0}^2}{\sigma^2} = \sum_{i = 1}^{n} {\left( \frac{X_i - \mu}{\sigma} \right)}^2\) follows the \(\chi^2\) distribution with \(n\) degrees of freedom.

Equivalently, \(Z\) follows the gamma distribution with parameters \(\frac{n}{2}\) and \(\frac{1}{2}\).
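For reference, the \(\chi^2\) density with \(n\) degrees of freedom can be written directly as a gamma density (a standard identity, spelled out here only to make the equivalence explicit):

\(f_Z\left( z \right) = \frac{1}{2^{n/2}\,\Gamma\left( \frac{n}{2} \right)}\,z^{\frac{n}{2} - 1}e^{-\frac{z}{2}} = \frac{{\left( \frac{1}{2} \right)}^{n/2}}{\Gamma\left( \frac{n}{2} \right)}\,z^{\frac{n}{2} - 1}e^{-\frac{z}{2}},\qquad z > 0,\)

which is the gamma density with shape parameter \(\frac{n}{2}\) and rate parameter \(\frac{1}{2}\).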

Let \(c = \frac{\sigma^2}{n}\).

Then,

\(\begin{align}Z &= \frac{n{{\hat \sigma }_0}^2}{\sigma^2}\\ &= \frac{1}{c}{{\hat \sigma }_0}^2\\ \Rightarrow cZ &= {{\hat \sigma }_0}^2\\ \Rightarrow {{\hat \sigma }_0}^2 &= \frac{\sigma^2}{n}Z\end{align}\)
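The conclusion below uses the scaling property of the gamma distribution: multiplying a gamma random variable by a constant \(c > 0\) keeps the shape parameter and divides the rate parameter by \(c\). A short change-of-variables sketch, writing \(g\) for the gamma\(\left( \frac{n}{2}, \frac{1}{2} \right)\) density of \(Z\) and taking \(t > 0\):

\(\begin{align}f_{{{\hat \sigma }_0}^2}\left( t \right) &= \frac{1}{c}\,g\left( \frac{t}{c} \right)\\ &= \frac{1}{c}\cdot\frac{{\left( \frac{1}{2} \right)}^{n/2}}{\Gamma\left( \frac{n}{2} \right)}{\left( \frac{t}{c} \right)}^{\frac{n}{2} - 1}e^{-\frac{t}{2c}}\\ &= \frac{{\left( \frac{1}{2c} \right)}^{n/2}}{\Gamma\left( \frac{n}{2} \right)}\,t^{\frac{n}{2} - 1}e^{-\frac{t}{2c}},\end{align}\)

and with \(c = \frac{\sigma^2}{n}\) the rate is \(\frac{1}{2c} = \frac{n}{2\sigma^2}\).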

Hence, \({\hat \sigma _0}^2 = \frac{\sigma^2}{n}Z\) follows the gamma distribution with parameters \(\frac{n}{2}\) and \(\frac{1}{2}\cdot\frac{n}{\sigma^2} = \frac{n}{2\sigma^2}\).

This completes the proof.
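As an optional numerical check (not part of the textbook solution), the minimal sketch below simulates \({\hat \sigma _0}^2\) for arbitrarily chosen illustrative values \(n = 5\), \(\mu = 0\), \(\sigma = 2\) and compares its empirical distribution with the claimed gamma law using SciPy. Note that scipy.stats.gamma is parameterized by shape and scale, so the rate \(\frac{n}{2\sigma^2}\) enters as scale \(\frac{2\sigma^2}{n}\).

# Simulation check: sigma0_hat^2 = (1/n) * sum((X_i - mu)^2) should follow
# the gamma distribution with shape n/2 and rate n/(2*sigma^2).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, mu, sigma = 5, 0.0, 2.0        # arbitrary illustrative values
reps = 100_000                    # number of simulated samples

x = rng.normal(mu, sigma, size=(reps, n))
sigma0_hat_sq = ((x - mu) ** 2).mean(axis=1)   # M.L.E. of sigma^2 with mu known

# SciPy uses a scale parameter, so scale = 1/rate = 2*sigma^2 / n.
claimed = stats.gamma(a=n / 2, scale=2 * sigma**2 / n)

# Kolmogorov-Smirnov distance between the simulated values and the claimed
# gamma distribution; the statistic should be close to 0 if the result holds.
res = stats.kstest(sigma0_hat_sq, claimed.cdf)
print(f"KS statistic: {res.statistic:.4f}, p-value: {res.pvalue:.3f}")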


Most popular questions from this chapter

By using the table of the t distribution given in the back of this book, determine the value of the integral

\(\int\limits_{ - \infty }^{2.5} {\frac{{dx}}{{{{\left( {12 + {x^2}} \right)}^2}}}} \)

Suppose that \(X_1, \ldots, X_n\) form a random sample from a distribution for which the p.d.f. is as follows:

\(f\left( x \mid \theta \right) = \begin{cases}\theta x^{\theta - 1} & \text{for } 0 < x < 1,\\ 0 & \text{otherwise,}\end{cases}\)

where the value of θ is unknown (θ > 0). Determine the asymptotic distribution of the M.L.E. of θ. (Note: The M.L.E. was found in Exercise 9 of Sec. 7.5.)

Suppose that \(X_1, \ldots, X_n\) form a random sample from a normal distribution for which the mean is known and the variance is unknown. Construct an efficient estimator that is not identically equal to a constant, and determine the expectation and the variance of this estimator.

Suppose that each of two statisticians, A and B, independently takes a random sample of 20 observations from the normal distribution with unknown mean μ and known variance 4. Suppose also that statistician A finds the sample variance in his random sample to be 3.8, and statistician B finds the sample variance in her random sample to be 9.4. For which random sample is the sample mean likely to be closer to the unknown value of μ?

Question: Suppose that a random variable X can take only the five values \(x = 1, 2, 3, 4, 5\) with the following probabilities:

\(\begin{aligned}f\left( 1 \mid \theta \right) &= \theta^3, & f\left( 2 \mid \theta \right) &= \theta^2\left( 1 - \theta \right),\\ f\left( 3 \mid \theta \right) &= 2\theta\left( 1 - \theta \right), & f\left( 4 \mid \theta \right) &= \theta{\left( 1 - \theta \right)}^2,\\ f\left( 5 \mid \theta \right) &= {\left( 1 - \theta \right)}^3.\end{aligned}\)

Here, the value of the parameter θ is unknown (0 ≤ θ ≤ 1).

a. Verify that the sum of the five given probabilities is 1 for every value of θ.

b. Consider an estimator \(\delta_c\left( X \right)\) that has the following form:

\(\begin{aligned}\delta_c\left( 1 \right) &= 1, & \delta_c\left( 2 \right) &= 2 - 2c, & \delta_c\left( 3 \right) &= c,\\ \delta_c\left( 4 \right) &= 1 - 2c, & \delta_c\left( 5 \right) &= 0.\end{aligned}\)

Show that for each constant c, \(\delta_c\left( X \right)\) is an unbiased estimator of θ.

c. Let \(\theta_0\) be a number such that \(0 < \theta_0 < 1\). Determine a constant \(c_0\) such that when \(\theta = \theta_0\), the variance of \(\delta_{c_0}\left( X \right)\) is smaller than the variance of \(\delta_c\left( X \right)\) for every other value of c.
