
Question: Suppose that a random variable X can take only the five values \(x = 1,2,3,4,5\) with the following probabilities:

\(\begin{aligned}f\left( {1\left| \theta \right.} \right) &= {\theta ^3},\quad f\left( {2\left| \theta \right.} \right) = {\theta ^2}\left( {1 - \theta } \right),\\f\left( {3\left| \theta \right.} \right) &= 2\theta \left( {1 - \theta } \right),\quad f\left( {4\left| \theta \right.} \right) = \theta {\left( {1 - \theta } \right)^2},\\f\left( {5\left| \theta \right.} \right) &= {\left( {1 - \theta } \right)^3}.\end{aligned}\)

Here, the value of the parameter θ is unknown (0 ≤ θ ≤ 1).

a. Verify that the sum of the five given probabilities is 1 for every value of θ.

b. Consider an estimator δc(X) that has the following form:

\(\begin{aligned}{\delta _c}\left( 1 \right) &= 1,\quad {\delta _c}\left( 2 \right) = 2 - 2c,\quad {\delta _c}\left( 3 \right) = c,\\{\delta _c}\left( 4 \right) &= 1 - 2c,\quad {\delta _c}\left( 5 \right) = 0.\end{aligned}\)

Show that for each constant c, \({\delta _c}\left( X \right)\) is an unbiased estimator of \(\theta \).

c. Let \({\theta _0}\) be a number such that \(0 < {\theta _0} < 1\). Determine a constant \({c_0}\) such that when \(\theta = {\theta _0}\), the variance of \({\delta _{{c_0}}}\left( X \right)\) is smaller than the variance of \({\delta _c}\left( X \right)\) for every other value of c.

Short Answer

Expert verified

a. Proved. The sum of the given probabilities is 1 for every value of \(\theta \).

b. Proved. For each constant c, \({\delta _c}\left( X \right)\) is an unbiased estimator of \(\theta \).

c. The value of \({c_0}\) is \(\frac{{1 + {\theta _0}}}{3}\).

Step by step solution

01

Given information

A random variable X takes five values\(x = 1,2,3,4,5\)with probabilities,

\(\begin{aligned}f\left( {1\left| \theta \right.} \right) &= {\theta ^3},\quad f\left( {2\left| \theta \right.} \right) = {\theta ^2}\left( {1 - \theta } \right),\\f\left( {3\left| \theta \right.} \right) &= 2\theta \left( {1 - \theta } \right),\quad f\left( {4\left| \theta \right.} \right) = \theta {\left( {1 - \theta } \right)^2},\\f\left( {5\left| \theta \right.} \right) &= {\left( {1 - \theta } \right)^3}.\end{aligned}\)

An estimator \({\delta _c}\left( X \right)\) has the form,

\(\begin{aligned}{\delta _c}\left( 1 \right) &= 1,\quad {\delta _c}\left( 2 \right) = 2 - 2c,\quad {\delta _c}\left( 3 \right) = c,\\{\delta _c}\left( 4 \right) &= 1 - 2c,\quad {\delta _c}\left( 5 \right) = 0.\end{aligned}\)

02

(a) Checking the probabilities

\(\begin{aligned}{}f\left( {1\left| \theta \right.} \right) + f\left( {2\left| \theta \right.} \right) &= {\theta ^3} + {\theta ^2}\left( {1 - \theta } \right)\\ &= {\theta ^3} + {\theta ^2} - {\theta ^3}\\ &= {\theta ^2}\end{aligned}\)

\(\begin{aligned}{}f\left( {4\left| \theta \right.} \right) + f\left( {5\left| \theta \right.} \right) &= \theta {\left( {1 - \theta } \right)^2} + {\left( {1 - \theta } \right)^3}\\ &= {\left( {1 - \theta } \right)^2}\left( {\theta + 1 - \theta } \right)\\ &= {\left( {1 - \theta } \right)^2}\end{aligned}\)

\(f\left( {3\left| \theta \right.} \right) = 2\theta \left( {1 - \theta } \right)\)

The total of the five probabilities on the left sides of these equations equals the sum of the probabilities on the right sides, which is

\(\begin{aligned}{}{\theta ^2} + {\left( {1 - \theta } \right)^2} + 2\theta \left( {1 - \theta } \right) &= {\theta ^2} + 1 - 2\theta + {\theta ^2} + 2\theta - 2{\theta ^2}\\ &= 1\end{aligned}\)

Hence, the sum of the given probabilities is 1 for every value of \(\theta \).
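The identity in part (a) can also be spot-checked numerically. The sketch below (not part of the textbook solution; the helper name `pmf` is ours) sums the five probabilities over a grid of \(\theta \) values:

```python
# Numerical cross-check of part (a): the five probabilities should sum to 1
# for every theta in [0, 1]. (Sketch only; `pmf` is our own helper name.)

def pmf(theta):
    """Return [f(1|theta), ..., f(5|theta)] as given in the problem."""
    return [
        theta ** 3,
        theta ** 2 * (1 - theta),
        2 * theta * (1 - theta),
        theta * (1 - theta) ** 2,
        (1 - theta) ** 3,
    ]

# Check the sum at 101 evenly spaced theta values, including both endpoints.
for k in range(101):
    theta = k / 100
    assert abs(sum(pmf(theta)) - 1.0) < 1e-12

print("sum check passed")
```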

03

(b) Finding the unbiased estimator

\(\begin{aligned}{}{E_\theta }\left( {{\delta _c}\left( X \right)} \right) &= \sum\limits_{x = 1}^5 {{\delta _c}\left( x \right)f\left( {x\left| \theta \right.} \right)} \\ &= \left( {1 \times {\theta ^3}} \right) + \left( {\left( {2 - 2c} \right) \times {\theta ^2} \times \left( {1 - \theta } \right)} \right) + \left( {c \times 2\theta \left( {1 - \theta } \right)} \right) + \left( {\left( {1 - 2c} \right) \times \theta \times {{\left( {1 - \theta } \right)}^2}} \right) + 0\\ &= \theta \end{aligned}\)

The sum of the coefficients of\({\theta ^3}\)is 0, the sum of the coefficients of\({\theta ^2}\)is 0, the sum of coefficients of\(\theta \)is 1, and the constant term is 0.

Hence, \(E\left[ {{\delta _c}\left( X \right)} \right] = \theta \), so for each constant c, \({\delta _c}\left( X \right)\) is an unbiased estimator of \(\theta \).
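The unbiasedness claim in part (b) can likewise be spot-checked numerically. In this sketch (not part of the textbook solution; the helper names `pmf`, `delta`, and `expectation` are ours), \(E\left[ {{\delta _c}\left( X \right)} \right]\) is computed directly and compared with \(\theta \) for several values of c, including deliberately "strange" choices:

```python
# Numerical spot-check of part (b): E_theta[delta_c(X)] = theta for every c.
# (Sketch only; `pmf`, `delta`, and `expectation` are our own helper names.)

def pmf(theta):
    """Return [f(1|theta), ..., f(5|theta)] as given in the problem."""
    return [
        theta ** 3,
        theta ** 2 * (1 - theta),
        2 * theta * (1 - theta),
        theta * (1 - theta) ** 2,
        (1 - theta) ** 3,
    ]

def delta(x, c):
    """The estimator delta_c evaluated at x = 1, ..., 5."""
    return {1: 1.0, 2: 2 - 2 * c, 3: c, 4: 1 - 2 * c, 5: 0.0}[x]

def expectation(c, theta):
    """E_theta[delta_c(X)], summing delta_c(x) * f(x|theta) over x."""
    return sum(delta(x, c) * p for x, p in enumerate(pmf(theta), start=1))

# The identity holds for any constant c, not just "reasonable" ones.
for c in (-3.0, 0.0, 0.5, 1.0, 7.5):
    for k in range(1, 100):
        theta = k / 100
        assert abs(expectation(c, theta) - theta) < 1e-12

print("unbiasedness check passed")
```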

04

(c) Finding the value of \({c_0}\)

For every value of c,

\(\begin{aligned}Va{r_{{\theta _0}}}\left( {{\delta _c}} \right) &= {E_{{\theta _0}}}\left( {\delta _c^2} \right) - {\left( {{E_{{\theta _0}}}\left( {{\delta _c}} \right)} \right)^2}\\ &= {E_{{\theta _0}}}\left( {\delta _c^2} \right) - \theta _0^2\end{aligned}\)

Since the term \(\theta _0^2\) does not depend on c, the value of c for which \(Va{r_{{\theta _0}}}\left( {{\delta _c}} \right)\) is a minimum is the value of c for which \({E_{{\theta _0}}}\left( {\delta _c^2} \right)\) is a minimum.

Now,

\(\begin{aligned}{E_{{\theta _0}}}\left( {\delta _c^2} \right) &= \left( {{1^2} \times \theta _0^3} \right) + {\left( {2 - 2c} \right)^2}\theta _0^2\left( {1 - {\theta _0}} \right) + {c^2} \cdot 2{\theta _0}\left( {1 - {\theta _0}} \right) + {\left( {1 - 2c} \right)^2}{\theta _0}{\left( {1 - {\theta _0}} \right)^2} + 0\\ &= 2{c^2}\left[ {2\theta _0^2\left( {1 - {\theta _0}} \right) + {\theta _0}\left( {1 - {\theta _0}} \right) + 2{\theta _0}{{\left( {1 - {\theta _0}} \right)}^2}} \right] - 4c\left[ {2\theta _0^2\left( {1 - {\theta _0}} \right) + {\theta _0}{{\left( {1 - {\theta _0}} \right)}^2}} \right] + {\rm{terms\;not\;involving\;}}c\end{aligned}\)

After further simplification of the coefficients of \({c^2}\) and c, we obtain the relation

\({E_{{\theta _0}}}\left( {\delta _c^2} \right) = 6{\theta _0}\left( {1 - {\theta _0}} \right){c^2} - 4{\theta _0}\left( {1 - \theta _0^2} \right)c + {\rm{terms\;not\;involving\;}}c\)

Since the coefficient \(6{\theta _0}\left( {1 - {\theta _0}} \right)\) is positive for \(0 < {\theta _0} < 1\), this quadratic in c is minimized where its derivative vanishes. Differentiating with respect to c and setting the derivative equal to 0 gives \(12{\theta _0}\left( {1 - {\theta _0}} \right)c = 4{\theta _0}\left( {1 - \theta _0^2} \right)\), so the value of c for which \({E_{{\theta _0}}}\left( {\delta _c^2} \right)\) is a minimum is \({c_0} = \frac{{1 - \theta _0^2}}{{3\left( {1 - {\theta _0}} \right)}} = \frac{{1 + {\theta _0}}}{3}\).

Therefore, the value of \({c_0}\) is \(\frac{{1 + {\theta _0}}}{3}\).
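Finally, the minimizing value found in part (c) can be checked numerically. The sketch below (not part of the textbook solution; the helper names are ours) verifies that at \(\theta = {\theta _0}\), perturbing c away from \({c_0} = \frac{{1 + {\theta _0}}}{3}\) in either direction strictly increases the variance:

```python
# Numerical check of part (c): c0 = (1 + theta0) / 3 minimizes
# Var_{theta0}(delta_c) over c. (Sketch only; helper names are ours.)

def pmf(theta):
    """Return [f(1|theta), ..., f(5|theta)] as given in the problem."""
    return [
        theta ** 3,
        theta ** 2 * (1 - theta),
        2 * theta * (1 - theta),
        theta * (1 - theta) ** 2,
        (1 - theta) ** 3,
    ]

def delta(x, c):
    """The estimator delta_c evaluated at x = 1, ..., 5."""
    return {1: 1.0, 2: 2 - 2 * c, 3: c, 4: 1 - 2 * c, 5: 0.0}[x]

def variance(c, theta):
    """Var_theta(delta_c) = E[delta_c^2] - (E[delta_c])^2."""
    ps = pmf(theta)
    e1 = sum(delta(x, c) * p for x, p in enumerate(ps, start=1))
    e2 = sum(delta(x, c) ** 2 * p for x, p in enumerate(ps, start=1))
    return e2 - e1 ** 2

for theta0 in (0.1, 0.3, 0.5, 0.7, 0.9):
    c0 = (1 + theta0) / 3
    v0 = variance(c0, theta0)
    # Any perturbation away from c0 should strictly increase the variance.
    for dc in (-0.5, -0.05, 0.05, 0.5):
        assert variance(c0 + dc, theta0) > v0

print("variance-minimization check passed")
```

Because the estimator is unbiased for every c, only the \({E_{{\theta _0}}}\left( {\delta _c^2} \right)\) term varies with c, which is exactly what the quadratic argument above exploits.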


