
Let \(X_{1}, X_{2}, \ldots, X_{n}, n>2\), be a random sample from the binomial distribution \(b(1, \theta)\). (a) Show that \(Y_{1}=X_{1}+X_{2}+\cdots+X_{n}\) is a complete sufficient statistic for \(\theta\). (b) Find the function \(\varphi\left(Y_{1}\right)\) that is the MVUE of \(\theta\). (c) Let \(Y_{2}=\left(X_{1}+X_{2}\right) / 2\) and compute \(E\left(Y_{2}\right)\). (d) Determine \(E\left(Y_{2} \mid Y_{1}=y_{1}\right)\).

Short Answer

\((a)\) \(Y_{1}\) is a complete sufficient statistic for \(\theta\). \((b)\) The MVUE of \(\theta\) is \(Y_{1}/n\). \((c)\) \(E\left(Y_{2}\right) = \theta\). \((d)\) \(E\left(Y_{2} \mid Y_{1}=y_{1}\right) = y_{1}/n\).

Step by step solution

01

Show that \(Y_{1}\) is a complete sufficient statistic for \(\theta\)

Since \(X_{1}, X_{2}, \ldots, X_{n}\) are iid \(b(1, \theta)\), the joint pmf is \(\theta^{\sum x_{i}}(1-\theta)^{n-\sum x_{i}}\), which depends on the data only through \(y_{1}=\sum x_{i}\); by the factorization theorem, \(Y_{1}=X_{1}+X_{2}+\cdots+X_{n}\) is sufficient for \(\theta\). Moreover, \(Y_{1}\) has the binomial distribution \(b(n, \theta)\). To show completeness, suppose \(E[g(Y_{1})]=\sum_{y=0}^{n} g(y)\binom{n}{y} \theta^{y}(1-\theta)^{n-y}=0\) for all \(0<\theta<1\). Dividing by \((1-\theta)^{n}\) turns the left side into a polynomial in \(\rho=\theta/(1-\theta)\) that vanishes for all \(\rho>0\), so every coefficient \(g(y)\binom{n}{y}\) must equal zero; hence \(g(y)=0\) for \(y=0,1, \ldots, n\). Therefore \(Y_{1}\) is a complete sufficient statistic for \(\theta\).
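The coefficient argument can be checked symbolically. The sketch below (not part of the textbook solution) uses sympy with a hypothetical small sample size \(n=4\): it writes \(E[g(Y_{1})]\) as a polynomial in \(\theta\) with unknowns \(g(0),\ldots,g(n)\), sets every coefficient to zero, and confirms that the only solution is \(g \equiv 0\).

import sympy as sp

# Completeness check for Y1 ~ b(n, theta) with an illustrative n = 4.
# If E[g(Y1)] = sum_y g(y) * C(n, y) * theta^y * (1 - theta)^(n - y) is zero
# for all theta, every polynomial coefficient must vanish, forcing g == 0.
n = 4
theta = sp.symbols("theta")
g = sp.symbols("g0:%d" % (n + 1))          # unknown values g(0), ..., g(n)
expectation = sum(
    g[y] * sp.binomial(n, y) * theta**y * (1 - theta)**(n - y)
    for y in range(n + 1)
)
coeffs = sp.Poly(sp.expand(expectation), theta).all_coeffs()
print(sp.solve(coeffs, g, dict=True))      # -> [{g0: 0, g1: 0, g2: 0, g3: 0, g4: 0}]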
02

Find the MVUE of \(\theta\)

The minimum variance unbiased estimator (MVUE) is the unbiased estimator of \(\theta\) with the smallest variance among all unbiased estimators. The estimator \(Y_{1}/n\) is unbiased, since \(E(Y_{1}/n)=n\theta/n=\theta\), and it is a function of the complete sufficient statistic \(Y_{1}\). Hence, by the Lehmann-Scheffé theorem, \(Y_{1}/n\) is the MVUE of \(\theta\). Thus \(\varphi\left(Y_{1}\right) = Y_{1}/n\).
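A Monte Carlo sketch of this claim (illustrative values \(\theta=0.3\), \(n=10\), and a fixed seed are assumptions, not part of the problem): \(Y_{1}/n\) should be unbiased with variance close to \(\theta(1-\theta)/n\), smaller than that of the cruder unbiased estimator \(Y_{2}\) from part (c).

import numpy as np

# Compare the MVUE Y1/n with the unbiased estimator Y2 = (X1 + X2)/2.
rng = np.random.default_rng(0)
theta, n, reps = 0.3, 10, 200_000
x = rng.binomial(1, theta, size=(reps, n))
mvue = x.sum(axis=1) / n           # Y1 / n
y2 = x[:, :2].sum(axis=1) / 2      # (X1 + X2) / 2
print("mean of Y1/n:", mvue.mean(), "vs theta =", theta)
print("var  of Y1/n:", mvue.var(), "vs theta(1-theta)/n =", theta * (1 - theta) / n)
print("var  of Y2  :", y2.var(), "vs theta(1-theta)/2 =", theta * (1 - theta) / 2)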
03

Compute \(E\left(Y_{2}\right)\)

Recall that \(Y_{2}=\left(X_{1}+X_{2}\right) / 2\). By linearity of expectation, \(E(Y_{2})=E\left[\tfrac{1}{2}(X_{1}+X_{2})\right] = \tfrac{1}{2}\left(E(X_{1})+E(X_{2})\right)\). Since each \(X_{i}\) is \(b(1, \theta)\), \(E(X_{i})=\theta\). Thus \(E(Y_{2})=(\theta + \theta)/2 = \theta\), so \(Y_{2}\) is also an unbiased estimator of \(\theta\).
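The same fact can be checked numerically. This is only a sketch; the value \(\theta=0.3\) and the replicate count are arbitrary choices for illustration.

import numpy as np

# Verify E(Y2) = theta by simulation.
rng = np.random.default_rng(1)
theta, reps = 0.3, 200_000
x1 = rng.binomial(1, theta, size=reps)
x2 = rng.binomial(1, theta, size=reps)
y2 = (x1 + x2) / 2
print("sample mean of Y2:", y2.mean(), "vs theta =", theta)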
04

Determine \(E\left(Y_{2} \mid Y_{1}=y_{1}\right)\)

Note that \(Y_{1}\) and \(Y_{2}\) are dependent, since \(Y_{1}\) contains \(X_{1}+X_{2}\). Given \(Y_{1}=y_{1}\), the \(X_{i}\) are exchangeable, so \(E\left(X_{1} \mid Y_{1}=y_{1}\right)=E\left(X_{2} \mid Y_{1}=y_{1}\right)=\cdots=E\left(X_{n} \mid Y_{1}=y_{1}\right)\); summing these \(n\) equal terms gives \(E\left(Y_{1} \mid Y_{1}=y_{1}\right)=y_{1}\), so each term equals \(y_{1}/n\). Hence \(E\left(Y_{2} \mid Y_{1}=y_{1}\right)=\tfrac{1}{2}\left(y_{1}/n + y_{1}/n\right)=y_{1}/n\). This is exactly the Rao-Blackwellization of the unbiased estimator \(Y_{2}\): conditioning on the complete sufficient statistic \(Y_{1}\) recovers the MVUE \(\varphi\left(Y_{1}\right)=Y_{1}/n\) from part (b).
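A simulation sketch of the conditional expectation (the values \(\theta=0.3\), \(n=10\), and \(y_{1}=4\) are assumptions made for illustration): averaging \(Y_{2}\) only over samples with \(Y_{1}=y_{1}\) should give approximately \(y_{1}/n\), not \(\theta\).

import numpy as np

# Estimate E(Y2 | Y1 = y1) by keeping only the samples whose total equals y1.
rng = np.random.default_rng(2)
theta, n, reps, y1 = 0.3, 10, 500_000, 4
x = rng.binomial(1, theta, size=(reps, n))
mask = x.sum(axis=1) == y1
y2 = x[mask, :2].sum(axis=1) / 2
print("conditional mean of Y2 given Y1 = 4:", y2.mean(), "vs y1/n =", y1 / n)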


Most popular questions from this chapter

Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a normal distribution with mean zero and variance \(\theta, 0<\theta<\infty\). Show that \(\sum_{1}^{n} X_{i}^{2} / n\) is an unbiased estimator of \(\theta\) and has variance \(2 \theta^{2} / n\).

Let \(\bar{X}\) denote the mean of the random sample \(X_{1}, X_{2}, \ldots, X_{n}\) from a gamma-type distribution with parameters \(\alpha>0\) and \(\beta=\theta>0\). Compute \(E\left[X_{1} \mid \bar{x}\right]\). Hint: Can you find directly a function \(\psi(\bar{X})\) of \(\bar{X}\) such that \(E[\psi(\bar{X})]=\theta\)? Is \(E\left(X_{1} \mid \bar{x}\right)=\psi(\bar{x})\)? Why?

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from each of the following distributions involving the parameter \(\theta .\) In each case find the mle of \(\theta\) and show that it is a sufficient statistic for \(\theta\) and hence a minimal sufficient statistic. (a) \(b(1, \theta)\), where \(0 \leq \theta \leq 1\). (b) Poisson with mean \(\theta>0\). (c) Gamma with \(\alpha=3\) and \(\beta=\theta>0\). (d) \(N(\theta, 1)\), where \(-\infty<\theta<\infty\). (e) \(N(0, \theta)\), where \(0<\theta<\infty\).

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample of size \(n\) from a geometric distribution that has \(\operatorname{pmf} f(x ; \theta)=(1-\theta)^{x} \theta, x=0,1,2, \ldots, 0<\theta<1\), zero elsewhere. Show that \(\sum_{1}^{n} X_{i}\) is a sufficient statistic for \(\theta\).

Let \(X_{1}, \ldots, X_{n}\) be iid with pdf \(f(x ; \theta)=1 /(3 \theta)\), \(-\theta<x<2\theta\), zero elsewhere, where \(\theta>0\). (a) Find the mle \(\hat{\theta}\) of \(\theta\). (b) Is \(\hat{\theta}\) a sufficient statistic for \(\theta\)? Why? (c) Is \((n+1) \hat{\theta} / n\) the unique MVUE of \(\theta\)? Why?
