
Let \(X\) have the pmf \(p(x ; \theta)=\frac{1}{2}\binom{n}{|x|} \theta^{|x|}(1-\theta)^{n-|x|}\), for \(x=\pm 1, \pm 2, \ldots, \pm n\), \(p(0 ; \theta)=(1-\theta)^{n}\), and zero elsewhere, where \(0<\theta<1\). (a) Show that this family \(\{p(x ; \theta): 0<\theta<1\}\) is not complete. (b) Let \(Y=|X|\). Show that \(Y\) is a complete and sufficient statistic for \(\theta\).
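To make the setup concrete, here is a minimal Python sketch of the pmf above and a sanity check that it sums to 1 over its support. The function name `pmf` and the sample values of \(n\) and \(\theta\) are illustrative choices, not part of the problem.

```python
from math import comb

def pmf(x, theta, n):
    """p(x; theta) = (1/2) C(n, |x|) theta^|x| (1 - theta)^(n - |x|)
    for x = ±1, ..., ±n, with p(0; theta) = (1 - theta)^n."""
    if x == 0:
        return (1 - theta) ** n
    if 1 <= abs(x) <= n:
        return 0.5 * comb(n, abs(x)) * theta ** abs(x) * (1 - theta) ** (n - abs(x))
    return 0.0

n, theta = 5, 0.3
total = sum(pmf(x, theta, n) for x in range(-n, n + 1))
print(total)  # ≈ 1.0: the point masses at ±y pair up into binomial probabilities
```

The two points \(\pm y\) each carry half a binomial\((n,\theta)\) probability, which is why the total mass is exactly 1.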

Short Answer

The family is not complete: the function \(g(x) = x\) has expectation zero for every \(\theta\) (by symmetry of the pmf) yet is not identically zero. The statistic \(Y = |X|\) is sufficient by the Neyman factorization theorem and complete because \(Y\) has a binomial\((n, \theta)\) distribution, for which the only unbiased estimator of zero is the zero function.

Step by step solution

01

Showing the family is not complete

To show a family is not complete, it suffices to exhibit a nonzero function of the observation whose expectation is zero for every \(\theta\). Consider \(g(x) = x\). Because the pmf is symmetric, \(p(x;\theta) = p(-x;\theta)\) for all \(x\), the terms of the sum cancel in pairs: \[E_{\theta}[X] = \sum_{x=-n}^{n} x\,p(x;\theta) = 0 \quad \text{for every } \theta \in (0,1),\] yet \(g(x) = x\) is not zero with probability one. Hence the family \(\{p(x;\theta) : 0 < \theta < 1\}\) is not complete.
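The symmetry argument can be checked numerically. The sketch below (the helper `pmf` and the sample parameter values are my own naming) verifies that \(E_{\theta}[X] = 0\) for several values of \(\theta\):

```python
from math import comb

def pmf(x, theta, n):
    # p(x; theta) from the problem statement
    if x == 0:
        return (1 - theta) ** n
    return 0.5 * comb(n, abs(x)) * theta ** abs(x) * (1 - theta) ** (n - abs(x))

n = 4
for theta in (0.1, 0.5, 0.9):
    mean = sum(x * pmf(x, theta, n) for x in range(-n, n + 1))
    # terms at x and -x cancel, so E[X] = 0 for every theta
    assert abs(mean) < 1e-12
```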
02

Definition of \(Y\)

We start the second part of the problem by defining a new statistic \(Y\) as the absolute value of \(X\). Now our task is to prove that \(Y\) is a sufficient and complete statistic for \(\theta\).
03

Demonstrating Sufficiency

According to the Neyman factorization theorem, a statistic \(T(X)\) is sufficient for \(\theta\) if the joint pmf/pdf of \(X\) factors as \(p(x;\theta) = g(T(x); \theta)\,h(x)\), where \(h(x)\) does not depend on \(\theta\). Here, writing \(y = |x|\), \[p(x;\theta) = \underbrace{\frac{1}{2}\binom{n}{y}\theta^{y}(1-\theta)^{n-y}}_{g(y;\,\theta)}\,h(x),\] with \(h(x) = 1\) for \(x \neq 0\) and \(h(0) = 2\) (so that \(p(0;\theta) = (1-\theta)^{n}\) is recovered). Since \(h\) is free of \(\theta\) and \(g\) depends on \(x\) only through \(y\), \(Y = |X|\) is a sufficient statistic for \(\theta\).
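The factorization can be verified point by point. In this sketch (function names `g`, `h`, and `pmf` are illustrative), \(h\) is free of \(\theta\) and \(g\) depends on \(x\) only through \(y = |x|\); the product reproduces the pmf on the whole support:

```python
from math import comb

def g(y, theta, n):
    # g(T(x); theta): depends on x only through y = |x|
    return 0.5 * comb(n, y) * theta ** y * (1 - theta) ** (n - y)

def h(x):
    # h(x): free of theta; h(0) = 2 absorbs the factor 1/2 at x = 0
    return 2.0 if x == 0 else 1.0

def pmf(x, theta, n):
    if x == 0:
        return (1 - theta) ** n
    return 0.5 * comb(n, abs(x)) * theta ** abs(x) * (1 - theta) ** (n - abs(x))

n, theta = 5, 0.4
assert all(abs(pmf(x, theta, n) - g(abs(x), theta, n) * h(x)) < 1e-15
           for x in range(-n, n + 1))
```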
04

Proving Completeness

A statistic is complete for a parameter if the only function of it that is an unbiased estimator of 0 for every \(\theta\) is the zero function (almost surely). First note that \(Y = |X|\) has pmf \(P_{\theta}(Y = y) = \binom{n}{y}\theta^{y}(1-\theta)^{n-y}\) for \(y = 0, 1, \ldots, n\): for \(y \geq 1\) the two points \(\pm y\) each contribute \(\frac{1}{2}\binom{n}{y}\theta^{y}(1-\theta)^{n-y}\), so \(Y \sim \text{binomial}(n, \theta)\). Suppose \(E_{\theta}[g(Y)] = 0\) for all \(\theta \in (0,1)\). Then \[\sum_{y=0}^{n} g(y)\binom{n}{y}\theta^{y}(1-\theta)^{n-y} = (1-\theta)^{n}\sum_{y=0}^{n} g(y)\binom{n}{y}\rho^{y} = 0, \qquad \rho = \frac{\theta}{1-\theta},\] and a polynomial in \(\rho\) that vanishes for all \(\rho > 0\) must have every coefficient equal to zero, so \(g(y) = 0\) for \(y = 0, 1, \ldots, n\). Therefore \(Y = |X|\) is a complete statistic.
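The polynomial argument has a finite-dimensional analogue that can be checked numerically: the conditions \(E_{\theta}[g(Y)] = 0\) at \(n+1\) distinct values of \(\theta\) form a linear system whose matrix of binomial probabilities is nonsingular, forcing \(g \equiv 0\). A sketch (the choice of \(n\) and of the \(\theta\) grid is arbitrary):

```python
import numpy as np
from math import comb

n = 4
thetas = np.linspace(0.1, 0.9, n + 1)  # n + 1 distinct theta values
# A[i, y] = P_{theta_i}(Y = y) for Y ~ binomial(n, theta_i)
A = np.array([[comb(n, y) * t ** y * (1 - t) ** (n - y) for y in range(n + 1)]
              for t in thetas])
# E_theta[g(Y)] = 0 at these thetas is the system A g = 0; a nonsingular A
# admits only g = 0, mirroring the vanishing-polynomial argument.
assert abs(np.linalg.det(A)) > 1e-9
g = np.linalg.solve(A, np.zeros(n + 1))
assert np.allclose(g, 0.0)
```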


Most popular questions from this chapter

Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a distribution that is \(N(0, \theta)\). Then \(Y=\sum X_{i}^{2}\) is a complete sufficient statistic for \(\theta\). Find the MVUE of \(\theta^{2}\).

If \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from a distribution that has a pdf which is a regular case of the exponential class, show that the pdf of \(Y_{1}=\sum_{1}^{n} K\left(X_{i}\right)\) is of the form \(f_{Y_{1}}\left(y_{1} ; \theta\right)=R\left(y_{1}\right) \exp \left[p(\theta) y_{1}+n q(\theta)\right]\). Hint: Let \(Y_{2}=X_{2}, \ldots, Y_{n}=X_{n}\) be \(n-1\) auxiliary random variables. Find the joint pdf of \(Y_{1}, Y_{2}, \ldots, Y_{n}\) and then the marginal pdf of \(Y_{1}\).

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a distribution with pmf \(p(x ; \theta)=\theta^{x}(1-\theta), x=0,1,2, \ldots\), zero elsewhere, where \(0 \leq \theta \leq 1\). (a) Find the mle, \(\hat{\theta}\), of \(\theta\). (b) Show that \(\sum_{1}^{n} X_{i}\) is a complete sufficient statistic for \(\theta\). (c) Determine the MVUE of \(\theta\).

Prove that the sum of the observations of a random sample of size \(n\) from a Poisson distribution having parameter \(\theta, 0<\theta<\infty\), is a sufficient statistic for \(\theta\).

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from the uniform distribution with pdf \(f\left(x ; \theta_{1}, \theta_{2}\right)=1 /\left(2 \theta_{2}\right)\), \(\theta_{1}-\theta_{2}<x<\theta_{1}+\theta_{2}\), where \(-\infty<\theta_{1}<\infty\) and \(\theta_{2}>0\), and the pdf is equal to zero elsewhere. (a) Show that \(Y_{1}=\min \left(X_{i}\right)\) and \(Y_{n}=\max \left(X_{i}\right)\), the joint sufficient statistics for \(\theta_{1}\) and \(\theta_{2}\), are complete. (b) Find the MVUEs of \(\theta_{1}\) and \(\theta_{2}\).
