
Problem 3

Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample of size \(n\) from a distribution with pdf \(f(x ; \theta)=\theta x^{\theta-1}, 0<x<1\), zero elsewhere, where \(\theta>0\). (a) Show that the geometric mean \(\left(X_{1} X_{2} \cdots X_{n}\right)^{1 / n}\) of the sample is a complete sufficient statistic for \(\theta\). (b) Find the maximum likelihood estimator of \(\theta\), and observe that it is a function of this geometric mean.
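Part (b)'s claim can be checked numerically. A minimal sketch (function and variable names are illustrative): sample via the inverse CDF \(F(x)=x^{\theta}\), then compare the MLE \(\hat\theta=-n/\sum\ln x_i\) against the same quantity expressed through the geometric mean, \(\hat\theta=-1/\ln\left[(x_1\cdots x_n)^{1/n}\right]\).

```python
import math
import random

def sample(theta, n, rng):
    """Draw n values from f(x; theta) = theta * x**(theta - 1) on (0, 1).

    The CDF is F(x) = x**theta, so inverse-CDF sampling gives
    X = U**(1/theta) for U ~ Uniform(0, 1).
    """
    return [rng.random() ** (1.0 / theta) for _ in range(n)]

def mle(xs):
    """Solving d/dtheta [n log(theta) + (theta - 1) * sum(log x_i)] = 0
    gives theta_hat = -n / sum(log x_i)."""
    return -len(xs) / sum(math.log(x) for x in xs)

rng = random.Random(42)
xs = sample(2.0, 100_000, rng)
theta_hat = mle(xs)

# theta_hat is a function of the geometric mean: theta_hat = -1 / log(geo_mean)
geo_mean = math.exp(sum(math.log(x) for x in xs) / len(xs))
```

With \(n=10^5\) draws the estimate should sit very close to the true \(\theta = 2\).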

Problem 3

If \(X_{1}, X_{2}\) is a random sample of size 2 from a distribution having pdf \(f(x ; \theta)=(1 / \theta) e^{-x / \theta}, 0<x<\infty\), \(0<\theta<\infty\), zero elsewhere, find the joint pdf of the sufficient statistic \(Y_{1}=X_{1}+X_{2}\) for \(\theta\) and \(Y_{2}=X_{2}\). Show that the conditional pdf of \(Y_{2}\), given \(Y_{1}=y_{1}\), is free of \(\theta\).
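For orientation: with \(Y_1=X_1+X_2\), \(Y_2=X_2\), the change of variables \(x_1=y_1-y_2\), \(x_2=y_2\) (Jacobian 1) gives joint pdf \((1/\theta^2)e^{-y_1/\theta}\) on \(0<y_2<y_1\), so conditionally on \(Y_1=y_1\) the statistic \(Y_2\) is uniform on \((0,y_1)\). A small simulation sketch (names are mine) checks the consequence that \(Y_2/Y_1\) looks Uniform\((0,1)\) regardless of \(\theta\):

```python
import random

def ratio_sample(theta, n, rng):
    """Simulate (X1, X2) iid exponential with mean theta and
    return the ratios Y2/Y1 = X2 / (X1 + X2)."""
    out = []
    for _ in range(n):
        x1 = rng.expovariate(1.0 / theta)  # expovariate takes the rate 1/theta
        x2 = rng.expovariate(1.0 / theta)
        out.append(x2 / (x1 + x2))
    return out

rng = random.Random(0)
# If Y2 | Y1 = y1 is Uniform(0, y1), then Y2/Y1 ~ Uniform(0, 1) for every theta,
# so both sample means below should be near 1/2.
mean_a = sum(ratio_sample(1.0, 50_000, rng)) / 50_000
mean_b = sum(ratio_sample(5.0, 50_000, rng)) / 50_000
```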

Problem 4

Let the pdf \(f\left(x ; \theta_{1}, \theta_{2}\right)\) be of the form $$ \exp \left[p_{1}\left(\theta_{1}, \theta_{2}\right) K_{1}(x)+p_{2}\left(\theta_{1}, \theta_{2}\right) K_{2}(x)+H(x)+q_{1}\left(\theta_{1}, \theta_{2}\right)\right], \quad a<x<b, $$ zero elsewhere. Suppose that \(K_{1}^{\prime}(x)=c K_{2}^{\prime}(x)\). Show that \(f\left(x ; \theta_{1}, \theta_{2}\right)\) can then be written in the form $$ \exp \left[p\left(\theta_{1}, \theta_{2}\right) K_{2}(x)+H(x)+q\left(\theta_{1}, \theta_{2}\right)\right], \quad a<x<b, $$ zero elsewhere. This is the reason why it is required that no one of the \(K_{j}\) be a linear function of the others.
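A sketch of the collapse: \(K_{1}^{\prime}=c K_{2}^{\prime}\) integrates to \(K_{1}(x)=c K_{2}(x)+d\) for constants \(c, d\), and then

$$ p_{1} K_{1}(x)+p_{2} K_{2}(x)+H(x)+q_{1}=\left(c\, p_{1}+p_{2}\right) K_{2}(x)+H(x)+\left(q_{1}+d\, p_{1}\right). $$

Since \(q_{1}+d\, p_{1}\) depends only on \(\left(\theta_{1}, \theta_{2}\right)\), it can be absorbed into a single \(q\left(\theta_{1}, \theta_{2}\right)\), and the family degenerates to a one-parameter exponential family in \(K_{2}\) alone.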

Problem 4

Let \(X\) and \(Y\) be random variables such that \(E\left(X^{k}\right)\) and \(E\left(Y^{k}\right) \neq 0\) exist for \(k=1,2,3, \ldots\) If the ratio \(X / Y\) and its denominator \(Y\) are independent, prove that \(E\left[(X / Y)^{k}\right]=E\left(X^{k}\right) / E\left(Y^{k}\right), k=1,2,3, \ldots\) Hint: \(\quad\) Write \(E\left(X^{k}\right)=E\left[Y^{k}(X / Y)^{k}\right]\).
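Written out, the hint nearly finishes the proof: independence of \(X/Y\) and \(Y\) lets the expectation factor, and \(E\left(Y^{k}\right) \neq 0\) permits the division:

$$ E\left(X^{k}\right)=E\left[Y^{k}\left(\frac{X}{Y}\right)^{k}\right]=E\left(Y^{k}\right) E\left[\left(\frac{X}{Y}\right)^{k}\right] \quad \Longrightarrow \quad E\left[\left(\frac{X}{Y}\right)^{k}\right]=\frac{E\left(X^{k}\right)}{E\left(Y^{k}\right)}. $$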

Problem 4

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample of size \(n\) from a geometric distribution that has \(\operatorname{pmf} f(x ; \theta)=(1-\theta)^{x} \theta, x=0,1,2, \ldots, 0<\theta<1\), zero elsewhere. Show that \(\sum_{1}^{n} X_{i}\) is a sufficient statistic for \(\theta\).
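A small numerical illustration of the factorization behind this (helper names are mine): the likelihood \(\theta^{n}(1-\theta)^{\sum x_{i}}\) depends on the data only through \(n\) and \(\sum x_{i}\), so any two samples of the same size with the same sum have identical likelihoods at every \(\theta\).

```python
import math

def geom_likelihood(theta, xs):
    """Likelihood for the pmf f(x; theta) = (1 - theta)**x * theta, x = 0, 1, 2, ...

    Multiplying over the sample gives theta**n * (1 - theta)**sum(xs):
    the data enter only through n and sum(xs).
    """
    like = 1.0
    for x in xs:
        like *= (1.0 - theta) ** x * theta
    return like

# Two samples of the same size with the same sum ...
a = [0, 1, 2, 3, 4]   # sum 10
b = [2, 2, 2, 2, 2]   # sum 10

# ... have identical likelihood at every theta, which is the Neyman
# factorization at work: L(theta; x) = g(sum x_i; theta) * 1.
vals = [(geom_likelihood(t, a), geom_likelihood(t, b)) for t in (0.1, 0.3, 0.7)]
```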

Problem 4

Consider the family of probability density functions \(\{h(z ; \theta): \theta \in \Omega\}\), where \(h(z ; \theta)=1 / \theta, 0<z<\theta\), zero elsewhere. (a) Show that the family is complete provided that \(\Omega=\{\theta: 0<\theta<\infty\}\). Hint: For convenience, assume that \(u(z)\) is continuous, and note that the derivative of \(E[u(Z)]\) with respect to \(\theta\) is equal to zero also. (b) Show that this family is not complete if \(\Omega=\{\theta: \theta>1\}\).
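When the parameter space is restricted to \(\theta>1\), completeness fails: a function that oscillates on \((0,1)\) with zero integral and vanishes beyond 1 (my choice of counterexample) gives \(E_{\theta}[u(Z)]=(1 / \theta) \int_{0}^{\theta} u(z)\, d z=0\) for every \(\theta \geq 1\), yet \(u\) is not the zero function. A numerical sketch:

```python
import math

def u(z):
    """A nonzero function whose integral over (0, 1) is zero, with u = 0 beyond 1."""
    return math.sin(2.0 * math.pi * z) if z < 1.0 else 0.0

def expected_u(theta, steps=200_000):
    """E_theta[u(Z)] = (1/theta) * integral of u over (0, theta), midpoint rule."""
    h = theta / steps
    total = sum(u((i + 0.5) * h) for i in range(steps))
    return total * h / theta

# E_theta[u(Z)] vanishes for every theta > 1, even though u is not zero.
vals = [expected_u(t) for t in (1.5, 2.0, 3.0)]
```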

Problem 4

Let \(f(x, y)=\left(2 / \theta^{2}\right) e^{-(x+y) / \theta}, 0<x<y<\infty\), zero elsewhere, be the joint pdf of the random variables \(X\) and \(Y\). (a) Show that the mean and the variance of \(Y\) are, respectively, \(3 \theta / 2\) and \(5 \theta^{2} / 4\). (b) Show that \(E(Y \mid x)=x+\theta\).
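Direct integration of the stated density gives the marginal \(f(x)=(2 / \theta) e^{-2 x / \theta}\) and the conditional pdf \((1 / \theta) e^{-(y-x) / \theta}\) for \(y>x\), which yield \(E(Y)=3 \theta / 2\) and \(\operatorname{Var}(Y)=5 \theta^{2} / 4\). A simulation sketch (names are mine) built on that two-stage decomposition:

```python
import random

def sample_y(theta, n, rng):
    """Sample Y from f(x, y) = (2/theta**2) * exp(-(x + y)/theta), 0 < x < y.

    Marginally X is exponential with rate 2/theta; conditionally on X = x,
    Y - x is exponential with rate 1/theta.
    """
    ys = []
    for _ in range(n):
        x = rng.expovariate(2.0 / theta)
        ys.append(x + rng.expovariate(1.0 / theta))
    return ys

rng = random.Random(1)
theta = 2.0
ys = sample_y(theta, 200_000, rng)
mean_y = sum(ys) / len(ys)
var_y = sum((y - mean_y) ** 2 for y in ys) / len(ys)
# Direct integration gives E[Y] = 3*theta/2 = 3 and Var[Y] = 5*theta**2/4 = 5.
```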

Problem 4

Let \(\bar{X}\) denote the mean of the random sample \(X_{1}, X_{2}, \ldots, X_{n}\) from a gamma-type distribution with parameters \(\alpha>0\) and \(\beta=\theta>0\). Compute \(E\left[X_{1} \mid \bar{x}\right]\). Hint: Can you find directly a function \(\psi(\bar{X})\) of \(\bar{X}\) such that \(E[\psi(\bar{X})]=\theta\)? Is \(E\left(X_{1} \mid \bar{x}\right)=\psi(\bar{x})\)? Why?
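A sketch of where the hint points, assuming the \(\operatorname{gamma}(\alpha, \beta=\theta)\) parametrization in which \(E\left(X_{1}\right)=\alpha \theta\), and using the symmetry (exchangeability) of the \(X_{i}\) given \(\bar{x}\):

$$ E(\bar{X})=\alpha \theta \;\Longrightarrow\; E\!\left[\frac{\bar{X}}{\alpha}\right]=\theta, \qquad n\, E\left(X_{1} \mid \bar{x}\right)=\sum_{i=1}^{n} E\left(X_{i} \mid \bar{x}\right)=E\!\left(\sum_{i=1}^{n} X_{i} \,\middle|\, \bar{x}\right)=n \bar{x}. $$

So \(\psi(\bar{X})=\bar{X} / \alpha\) is unbiased for \(\theta\), while the conditional expectation works out to \(\bar{x}\) itself; comparing the two is what the "Why?" asks about.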

Problem 4

Let \(Y_{1}\) and \(Y_{2}\) be two independent unbiased estimators of \(\theta\). Assume that the variance of \(Y_{1}\) is twice the variance of \(Y_{2}\). Find the constants \(k_{1}\) and \(k_{2}\) so that \(k_{1} Y_{1}+k_{2} Y_{2}\) is an unbiased estimator with the smallest possible variance for such a linear combination.
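Writing \(\operatorname{Var}\left(Y_{2}\right)=v\) and \(\operatorname{Var}\left(Y_{1}\right)=2 v\), unbiasedness forces \(k_{1}+k_{2}=1\), and the variance \(2 v k_{1}^{2}+v\left(1-k_{1}\right)^{2}\) is minimized at \(k_{1}=1 / 3\), \(k_{2}=2 / 3\). A quick grid check of that calculus (names are mine):

```python
def combo_var(k1, v=1.0):
    """Variance of k1*Y1 + (1 - k1)*Y2 with independent Y1, Y2,
    Var(Y1) = 2v, Var(Y2) = v; unbiasedness forces k2 = 1 - k1."""
    k2 = 1.0 - k1
    return k1 ** 2 * 2.0 * v + k2 ** 2 * v

# Grid search over k1 in [0, 1]; the minimizer should be near 1/3,
# with minimum variance 2v/3.
grid = [i / 10_000 for i in range(10_001)]
best_k1 = min(grid, key=combo_var)
```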

Problem 5

Show that the sum of the observations of a random sample of size \(n\) from a gamma distribution that has pdf \(f(x ; \theta)=(1 / \theta) e^{-x / \theta}, 0<x<\infty\), \(0<\theta<\infty\), zero elsewhere, is a sufficient statistic for \(\theta\).
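A sketch via the factorization theorem (the labels \(k_{1}, k_{2}\) for the two factors are mine):

$$ \prod_{i=1}^{n} \frac{1}{\theta} e^{-x_{i} / \theta}=\underbrace{\theta^{-n} \exp \left(-\frac{1}{\theta} \sum_{i=1}^{n} x_{i}\right)}_{k_{1}\left(\sum x_{i} ;\, \theta\right)} \cdot \underbrace{1}_{k_{2}\left(x_{1}, \ldots, x_{n}\right)}, $$

so on the positive orthant the joint pdf depends on \(\theta\) only through \(\sum x_{i}\), which makes the sum sufficient.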
