Chapter 7: Problem 10
Referring to Example \(7.9.5\) of this section, determine \(c\) so that
$$P\left(-c < \cdots < c\right) = \cdots$$
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a distribution with pdf \(f(x ; \theta)=\theta^{x}(1-\theta), x=0,1,2, \ldots\), zero elsewhere, where \(0 \leq \theta \leq 1\). (a) Find the mle, \(\hat{\theta}\), of \(\theta\). (b) Show that \(\sum_{1}^{n} X_{i}\) is a complete sufficient statistic for \(\theta\). (c) Determine the MVUE of \(\theta\).
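For part (a), the maximization of the log-likelihood can be sketched as follows (a derivation sketch, not the text's official solution):

```latex
% Likelihood for the sample x_1,\dots,x_n from f(x;\theta)=\theta^{x}(1-\theta):
L(\theta) = \prod_{i=1}^{n} \theta^{x_i}(1-\theta)
          = \theta^{\sum x_i}(1-\theta)^{n},
\qquad
\log L(\theta) = \Big(\sum_{i=1}^{n} x_i\Big)\log\theta + n\log(1-\theta).

% Setting the score equal to zero:
\frac{d}{d\theta}\log L(\theta)
  = \frac{\sum x_i}{\theta} - \frac{n}{1-\theta} = 0
\quad\Longrightarrow\quad
\hat{\theta} = \frac{\sum_{1}^{n} x_i}{\,n + \sum_{1}^{n} x_i\,}
             = \frac{\bar{x}}{1+\bar{x}}.
```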
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a Poisson distribution with parameter \(\theta>0\) (a) Find the MVUE of \(P(X \leq 1)=(1+\theta) e^{-\theta}\). Hint: \(\quad\) Let \(u\left(x_{1}\right)=1, x_{1} \leq 1\), zero elsewhere, and find \(E\left[u\left(X_{1}\right) \mid Y=y\right]\), where \(Y=\sum_{1}^{n} X_{i}\). (b) Express the MVUE as a function of the mle. (c) Determine the asymptotic distribution of the mle.
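Following the hint in part (a), the conditional distribution \(X_{1} \mid Y=y\) is binomial with parameters \(y\) and \(1/n\), which suggests the candidate MVUE \(g(Y)=\left(\frac{n-1}{n}\right)^{Y}+\frac{Y}{n}\left(\frac{n-1}{n}\right)^{Y-1}\). A minimal numerical check of its unbiasedness, assuming that conditional-binomial derivation (the values of `n` and `theta` below are arbitrary illustrative choices, not from the text):

```python
import math

# Arbitrary illustrative choices (hypothetical, not from the text)
n = 5        # sample size
theta = 1.3  # Poisson mean

# Candidate MVUE of P(X <= 1) as a function of y = x_1 + ... + x_n,
# obtained from X_1 | Y = y ~ Binomial(y, 1/n):
def g(y, n):
    t = (n - 1) / n
    return t ** y + y * (1.0 / n) * t ** (y - 1) if y > 0 else 1.0

# Y = sum of the sample ~ Poisson(n * theta); unbiasedness means
# E[g(Y)] = (1 + theta) * exp(-theta).  Sum the series directly,
# using log-space Poisson probabilities to avoid overflow.
lam = n * theta
expectation = sum(
    math.exp(-lam + y * math.log(lam) - math.lgamma(y + 1)) * g(y, n)
    for y in range(200)  # Poisson(6.5) tail beyond 200 is negligible
)
target = (1 + theta) * math.exp(-theta)
print(abs(expectation - target) < 1e-10)  # prints True
```

The agreement reflects the identity \(E\left[t^{Y}\right]=e^{n\theta(t-1)}\) for the Poisson probability generating function evaluated at \(t=(n-1)/n\).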
We consider a random sample \(X_{1}, X_{2}, \ldots, X_{n}\) from a distribution with pdf \(f(x ; \theta)=(1 / \theta) \exp (-x / \theta), 0<x<\infty\), zero elsewhere, where \(\theta>0\).
Let \(\left(X_{1}, Y_{1}\right),\left(X_{2}, Y_{2}\right), \ldots,\left(X_{n}, Y_{n}\right)\) denote a random sample of size \(n\) from a bivariate normal distribution with means \(\mu_{1}\) and \(\mu_{2}\), positive variances \(\sigma_{1}^{2}\) and \(\sigma_{2}^{2}\), and correlation coefficient \(\rho\). Show that \(\sum_{1}^{n} X_{i}, \sum_{1}^{n} Y_{i}, \sum_{1}^{n} X_{i}^{2}, \sum_{1}^{n} Y_{i}^{2}\), and \(\sum_{1}^{n} X_{i} Y_{i}\) are joint complete sufficient statistics for the five parameters. Are \(\bar{X}=\sum_{1}^{n} X_{i} / n\), \(\bar{Y}=\sum_{1}^{n} Y_{i} / n\), \(S_{1}^{2}=\sum_{1}^{n}\left(X_{i}-\bar{X}\right)^{2} /(n-1)\), \(S_{2}^{2}=\sum_{1}^{n}\left(Y_{i}-\bar{Y}\right)^{2} /(n-1)\), and \(\sum_{1}^{n}\left(X_{i}-\bar{X}\right)\left(Y_{i}-\bar{Y}\right) /\left[(n-1) S_{1} S_{2}\right]\) also joint complete sufficient statistics for these parameters?
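One way to see why exactly these five sums arise (a sketch, assuming the standard bivariate normal density): the exponent of the joint pdf of the sample is linear in these statistics, so the joint density has regular exponential-class form.

```latex
% Exponent of one bivariate normal density (normalizing constant omitted):
-\frac{1}{2(1-\rho^2)}\left[
\frac{(x-\mu_1)^2}{\sigma_1^2}
-\frac{2\rho\,(x-\mu_1)(y-\mu_2)}{\sigma_1\sigma_2}
+\frac{(y-\mu_2)^2}{\sigma_2^2}\right].

% Expanding the squares and summing over the sample, the joint exponent is
\theta_1 \sum_{1}^{n} x_i + \theta_2 \sum_{1}^{n} y_i
+ \theta_3 \sum_{1}^{n} x_i^2 + \theta_4 \sum_{1}^{n} y_i^2
+ \theta_5 \sum_{1}^{n} x_i y_i
+ n\,c(\mu_1,\mu_2,\sigma_1^2,\sigma_2^2,\rho),

% where each \theta_j is a function of the five parameters, e.g.
\theta_3 = -\frac{1}{2(1-\rho^2)\sigma_1^2},
\qquad
\theta_5 = \frac{\rho}{(1-\rho^2)\sigma_1\sigma_2}.
```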
Let \(X\) be a random variable with pdf of a regular case of the exponential class. Show that \(E[K(X)]=-q^{\prime}(\theta) / p^{\prime}(\theta)\), provided these derivatives exist, by differentiating both members of the equality $$\int_{a}^{b} \exp [p(\theta) K(x)+S(x)+q(\theta)] d x=1$$ with respect to \(\theta\). By a second differentiation, find the variance of \(K(X)\).
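The two differentiations can be sketched as follows, assuming differentiation under the integral sign is valid (as it is in the regular case):

```latex
% With f(x;\theta) = \exp[p(\theta)K(x) + S(x) + q(\theta)], differentiating
% \int_a^b f(x;\theta)\,dx = 1 with respect to \theta gives
\int_a^b \big[p'(\theta)K(x) + q'(\theta)\big] f(x;\theta)\,dx = 0
\quad\Longrightarrow\quad
E[K(X)] = -\frac{q'(\theta)}{p'(\theta)}.

% Differentiating a second time:
\int_a^b \Big(\big[p''(\theta)K(x) + q''(\theta)\big]
 + \big[p'(\theta)K(x) + q'(\theta)\big]^2\Big) f(x;\theta)\,dx = 0 .

% Since p'(\theta)K(X) + q'(\theta) has mean zero, the second term equals
% p'(\theta)^2\,\mathrm{Var}[K(X)], so
\mathrm{Var}[K(X)]
 = -\frac{p''(\theta)\,E[K(X)] + q''(\theta)}{p'(\theta)^2}
 = \frac{p''(\theta)\,q'(\theta) - p'(\theta)\,q''(\theta)}{p'(\theta)^3}.
```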