Chapter 7: Problem 1
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be iid \(N(0, \theta), 0<\theta<\infty\). Show that \(\sum_{1}^{n} X_{i}^{2}\) is a sufficient statistic for \(\theta\).
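One standard route is the Neyman factorization theorem; a sketch (assuming the factorization criterion, not the conditional-distribution definition) writes the joint pdf of the sample:

```latex
% Joint pdf of an iid N(0, \theta) sample, factored for the Neyman criterion:
f(x_1,\dots,x_n;\theta)
  = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\theta}}
      \exp\!\left(-\frac{x_i^{2}}{2\theta}\right)
  = \underbrace{(2\pi\theta)^{-n/2}
      \exp\!\left(-\frac{1}{2\theta}\sum_{i=1}^{n} x_i^{2}\right)}_{k_1\left(\sum_{1}^{n} x_i^{2};\;\theta\right)}
    \cdot \underbrace{1}_{k_2(x_1,\dots,x_n)}
```

The first factor depends on the data only through \(\sum_{1}^{n} x_{i}^{2}\) and the second does not involve \(\theta\), so \(\sum_{1}^{n} X_{i}^{2}\) is sufficient for \(\theta\).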
Consider the family of probability density functions \(\{h(z ; \theta): \theta \in \Omega\}\), where \(h(z ; \theta)=1 / \theta, 0<z<\theta\), zero elsewhere.
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample of size \(n>1\) from a distribution with pdf \(f(x ; \theta)=\theta e^{-\theta x}, 0<x<\infty\), zero elsewhere.
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a normal distribution with mean zero and variance \(\theta, 0<\theta<\infty\). Show that \(\sum_{1}^{n} X_{i}^{2} / n\) is an unbiased estimator of \(\theta\) and has variance \(2 \theta^{2} / n\).
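A sketch of the required computation (assuming the standard chi-square facts rather than integrating directly): since \(X_{i}/\sqrt{\theta} \sim N(0,1)\), we have \(\sum_{1}^{n} X_{i}^{2}/\theta \sim \chi^{2}(n)\), with mean \(n\) and variance \(2n\). Hence

```latex
% Unbiasedness and variance via \sum X_i^2/\theta \sim \chi^2(n):
E\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i^{2}\right]
  = \frac{\theta}{n}\,E\!\left[\frac{\sum_{i=1}^{n} X_i^{2}}{\theta}\right]
  = \frac{\theta}{n}\cdot n = \theta,
\qquad
\operatorname{Var}\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i^{2}\right)
  = \frac{\theta^{2}}{n^{2}}\,
    \operatorname{Var}\!\left(\frac{\sum_{i=1}^{n} X_i^{2}}{\theta}\right)
  = \frac{\theta^{2}}{n^{2}}\cdot 2n = \frac{2\theta^{2}}{n}.
```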
Let \(\bar{X}\) denote the mean of the random sample \(X_{1}, X_{2}, \ldots, X_{n}\) from a gamma-type distribution with parameters \(\alpha>0\) and \(\beta=\theta>0\). Compute \(E\left[X_{1} \mid \bar{x}\right]\). Hint: Can you find directly a function \(\psi(\bar{X})\) of \(\bar{X}\) such that \(E[\psi(\bar{X})]=\theta\)? Is \(E\left(X_{1} \mid \bar{x}\right)=\psi(\bar{x})\)? Why?
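One standard route (a sketch, not necessarily the book's intended solution) uses exchangeability: the \(X_i\) are iid, so each conditional expectation \(E[X_i \mid \bar{X}]\) is the same, and

```latex
% Symmetry argument for E[X_1 | \bar{X}]:
\sum_{i=1}^{n} E\!\left[X_i \mid \bar{X}\right]
  = E\!\left[\sum_{i=1}^{n} X_i \,\Big|\, \bar{X}\right]
  = n\bar{X}
\quad\Longrightarrow\quad
E\!\left[X_1 \mid \bar{x}\right] = \bar{x}.
```

For the hint, \(E[\bar{X}]=\alpha\theta\), so \(\psi(\bar{X})=\bar{X}/\alpha\) satisfies \(E[\psi(\bar{X})]=\theta\); since \(E[X_1]=\alpha\theta\neq\theta\) in general, \(E[X_1 \mid \bar{x}]=\bar{x}\) need not equal \(\psi(\bar{x})=\bar{x}/\alpha\) (they agree only when \(\alpha=1\)), because \(X_1\) is not itself an unbiased estimator of \(\theta\).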
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be iid with the distribution \(N\left(\theta, \sigma^{2}\right),-\infty<\theta<\infty\). Prove that a necessary and sufficient condition that the statistics \(Z=\sum_{1}^{n} a_{i} X_{i}\) and \(Y=\sum_{1}^{n} X_{i}\), a complete sufficient statistic for \(\theta\), are independent is that \(\sum_{1}^{n} a_{i}=0\).
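A sketch of the computational core: \((Z, Y)\) is jointly normal (both are linear combinations of the \(X_i\)), so \(Z\) and \(Y\) are independent if and only if their covariance vanishes. Since the \(X_i\) are independent,

```latex
% Covariance of the two linear statistics:
\operatorname{Cov}(Z, Y)
  = \operatorname{Cov}\!\left(\sum_{i=1}^{n} a_i X_i,\ \sum_{j=1}^{n} X_j\right)
  = \sum_{i=1}^{n} a_i \operatorname{Var}(X_i)
  = \sigma^{2} \sum_{i=1}^{n} a_i,
```

which is zero exactly when \(\sum_{1}^{n} a_{i}=0\). (Alternatively, when \(\sum a_i=0\), \(Z\) is invariant under location shifts and hence ancillary for \(\theta\), so Basu's theorem gives independence from the complete sufficient statistic \(Y\).)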