Chapter 7: Problem 7
Let \(X\) have the pdf \(f_{X}(x ; \theta)=1 /(2 \theta)\), for \(-\theta<x<\theta\), zero elsewhere, where \(0<\theta<\infty\).
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a distribution that is \(N(\mu, \theta), 0<\theta<\infty\), where \(\mu\) is unknown. Let \(Y=\sum_{1}^{n}\left(X_{i}-\bar{X}\right)^{2} / n\) and let \(\mathcal{L}[\theta, \delta(y)]=[\theta-\delta(y)]^{2}\). If we consider decision functions of the form \(\delta(y)=b y\), where \(b\) does not depend upon \(y\), show that \(R(\theta, \delta)=\left(\theta^{2} / n^{2}\right)\left[\left(n^{2}-1\right) b^{2}-2 n(n-1) b+n^{2}\right]\). Show that \(b=n /(n+1)\) yields a minimum risk decision function of this form. Note that \(n Y /(n+1)\) is not an unbiased estimator of \(\theta\). With \(\delta(y)=n y /(n+1)\) and \(0<\theta<\infty\), determine \(\max _{\theta} R(\theta, \delta)\) if it exists.
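One way to verify the stated risk function uses the standard fact that \(n Y / \theta\) has a chi-square distribution with \(n-1\) degrees of freedom, so that \(E[Y]=\theta(n-1)/n\) and \(E[Y^{2}]=\theta^{2}(n^{2}-1)/n^{2}\). A sketch of the computation:
\[
R(\theta, \delta)=E\left[(\theta-b Y)^{2}\right]=\theta^{2}-2 b \theta E[Y]+b^{2} E\left[Y^{2}\right]=\frac{\theta^{2}}{n^{2}}\left[\left(n^{2}-1\right) b^{2}-2 n(n-1) b+n^{2}\right].
\]
Setting \(\partial R / \partial b=0\) gives \(2\left(n^{2}-1\right) b=2 n(n-1)\), that is, \(b=n /(n+1)\). Substituting this value back yields
\[
R\left(\theta, \frac{n y}{n+1}\right)=\frac{2 \theta^{2}}{n+1},
\]
which is unbounded as \(\theta \rightarrow \infty\); hence \(\max _{\theta} R(\theta, \delta)\) does not exist on \(0<\theta<\infty\).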
Show that the mean \(\bar{X}\) of a random sample of size \(n\) from a distribution having pdf \(f(x ; \theta)=(1 / \theta) e^{-(x / \theta)}\), \(0<x<\infty\), \(0<\theta<\infty\), zero elsewhere, is an unbiased estimator of \(\theta\) and has variance \(\theta^{2} / n\).
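A brief sketch of the check: the pdf is that of an exponential distribution with mean \(\theta\), so \(E[X_{i}]=\theta\) and \(\operatorname{Var}(X_{i})=\theta^{2}\), and therefore
\[
E[\bar{X}]=\frac{1}{n} \sum_{i=1}^{n} E\left[X_{i}\right]=\theta, \qquad \operatorname{Var}(\bar{X})=\frac{1}{n^{2}} \sum_{i=1}^{n} \operatorname{Var}\left(X_{i}\right)=\frac{\theta^{2}}{n}.
\]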
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from each of the following distributions involving the parameter \(\theta\). In each case find the mle of \(\theta\) and show that it is a sufficient statistic for \(\theta\) and hence a minimal sufficient statistic.
(a) \(b(1, \theta)\), where \(0 \leq \theta \leq 1\).
(b) Poisson with mean \(\theta>0\).
(c) Gamma with \(\alpha=3\) and \(\beta=\theta>0\).
(d) \(N(\theta, 1)\), where \(-\infty<\theta<\infty\).
(e) \(N(0, \theta)\), where \(0<\theta<\infty\).
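For reference, a sketch of the estimators these cases lead to (the full derivations are the exercise itself):
\[
\text{(a) } \hat{\theta}=\bar{X}, \quad \text{(b) } \hat{\theta}=\bar{X}, \quad \text{(c) } \hat{\theta}=\bar{X} / 3, \quad \text{(d) } \hat{\theta}=\bar{X}, \quad \text{(e) } \hat{\theta}=\frac{1}{n} \sum_{i=1}^{n} X_{i}^{2}.
\]
In each case the factorization theorem shows that \(\sum_{1}^{n} X_{i}\) (parts (a)-(d)) or \(\sum_{1}^{n} X_{i}^{2}\) (part (e)) is sufficient, and each mle is a one-to-one function of that statistic, which yields minimal sufficiency.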
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a normal distribution with mean zero and variance \(\theta, 0<\theta<\infty\). Show that \(\sum_{1}^{n} X_{i}^{2} / n\) is an unbiased estimator of \(\theta\) and has variance \(2 \theta^{2} / n\).
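A sketch of the verification: since \(X_{i} / \sqrt{\theta}\) is standard normal, \(X_{i}^{2} / \theta\) is \(\chi^{2}(1)\), so \(E[X_{i}^{2}]=\theta\) and \(\operatorname{Var}(X_{i}^{2})=2 \theta^{2}\). Hence
\[
E\left[\frac{1}{n} \sum_{1}^{n} X_{i}^{2}\right]=\theta, \qquad \operatorname{Var}\left(\frac{1}{n} \sum_{1}^{n} X_{i}^{2}\right)=\frac{1}{n^{2}} \cdot n \cdot 2 \theta^{2}=\frac{2 \theta^{2}}{n}.
\]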