Chapter 7: Problem 6
Let \(X_{1}, X_{2}, \ldots, X_{5}\) be iid with pdf \(f(x)=e^{-x}\), \(0<x<\infty\), zero elsewhere.
Let a random sample of size \(n\) be taken from a distribution that has the pdf \(f(x ; \theta)=(1 / \theta) \exp (-x / \theta) I_{(0, \infty)}(x)\). Find the mle and the MVUE of \(P(X \leq 2)\).
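By invariance of maximum likelihood, the mle of \(P(X \leq 2)=1-e^{-2 / \theta}\) is \(1-e^{-2 / \bar{x}}\), since \(\bar{X}\) is the mle of \(\theta\). A minimal Monte Carlo sanity check of that plug-in estimate (the value of \(\theta\) and the sample size below are illustrative assumptions, not part of the problem):

```python
import math
import random

random.seed(0)
theta = 3.0           # assumed true scale parameter (illustrative)
n = 10_000
sample = [random.expovariate(1 / theta) for _ in range(n)]

# mle of theta is the sample mean; by invariance, plug it into
# P(X <= 2) = 1 - exp(-2/theta) to get the mle of the probability.
theta_mle = sum(sample) / n
p_mle = 1 - math.exp(-2 / theta_mle)

true_p = 1 - math.exp(-2 / theta)
print(p_mle, true_p)
```

With a large sample the plug-in estimate should sit close to the true probability; the MVUE asked for in the problem is a different (unbiased) function of \(\sum X_i\) and is not computed here.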
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample with the common pdf \(f(x)=\theta^{-1} e^{-x / \theta}\), for \(x>0\), zero elsewhere; that is, \(f(x)\) is a \(\Gamma(1, \theta)\) pdf. (a) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\theta\). (b) Determine the MVUE of \(\theta\). (c) Determine the mle of \(\theta\). (d) Often, though, this pdf is written as \(f(x)=\tau e^{-\tau x}\), for \(x>0\), zero elsewhere; thus \(\tau=1 / \theta\). Use Theorem 6.1.2 to determine the mle of \(\tau\). (e) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\tau\), and show that \((n-1) /(n \bar{X})\) is the MVUE of \(\tau=1 / \theta\). Hence, as usual, the reciprocal of the mle of \(\theta\) is the mle of \(1 / \theta\); in this situation, however, the reciprocal of the MVUE of \(\theta\) is not the MVUE of \(1 / \theta\). (f) Compute the variances of each of the unbiased estimators in Parts (b) and (e).
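The contrast in part (e) is easy to see numerically: \((n-1)/(n\bar{X})\) is unbiased for \(\tau\), while the plug-in \(1/\bar{X}\) (the reciprocal of the MVUE of \(\theta\)) overshoots by the factor \(n/(n-1)\). A simulation sketch (the rate \(\tau\), sample size, and replication count are illustrative assumptions):

```python
import random

random.seed(1)
tau = 2.0          # assumed true rate (illustrative)
n, reps = 5, 20_000

mvue_sum = naive_sum = 0.0
for _ in range(reps):
    xbar = sum(random.expovariate(tau) for _ in range(n)) / n
    mvue_sum += (n - 1) / (n * xbar)   # MVUE of tau
    naive_sum += 1 / xbar              # reciprocal of the MVUE of theta

# The first average should be near tau = 2.0; the second should be
# near tau * n/(n-1) = 2.5, exhibiting the bias of the plug-in.
print(mvue_sum / reps, naive_sum / reps)
```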
Let \(X_{1}, \ldots, X_{n}\) be iid with pdf \(f(x ; \theta)=1 /(3 \theta)\), \(-\theta<x<2 \theta\), zero elsewhere, where \(\theta>0\).
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a distribution that is \(N(\mu, \theta)\), \(0<\theta<\infty\), where \(\mu\) is unknown. Let \(Y=\sum_{1}^{n}\left(X_{i}-\bar{X}\right)^{2} / n\) and let \(\mathcal{L}[\theta, \delta(y)]=[\theta-\delta(y)]^{2}\). If we consider decision functions of the form \(\delta(y)=b y\), where \(b\) does not depend upon \(y\), show that \(R(\theta, \delta)=\left(\theta^{2} / n^{2}\right)\left[\left(n^{2}-1\right) b^{2}-2 n(n-1) b+n^{2}\right]\). Show that \(b=n /(n+1)\) yields a minimum risk decision function of this form. Note that \(n Y /(n+1)\) is not an unbiased estimator of \(\theta\). With \(\delta(y)=n y /(n+1)\) and \(0<\theta<\infty\), determine \(\max _{\theta} R(\theta, \delta)\) if it exists.
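The stated risk follows from \(nY/\theta \sim \chi^{2}(n-1)\), which gives \(E[Y]=\theta(n-1)/n\) and \(E[Y^{2}]=\theta^{2}(n^{2}-1)/n^{2}\). A Monte Carlo check of the formula at the minimizing choice \(b=n/(n+1)\) (the values of \(\mu\), \(\theta\), \(n\), and the replication count are illustrative assumptions):

```python
import random

random.seed(2)
mu, theta = 1.0, 2.0     # assumed true parameters (illustrative)
n = 4
b = n / (n + 1)          # the minimizing choice b = n/(n+1)
reps = 40_000

loss_sum = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, theta ** 0.5) for _ in range(n)]
    xbar = sum(xs) / n
    y = sum((x - xbar) ** 2 for x in xs) / n
    loss_sum += (theta - b * y) ** 2   # squared-error loss of delta(y) = b*y

mc_risk = loss_sum / reps
formula = (theta**2 / n**2) * ((n**2 - 1) * b**2 - 2 * n * (n - 1) * b + n**2)
print(mc_risk, formula)
```

The two printed values should agree up to Monte Carlo error.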
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be iid \(N(0, \theta)\), \(0<\theta<\infty\). Show that \(\sum_{1}^{n} X_{i}^{2}\) is a sufficient statistic for \(\theta\).
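One route is the Neyman factorization theorem: with \(\theta\) the variance, the joint pdf depends on the data only through \(\sum x_i^2\),

```latex
\prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\theta}}
    \exp\!\left(-\frac{x_i^{2}}{2\theta}\right)
  = \underbrace{(2\pi\theta)^{-n/2}
      \exp\!\left(-\frac{1}{2\theta}\sum_{i=1}^{n} x_i^{2}\right)}_{%
      k_{1}\left(\sum_{i=1}^{n} x_i^{2};\,\theta\right)}
    \cdot
    \underbrace{1\vphantom{\sum_{i=1}^{n}}}_{k_{2}(x_{1}, \ldots, x_{n})},
```

so \(\sum_{1}^{n} X_{i}^{2}\) is sufficient for \(\theta\).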