Problem 5
Let \(X\) be a random variable with the pdf of a regular case of the exponential class, given by \(f(x ; \theta)=\exp [\theta K(x)+H(x)+q(\theta)]\), \(a<x<b\), \(\gamma<\theta<\delta\). Show that \(E[K(X)]=-q^{\prime}(\theta)\), provided these derivatives exist, by differentiating both members of the equality
$$
\int_{a}^{b} \exp [\theta K(x)+H(x)+q(\theta)]\, d x=1
$$
with respect to \(\theta\). By a second differentiation, find the variance of \(K(X)\).
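A sketch of the first differentiation (assuming differentiation under the integral sign is justified in the regular case):
$$
0=\frac{d}{d \theta} \int_{a}^{b} e^{\theta K(x)+H(x)+q(\theta)}\, d x=\int_{a}^{b}\left[K(x)+q^{\prime}(\theta)\right] e^{\theta K(x)+H(x)+q(\theta)}\, d x=E[K(X)]+q^{\prime}(\theta),
$$
so \(E[K(X)]=-q^{\prime}(\theta)\); differentiating once more gives \(\operatorname{Var}[K(X)]=-q^{\prime \prime}(\theta)\).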
Problem 5
Show that the sum of the observations of a random sample of size \(n\) from a gamma distribution that has pdf \(f(x ; \theta)=(1 / \theta) e^{-x / \theta}\), \(0<x<\infty\), \(0<\theta<\infty\), zero elsewhere, is a sufficient statistic for \(\theta\).
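One route (a sketch) is the factorization theorem: the joint pdf factors as
$$
\prod_{i=1}^{n} \frac{1}{\theta} e^{-x_{i} / \theta}=\left[\theta^{-n} e^{-\left(\sum_{1}^{n} x_{i}\right) / \theta}\right] \cdot 1,
$$
a function of \(\sum_{1}^{n} x_{i}\) and \(\theta\) multiplied by a function of the observations alone, so \(Y=\sum_{1}^{n} X_{i}\) is sufficient for \(\theta\).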
Problem 6
Given that \(f(x ; \theta)=\exp [\theta K(x)+H(x)+q(\theta)]\), \(a<x<b\), \(\gamma<\theta<\delta\), represents a regular case of the exponential class, show that the moment-generating function \(M(t)\) of \(Y=K(X)\) is \(M(t)=\exp [q(\theta)-q(\theta+t)]\), \(\gamma<\theta+t<\delta\).
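A sketch of the computation, assuming \(\gamma<\theta+t<\delta\):
$$
M(t)=E\left[e^{t K(X)}\right]=\int_{a}^{b} e^{(\theta+t) K(x)+H(x)+q(\theta)}\, d x=e^{q(\theta)-q(\theta+t)} \int_{a}^{b} e^{(\theta+t) K(x)+H(x)+q(\theta+t)}\, d x=e^{q(\theta)-q(\theta+t)},
$$
since the last integral is that of a pdf of the same exponential class with parameter \(\theta+t\) and hence equals 1.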
Problem 6
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a Poisson distribution with parameter \(\theta, 0<\theta<\infty .\) Let \(Y=\sum_{1}^{n} X_{i}\) and let \(\mathcal{L}[\theta, \delta(y)]=[\theta-\delta(y)]^{2} .\) If we restrict our considerations to decision functions of the form \(\delta(y)=b+y / n\), where \(b\) does not depend on \(y\), show that \(R(\theta, \delta)=b^{2}+\theta / n .\) What decision function of this form yields a uniformly smaller risk than every other decision function of this form? With this solution, say \(\delta\), and \(0<\theta<\infty\), determine \(\max _{\theta} R(\theta, \delta)\) if it exists.
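A sketch of the risk calculation under the stated squared-error loss: \(Y\) is Poisson with mean \(n \theta\), so \(E(Y / n)=\theta\) and \(\operatorname{Var}(Y / n)=\theta / n\), and
$$
R(\theta, \delta)=E\left\{[\theta-b-Y / n]^{2}\right\}=\operatorname{Var}(Y / n)+[\theta-b-E(Y / n)]^{2}=\frac{\theta}{n}+b^{2}.
$$
Among decision functions of this form, \(b=0\) gives the uniformly smallest risk \(\theta / n\), and \(\max _{\theta} R(\theta, \delta)\) then fails to exist because \(\theta / n\) is unbounded on \(0<\theta<\infty\).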
Problem 6
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from the uniform distribution with pdf \(f\left(x ; \theta_{1}, \theta_{2}\right)=1 /\left(2 \theta_{2}\right)\), \(\theta_{1}-\theta_{2}<x<\theta_{1}+\theta_{2}\), where \(-\infty<\theta_{1}<\infty\) and \(\theta_{2}>0\), and the pdf is equal to zero elsewhere. (a) Show that \(Y_{1}=\min \left(X_{i}\right)\) and \(Y_{n}=\max \left(X_{i}\right)\), the joint sufficient statistics for \(\theta_{1}\) and \(\theta_{2}\), are complete. (b) Find the MVUEs of \(\theta_{1}\) and \(\theta_{2}\).
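A sketch for part (b), assuming part (a): with \(Y_{1}=\min \left(X_{i}\right)\) and \(Y_{n}=\max \left(X_{i}\right)\),
$$
E\left(Y_{1}\right)=\theta_{1}-\theta_{2}+\frac{2 \theta_{2}}{n+1}, \quad E\left(Y_{n}\right)=\theta_{1}+\theta_{2}-\frac{2 \theta_{2}}{n+1},
$$
so \(\left(Y_{1}+Y_{n}\right) / 2\) and \((n+1)\left(Y_{n}-Y_{1}\right) /[2(n-1)]\) are unbiased functions of the complete sufficient statistics and hence the MVUEs of \(\theta_{1}\) and \(\theta_{2}\), respectively.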
Problem 6
Let \(X_{1}, X_{2}, \ldots, X_{5}\) be iid with pdf \(f(x)=e^{-x}\), \(0<x<\infty\), zero elsewhere. Show that \(\left(X_{1}+X_{2}\right) /\left(X_{1}+X_{2}+\cdots+X_{5}\right)\) and its denominator are independent.
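One possible approach (a sketch, viewing \(f(x)=e^{-x}\) as a member of the scale family \(f(x ; \theta)=(1 / \theta) e^{-x / \theta}\), \(0<\theta<\infty\)): with \(Z_{i}=X_{i} / \theta\), the \(Z_{i}\) are iid with pdf \(e^{-z}\), \(0<z<\infty\), and
$$
\frac{X_{1}+X_{2}}{X_{1}+X_{2}+\cdots+X_{5}}=\frac{Z_{1}+Z_{2}}{Z_{1}+Z_{2}+\cdots+Z_{5}},
$$
so the ratio is ancillary (its distribution is free of \(\theta\)), while \(X_{1}+\cdots+X_{5}\) is a complete sufficient statistic for \(\theta\); Basu's theorem then gives the independence.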
Problem 6
Let a random sample of size \(n\) be taken from a distribution of the discrete type with \(\operatorname{pmf} f(x ; \theta)=1 / \theta, x=1,2, \ldots, \theta\), zero elsewhere, where \(\theta\) is an unknown positive integer. (a) Show that the largest observation, say \(Y\), of the sample is a complete sufficient statistic for \(\theta\). (b) Prove that $$ \left[Y^{n+1}-(Y-1)^{n+1}\right] /\left[Y^{n}-(Y-1)^{n}\right] $$ is the unique MVUE of \(\theta\).
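A sketch of the unbiasedness step in part (b): since \(P(Y \leq y)=(y / \theta)^{n}\),
$$
P(Y=y)=\frac{y^{n}-(y-1)^{n}}{\theta^{n}}, \quad y=1,2, \ldots, \theta,
$$
and the sum defining the expectation telescopes,
$$
E\left[\frac{Y^{n+1}-(Y-1)^{n+1}}{Y^{n}-(Y-1)^{n}}\right]=\sum_{y=1}^{\theta} \frac{y^{n+1}-(y-1)^{n+1}}{\theta^{n}}=\frac{\theta^{n+1}}{\theta^{n}}=\theta ;
$$
completeness of \(Y\) from part (a) then yields uniqueness.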
Problem 6
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a Poisson distribution with mean \(\theta\). Find the conditional expectation \(E\left(X_{1}+2 X_{2}+3 X_{3} \mid \sum_{1}^{n} X_{i}\right)\).
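A sketch: given \(\sum_{1}^{n} X_{i}=y\), each \(X_{i}\) has the conditional distribution \(b(y, 1 / n)\), so
$$
E\left(X_{1}+2 X_{2}+3 X_{3} \mid \sum_{1}^{n} X_{i}=y\right)=(1+2+3) \frac{y}{n}=\frac{6 y}{n},
$$
that is, the conditional expectation equals \(6 \sum_{1}^{n} X_{i} / n\).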
Problem 7
In the preceding exercise, given that \(E(Y)=E[K(X)]=\theta\), prove that \(Y\) is \(N(\theta, 1)\). Hint: Consider \(M^{\prime}(0)=\theta\) and solve the resulting differential equation.
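A sketch of the hinted argument, using the form \(M(t)=\exp [q(\theta)-q(\theta+t)]\) from the preceding exercise: \(M^{\prime}(t)=-q^{\prime}(\theta+t) M(t)\), so \(M^{\prime}(0)=-q^{\prime}(\theta)=\theta\) for every \(\theta\) in \((\gamma, \delta)\), which forces \(q^{\prime}(\theta)=-\theta\) and \(q(\theta)=-\theta^{2} / 2+c\). Then
$$
M(t)=\exp [q(\theta)-q(\theta+t)]=\exp \left(\theta t+t^{2} / 2\right),
$$
the moment-generating function of a \(N(\theta, 1)\) distribution.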
Problem 7
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a Poisson distribution with parameter \(\theta>0\). From Remark 7.6.1, we know that \(E\left[(-1)^{X_{1}}\right]=e^{-2 \theta}\). (a) Show that \(E\left[(-1)^{X_{1}} \mid Y_{1}=y_{1}\right]=(1-2 / n)^{y_{1}}\), where \(Y_{1}=X_{1}+X_{2}+\cdots+X_{n}\). Hint: First show that the conditional pmf of \(X_{1}, X_{2}, \ldots, X_{n-1}\), given \(Y_{1}=y_{1}\), is multinomial, and hence that of \(X_{1}\), given \(Y_{1}=y_{1}\), is \(b\left(y_{1}, 1 / n\right)\). (b) Show that the mle of \(e^{-2 \theta}\) is \(e^{-2 \bar{X}}\). (c) Since \(y_{1}=n \bar{x}\), show that \((1-2 / n)^{y_{1}}\) is approximately equal to \(e^{-2 \bar{x}}\) when \(n\) is large.
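A sketch of part (a), using the hint: given \(Y_{1}=y_{1}\), \(X_{1}\) is \(b\left(y_{1}, 1 / n\right)\), so by the binomial theorem
$$
E\left[(-1)^{X_{1}} \mid Y_{1}=y_{1}\right]=\sum_{k=0}^{y_{1}}\binom{y_{1}}{k}\left(-\frac{1}{n}\right)^{k}\left(1-\frac{1}{n}\right)^{y_{1}-k}=\left(1-\frac{2}{n}\right)^{y_{1}} ;
$$
for part (c), \((1-2 / n)^{n \bar{x}}=\left[(1-2 / n)^{n}\right]^{\bar{x}} \rightarrow e^{-2 \bar{x}}\) as \(n \rightarrow \infty\).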