Chapter 3: Problem 12
The joint density of \(X\) and \(Y\) is given by
$$
f(x, y)=\frac{e^{-x / y} e^{-y}}{y}, \quad 0<x<\infty, \; 0<y<\infty
$$
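The density above factors as \(f(x, y)=\left(e^{-x / y} / y\right) e^{-y}\): \(Y\) is exponential with rate 1, and, given \(Y=y\), \(X\) is exponential with mean \(y\). A minimal Monte Carlo sketch (sample size chosen here for illustration, not taken from the text) checks this factorization by verifying that \(E[X]=E[E[X \mid Y]]=E[Y]=1\):

```python
import random

random.seed(0)

# The joint density factors as f(x, y) = (e^{-x/y} / y) * e^{-y}:
# Y is exponential with rate 1, and, given Y = y, X is exponential with
# mean y.  Sample from that factorization and check E[X] = E[Y] = 1.
n = 200_000
total_x = 0.0
for _ in range(n):
    y = random.expovariate(1.0)        # Y ~ Exponential(rate 1)
    x = random.expovariate(1.0 / y)    # X | Y = y ~ Exponential(mean y)
    total_x += x

mean_x = total_x / n                   # should be close to E[Y] = 1
```

Sampling hierarchically like this avoids integrating the joint density directly and mirrors the conditioning used throughout the chapter.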
Let \(X_{1}, \ldots, X_{n}\) be independent random variables having a common distribution function that is specified up to an unknown parameter \(\theta\). Let \(T=T(\mathbf{X})\) be a function of the data \(\mathbf{X}=\left(X_{1}, \ldots, X_{n}\right)\). If the conditional distribution of \(X_{1}, \ldots, X_{n}\) given \(T(\mathbf{X})\) does not depend on \(\theta\), then \(T(\mathbf{X})\) is said to be a sufficient statistic for \(\theta\). In the following cases, show that \(T(\mathbf{X})=\sum_{i=1}^{n} X_{i}\) is a sufficient statistic for \(\theta\). (a) The \(X_{i}\) are normal with mean \(\theta\) and variance \(1\). (b) The density of \(X_{i}\) is \(f(x)=\theta e^{-\theta x}, x>0\). (c) The mass function of \(X_{i}\) is \(p(x)=\theta^{x}(1-\theta)^{1-x}, x=0,1\), where \(0<\theta<1\). (d) The \(X_{i}\) are Poisson random variables with mean \(\theta\).
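As a sketch of the kind of computation involved, consider case (b). The joint density of the sample is
$$
f\left(x_{1}, \ldots, x_{n}\right)=\prod_{i=1}^{n} \theta e^{-\theta x_{i}}=\theta^{n} e^{-\theta \sum_{i=1}^{n} x_{i}},
$$
which depends on the data only through \(t=\sum_{i=1}^{n} x_{i}\). Since \(T\) then has a gamma density \(f_{T}(t)=\theta e^{-\theta t}(\theta t)^{n-1} /(n-1)!\), the conditional density of \(X_{1}, \ldots, X_{n}\) given \(T=t\) is
$$
\frac{\theta^{n} e^{-\theta t}}{\theta e^{-\theta t}(\theta t)^{n-1} /(n-1)!}=\frac{(n-1)!}{t^{n-1}}, \quad \sum_{i=1}^{n} x_{i}=t,
$$
which is free of \(\theta\). The other cases follow the same pattern with the appropriate distribution of \(T\).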
Two players take turns shooting at a target, with each shot by player \(i\) hitting the target with probability \(p_{i}, i=1,2\). Shooting ends when two consecutive shots hit the target. Let \(\mu_{i}\) denote the mean number of shots taken when player \(i\) shoots first, \(i=1,2\). (a) Find \(\mu_{1}\) and \(\mu_{2}\). (b) Let \(h_{i}\) denote the mean number of times that the target is hit when player \(i\) shoots first, \(i=1,2\). Find \(h_{1}\) and \(h_{2}\).
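Before setting up the conditioning equations for (a) and (b), it can help to have numbers to check against. The sketch below simulates the game for the hypothetical values \(p_{1}=0.7\), \(p_{2}=0.4\) (not part of the problem, which leaves the \(p_i\) general) and estimates \(\mu_{1}\) and \(h_{1}\):

```python
import random

random.seed(1)

# Hypothetical hit probabilities (p1, p2), chosen only for illustration.
p = (0.7, 0.4)

def play(first):
    """One game: players alternate, starting with player `first` (0 or 1);
    shooting ends after two consecutive hits.  Returns (shots, hits)."""
    shooter = first
    shots = hits = 0
    prev_hit = False
    while True:
        shots += 1
        hit = random.random() < p[shooter]
        if hit:
            hits += 1
            if prev_hit:                # two consecutive hits: stop
                return shots, hits
        prev_hit = hit
        shooter = 1 - shooter           # alternate players

trials = 100_000
results = [play(0) for _ in range(trials)]   # player 1 shoots first
mu1 = sum(s for s, _ in results) / trials    # estimate of mu_1
h1 = sum(h for _, h in results) / trials     # estimate of h_1
```

The analytic answers come from conditioning on the outcome of the current shot and on whether the previous shot hit; the simulation provides a check on those recursions.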
In the list example of Section 3.6.1 suppose that the initial ordering at time \(t=0\) is determined completely at random; that is, initially all \(n!\) permutations are equally likely. Following the front-of-the-line rule, compute the expected position of the element requested at time \(t\). Hint: To compute \(P\{e_{j} \text{ precedes } e_{i} \text{ at time } t\}\), condition on whether or not either \(e_{i}\) or \(e_{j}\) has ever been requested prior to \(t\).
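A simulation sketch of the front-of-the-line (move-to-front) rule, using hypothetical request probabilities that are not given in the problem, estimates the expected position empirically:

```python
import random

random.seed(2)

# Hypothetical request probabilities for n = 4 elements (illustrative only;
# the problem works with general P_i).  Element i is requested with
# probability P[i] at each time step.
P = [0.4, 0.3, 0.2, 0.1]
n = len(P)
t = 20                       # examine the request made at time t
trials = 50_000

total_pos = 0
for _ in range(trials):
    order = random.sample(range(n), n)       # all n! orderings equally likely
    for step in range(t):                    # requests at times 1, ..., t
        i = random.choices(range(n), weights=P)[0]
        pos = order.index(i) + 1             # 1-based position when requested
        del order[pos - 1]
        order.insert(0, i)                   # front-of-the-line rule
    total_pos += pos                         # position of the time-t request

avg_pos = total_pos / trials
```

With a random initial ordering the transient term in the hint's probability decays quickly, so for moderate \(t\) the estimate is already close to the long-run expected position.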
Find the expected number of flips of a coin, which comes up heads with probability \(p\), that are necessary to obtain the pattern \(h, t, h, h, t, h, t, h\).
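A Monte Carlo sketch for the hypothetical choice \(p=1/2\) (the problem asks for general \(p\)):

```python
import random

random.seed(3)

# Estimate the expected number of flips until the pattern h,t,h,h,t,h,t,h
# first appears, for the illustrative choice p = 1/2.
pattern = "hthhthth"
p = 0.5
trials = 20_000

total = 0
for _ in range(trials):
    recent = ""
    flips = 0
    while not recent.endswith(pattern):
        flips += 1
        # Append the new flip and keep only the last len(pattern) flips.
        recent = (recent + ("h" if random.random() < p else "t"))[-len(pattern):]
    total += flips

est = total / trials        # Monte Carlo estimate of the expected flip count
```

A standard check uses the pattern's overlaps with itself: \(h,t,h,h,t,h,t,h\) matches its own prefix at lengths 1 (\(h\)), 3 (\(h,t,h\)), and 8, which suggests \(E[N]=1/p+1/\left(p^{2} q\right)+1/\left(p^{5} q^{3}\right)\) with \(q=1-p\); at \(p=1/2\) this gives \(2+8+256=266\), which the estimate above should approximate.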
Suppose that we want to predict the value of a random variable \(X\) by using one of the predictors \(Y_{1}, \ldots, Y_{n}\), each of which satisfies \(E\left[Y_{i} \mid X\right]=X .\) Show that the predictor \(Y_{i}\) that minimizes \(E\left[\left(Y_{i}-X\right)^{2}\right]\) is the one whose variance is smallest. Hint: Compute \(\operatorname{Var}\left(Y_{i}\right)\) by using the conditional variance formula.
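Following the hint, the computation can be sketched as follows. Since \(E\left[Y_{i} \mid X\right]=X\),
$$
E\left[\left(Y_{i}-X\right)^{2}\right]=E\left[E\left[\left(Y_{i}-X\right)^{2} \mid X\right]\right]=E\left[\operatorname{Var}\left(Y_{i} \mid X\right)\right],
$$
while the conditional variance formula gives
$$
\operatorname{Var}\left(Y_{i}\right)=E\left[\operatorname{Var}\left(Y_{i} \mid X\right)\right]+\operatorname{Var}\left(E\left[Y_{i} \mid X\right]\right)=E\left[\left(Y_{i}-X\right)^{2}\right]+\operatorname{Var}(X).
$$
Since \(\operatorname{Var}(X)\) does not depend on \(i\), minimizing \(\operatorname{Var}\left(Y_{i}\right)\) is equivalent to minimizing the mean square error.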