Chapter 3: Problem 19
Prove that if \(X\) and \(Y\) are jointly continuous, then $$ E[X]=\int_{-\infty}^{\infty} E[X \mid Y=y] f_{Y}(y) d y $$
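A sketch of the standard argument, assuming \(E[|X|]<\infty\) so that Fubini's theorem justifies interchanging the order of integration: write \(f_{X \mid Y}(x \mid y)=f(x, y) / f_{Y}(y)\) for the conditional density (wherever \(f_{Y}(y)>0\)), and then $$ \int_{-\infty}^{\infty} E[X \mid Y=y] f_{Y}(y) \, dy=\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x \frac{f(x, y)}{f_{Y}(y)} f_{Y}(y) \, dx \, dy=\int_{-\infty}^{\infty} x\left(\int_{-\infty}^{\infty} f(x, y) \, dy\right) dx=\int_{-\infty}^{\infty} x f_{X}(x) \, dx=E[X] $$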
Related Exercises
Let \(X_{1}\) and \(X_{2}\) be independent geometric random variables having the same parameter \(p\). Guess the value of $$ P\left\{X_{1}=i \mid X_{1}+X_{2}=n\right\} $$ Hint: Suppose a coin having probability \(p\) of coming up heads is continually flipped. If the second head occurs on flip number \(n\), what is the conditional probability that the first head was on flip number \(i, i=1, \ldots, n-1\)? Verify your guess analytically.
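A quick Monte Carlo check of whatever guess you arrive at (a minimal sketch; the parameter values \(p=0.3\) and \(n=8\) are arbitrary choices):

```python
import random
from collections import Counter

def geometric(p, rng):
    """Number of flips of a p-coin until the first head (support 1, 2, ...)."""
    flips = 1
    while rng.random() >= p:
        flips += 1
    return flips

rng = random.Random(0)
p, n, trials = 0.3, 8, 500_000
counts, kept = Counter(), 0
for _ in range(trials):
    x1, x2 = geometric(p, rng), geometric(p, rng)
    if x1 + x2 == n:          # condition on the sum by rejection
        counts[x1] += 1
        kept += 1

# Empirical conditional probabilities P{X1 = i | X1 + X2 = n},
# to compare against the guessed formula for i = 1, ..., n - 1.
for i in range(1, n):
    print(i, counts[i] / kept)
```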
Consider a large population of families, and suppose that the numbers of children in the different families are independent Poisson random variables with mean \(\lambda\). Show that the number of siblings of a randomly chosen child is also Poisson distributed with mean \(\lambda\).
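A numerical sketch of the point of the exercise, assuming NumPy is available; the key subtlety is that picking a child uniformly at random selects a family with probability proportional to its size:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n_families, n_children = 2.0, 200_000, 100_000

# Family sizes are i.i.d. Poisson(lam); drop childless families, since
# a randomly chosen child can never come from one.
sizes = rng.poisson(lam, n_families)
sizes = sizes[sizes > 0]

# Choosing a child uniformly at random selects a family with probability
# proportional to its size (size-biased sampling); the chosen child has
# (family size - 1) siblings.
picked = rng.choice(sizes, size=n_children, p=sizes / sizes.sum())
siblings = picked - 1

# The claim: sibling counts are Poisson(lam), so the sample mean and
# sample variance should both be close to lam.
print("mean:", siblings.mean(), "variance:", siblings.var())
```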
Show that
(a) \(E[X Y \mid Y=y]=y E[X \mid Y=y]\)
(b) \(E[g(X, Y) \mid Y=y]=E[g(X, y) \mid Y=y]\)
(c) \(E[X Y]=E[Y E[X \mid Y]]\)
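For part (c), one route (a sketch, combining part (a) with the identity of Problem 19 above): $$ E[X Y]=\int_{-\infty}^{\infty} E[X Y \mid Y=y] f_{Y}(y) \, dy=\int_{-\infty}^{\infty} y E[X \mid Y=y] f_{Y}(y) \, dy=E\big[Y E[X \mid Y]\big] $$ where the last step recognizes the integrand as the function \(y \mapsto y E[X \mid Y=y]\) of \(Y\), integrated against \(f_{Y}\).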
You have two opponents with whom you alternate play. Whenever you play \(A\), you win with probability \(p_{A}\); whenever you play \(B\), you win with probability \(p_{B}\), where \(p_{B}>p_{A}\). If your objective is to minimize the expected number of games you need to play to win two in a row, should you start with \(A\) or with \(B\)? Hint: Let \(E\left[N_{i}\right]\) denote the mean number of games needed if you initially play \(i\). Derive an expression for \(E\left[N_{A}\right]\) that involves \(E\left[N_{B}\right]\); write down the equivalent expression for \(E\left[N_{B}\right]\) and then subtract.
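A simulation makes the comparison concrete and gives something to check the algebra against (a minimal sketch; the win probabilities \(p_{A}=0.4\) and \(p_{B}=0.7\) are arbitrary choices with \(p_{B}>p_{A}\)):

```python
import random

def mean_games(first, p_a, p_b, trials=200_000, seed=0):
    """Estimate the expected number of games until two wins in a row,
    alternating opponents and playing `first` ('A' or 'B') initially."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        opp, games, streak = first, 0, 0
        while streak < 2:
            p = p_a if opp == 'A' else p_b
            games += 1
            streak = streak + 1 if rng.random() < p else 0
            opp = 'B' if opp == 'A' else 'A'   # alternate opponents
        total += games
    return total / trials

p_a, p_b = 0.4, 0.7
print("start with A:", mean_games('A', p_a, p_b))
print("start with B:", mean_games('B', p_a, p_b))
```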
If \(X_{i}, i=1, \ldots, n\) are independent normal random variables, with \(X_{i}\) having mean \(\mu_{i}\) and variance 1, then the random variable \(\sum_{i=1}^{n} X_{i}^{2}\) is said to be a noncentral chi-squared random variable.
(a) If \(X\) is a normal random variable having mean \(\mu\) and variance 1, show, for \(|t|<1 / 2\), that the moment generating function of \(X^{2}\) is $$ (1-2 t)^{-1 / 2} e^{\frac{t \mu^{2}}{1-2 t}} $$
(b) Derive the moment generating function of the noncentral chi-squared random variable \(\sum_{i=1}^{n} X_{i}^{2}\), and show that its distribution depends on the sequence of means \(\mu_{1}, \ldots, \mu_{n}\) only through the sum of their squares. As a result, we say that \(\sum_{i=1}^{n} X_{i}^{2}\) is a noncentral chi-squared random variable with parameters \(n\) and \(\theta=\sum_{i=1}^{n} \mu_{i}^{2}\).
(c) If all \(\mu_{i}=0\), then \(\sum_{i=1}^{n} X_{i}^{2}\) is called a chi-squared random variable with \(n\) degrees of freedom. Determine, by differentiating its moment generating function, its expected value and variance.
(d) Let \(K\) be a Poisson random variable with mean \(\theta / 2\), and suppose that, conditional on \(K=k\), the random variable \(W\) has a chi-squared distribution with \(n+2 k\) degrees of freedom. Show, by computing its moment generating function, that \(W\) is a noncentral chi-squared random variable with parameters \(n\) and \(\theta\).
(e) Find the expected value and variance of a noncentral chi-squared random variable with parameters \(n\) and \(\theta\).
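A Monte Carlo sanity check of the moment generating function claimed in part (a) (a sketch, assuming NumPy; \(\mu=1.5\) and \(t=0.2\) are arbitrary choices with \(|t|<1/2\)):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, t, n_samples = 1.5, 0.2, 1_000_000   # any mu works; need |t| < 1/2

x = rng.normal(mu, 1.0, n_samples)
mc = np.exp(t * x**2).mean()                                # Monte Carlo E[exp(t X^2)]
closed = (1 - 2*t) ** -0.5 * np.exp(t * mu**2 / (1 - 2*t))  # formula from part (a)
print("Monte Carlo:", mc, " closed form:", closed)          # should agree closely
```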