
Prove that if \(X\) and \(Y\) are jointly continuous, then $$ E[X]=\int_{-\infty}^{\infty} E[X \mid Y=y] f_{Y}(y) d y $$

Short Answer

To prove the formula, first obtain the marginal probability density function of \(Y\), \(f_Y(y)\), by integrating the joint probability density function \(f_{X,Y}(x,y)\) over \(x\). Next, express the conditional probability density function of \(X\) given \(Y = y\) as \(f_{X \mid Y}(x \mid y) = f_{X,Y}(x,y)/f_Y(y)\), and write the conditional expectation \(E[X \mid Y=y]\) as the integral of \(x\) times this conditional density. Finally, substitute these expressions into \(\int_{-\infty}^{\infty} E[X \mid Y=y] f_{Y}(y) dy\); the factor \(f_Y(y)\) cancels, and interchanging the order of integration recovers \(\int_{-\infty}^{\infty} x f_X(x) dx = E[X]\), which establishes \(E[X] = \int_{-\infty}^{\infty} E[X \mid Y=y] f_{Y}(y) dy\).

Step by step solution

01

Calculate the marginal probability density function of Y

We will begin by calculating the marginal probability density function \(f_Y(y)\). This is found by integrating the joint probability density function with respect to \(x\), i.e., \(f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) dx \).
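As a quick illustration (not part of the original exercise), here is a small symbolic sketch in SymPy using a hypothetical joint density \(f_{X,Y}(x,y) = x + y\) on the unit square; the marginal of \(Y\) is computed exactly as in the formula above.

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)

# Hypothetical joint density on the unit square [0, 1]^2 (illustration only);
# it integrates to 1 over the square, so it is a valid density.
f_xy = x + y

# Marginal density of Y: integrate the joint density over x
f_Y = sp.integrate(f_xy, (x, 0, 1))
print(f_Y)  # y + 1/2, valid for 0 <= y <= 1
```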
02

Express the conditional probability density function of X given Y = y

Next, we express the conditional probability density function \(f_{X \mid Y}(x \mid y)\) in terms of the joint probability density function and the marginal probability density function of \(Y\). For every \(y\) with \(f_Y(y) > 0\), it is given by \(f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}\).
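Continuing the same hypothetical example, the conditional density is the joint density divided by the marginal just computed; checking that it integrates to 1 over \(x\) is a useful sanity test.

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f_xy = x + y                          # hypothetical joint density on [0, 1]^2
f_Y = sp.integrate(f_xy, (x, 0, 1))   # marginal of Y: y + 1/2

# Conditional density of X given Y = y (defined wherever f_Y(y) > 0)
f_X_given_Y = sp.simplify(f_xy / f_Y)
print(f_X_given_Y)                    # equals (x + y)/(y + 1/2)

# Sanity check: a conditional density must integrate to 1 over x
print(sp.simplify(sp.integrate(f_X_given_Y, (x, 0, 1))))  # 1
```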
03

Calculate the conditional expectation of X given Y = y

Now we will calculate the conditional expectation of \(X\) given \(Y = y\). This is computed by integrating the product of \(x\) and the conditional probability density function of \(X\) given \(Y = y\). Formally, this can be written as: \(E[X \mid Y=y] = \int_{-\infty}^{\infty} x \cdot f_{X \mid Y}(x \mid y) dx\).
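For the same hypothetical density, the conditional expectation comes out as a function of \(y\) alone, as expected.

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f_xy = x + y                          # hypothetical joint density on [0, 1]^2
f_Y = sp.integrate(f_xy, (x, 0, 1))   # y + 1/2
f_X_given_Y = f_xy / f_Y              # conditional density of X given Y = y

# Conditional expectation E[X | Y = y]: integrate x against the conditional density
E_X_given_Y = sp.simplify(sp.integrate(x * f_X_given_Y, (x, 0, 1)))
print(E_X_given_Y)                    # (3*y + 2)/(6*y + 3), or an equivalent form
```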
04

Compute the expected value of X

Finally, we compute the expected value of \(X\) by integrating the product of the conditional expectation of \(X\) given \(Y = y\) and the marginal probability density function of \(Y\); that is, we prove that \(E[X] = \int_{-\infty}^{\infty} E[X \mid Y=y] f_{Y}(y) dy\). Substituting the expression for the conditional expectation derived in the previous step into the right-hand side gives \(\int_{-\infty}^{\infty} E[X \mid Y=y] f_{Y}(y) dy = \int_{-\infty}^{\infty} \left( \int_{-\infty}^{\infty} x \cdot \frac{f_{X,Y}(x,y)}{f_Y(y)} dx \right) f_Y(y) dy\). Since \(f_Y(y)\) does not depend on \(x\), it can be brought inside the inner integral, where it cancels with the denominator, leaving \(\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x \cdot f_{X,Y}(x,y) dx dy\). Interchanging the order of integration (justified by Fubini's theorem when \(E[|X|] < \infty\)) and using the fact that \(\int_{-\infty}^{\infty} f_{X,Y}(x,y) dy = f_X(x)\), this equals \(\int_{-\infty}^{\infty} x \left( \int_{-\infty}^{\infty} f_{X,Y}(x,y) dy \right) dx = \int_{-\infty}^{\infty} x \cdot f_X(x) dx = E[X]\). Thus the right-hand side reduces to \(E[X]\), and we have proved the given formula, \(E[X] = \int_{-\infty}^{\infty} E[X \mid Y=y] f_{Y}(y) dy\).
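As a final check of the identity (still using the same hypothetical density, not one from the textbook), the conditional expectation integrated against \(f_Y\) can be compared with \(E[X]\) computed directly from the joint density; both sides agree.

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f_xy = x + y                                            # hypothetical joint density on [0, 1]^2
f_Y = sp.integrate(f_xy, (x, 0, 1))                     # marginal of Y
E_X_given_Y = sp.integrate(x * f_xy / f_Y, (x, 0, 1))   # E[X | Y = y]

# Left-hand side of the identity: integrate E[X | Y = y] against the marginal of Y
lhs = sp.integrate(E_X_given_Y * f_Y, (y, 0, 1))

# Right-hand side: E[X] computed directly from the joint density
rhs = sp.integrate(x * f_xy, (x, 0, 1), (y, 0, 1))

print(lhs, rhs, sp.simplify(lhs - rhs) == 0)            # 7/12 7/12 True
```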


Most popular questions from this chapter

Let \(X_{1}\) and \(X_{2}\) be independent geometric random variables having the same parameter \(p\). Guess the value of $$ P\left\{X_{1}=i \mid X_{1}+X_{2}=n\right\} $$ Hint: Suppose a coin having probability \(p\) of coming up heads is continually flipped. If the second head occurs on flip number \(n\), what is the conditional probability that the first head was on flip number \(i, i=1, \ldots, n-1 ?\) Verify your guess analytically.

Consider a large population of families, and suppose that the number of children in the different families are independent Poisson random variables with mean \(\lambda\). Show that the number of siblings of a randomly chosen child is also Poisson distributed with mean \(\lambda\).

Show that (a) \(\quad E[X Y \mid Y=y]=y E[X \mid Y=y]\) (b) \(E[g(X, Y) \mid Y=y]=E[g(X, y) \mid Y=y]\) (c) \(E[X Y]=E[Y E[X \mid Y]]\)

You have two opponents with whom you alternate play. Whenever you play \(A\), you win with probability \(p_{A}\); whenever you play \(B\), you win with probability \(p_{B}\), where \(p_{B}>p_{A}\). If your objective is to minimize the expected number of games you need to play to win two in a row, should you start with \(A\) or with \(B\) ? Hint: Let \(E\left[N_{i}\right]\) denote the mean number of games needed if you initially play \(i\). Derive an expression for \(E\left[N_{A}\right]\) that involves \(E\left[N_{B}\right] ;\) write down the equivalent expression for \(E\left[N_{B}\right]\) and then subtract.

If \(X_{i}, i=1, \ldots, n\) are independent normal random variables, with \(X_{i}\) having mean \(\mu_{i}\) and variance 1, then the random variable \(\sum_{i=1}^{n} X_{i}^{2}\) is said to be a noncentral chi-squared random variable. (a) If \(X\) is a normal random variable having mean \(\mu\) and variance 1, show, for \(|t|<1 / 2\), that the moment generating function of \(X^{2}\) is $$ (1-2 t)^{-1 / 2} e^{\frac{t \mu^{2}}{1-2 t}} $$ (b) Derive the moment generating function of the noncentral chi-squared random variable \(\sum_{i=1}^{n} X_{i}^{2}\), and show that its distribution depends on the sequence of means \(\mu_{1}, \ldots, \mu_{n}\) only through the sum of their squares. As a result, we say that \(\sum_{i=1}^{n} X_{i}^{2}\) is a noncentral chi-squared random variable with parameters \(n\) and \(\theta=\sum_{i=1}^{n} \mu_{i}^{2}\). (c) If all \(\mu_{i}=0\), then \(\sum_{i=1}^{n} X_{i}^{2}\) is called a chi-squared random variable with \(n\) degrees of freedom. Determine, by differentiating its moment generating function, its expected value and variance. (d) Let \(K\) be a Poisson random variable with mean \(\theta / 2\), and suppose that conditional on \(K=k\), the random variable \(W\) has a chi-squared distribution with \(n+2 k\) degrees of freedom. Show, by computing its moment generating function, that \(W\) is a noncentral chi-squared random variable with parameters \(n\) and \(\theta\). (e) Find the expected value and variance of a noncentral chi-squared random variable with parameters \(n\) and \(\theta\).
