
Suppose \(X\) and \(Y\) are independent continuous random variables. Show that $$ E[X \mid Y=y]=E[X] \text { for all } y $$

Short Answer

Using the definition of conditional expectation together with the independence of X and Y, the conditional probability density function of X given Y = y reduces to the marginal density of X. Substituting this back into the definition of conditional expectation, the resulting integral is exactly the expected value of X. Thus, for independent continuous random variables X and Y, \(E[X \mid Y=y] = E[X]\), and this holds for all y.

Step by step solution

01

Write down the definition of conditional expectation

To find the conditional expectation of the random variable X given Y equals y, we can use the definition: \(E[X \mid Y=y] = \int_{-\infty}^{\infty} x \cdot f_{X \mid Y}(x \mid y) dx\) where \(f_{X \mid Y}(x \mid y)\) is the conditional probability density function of X given Y = y.
02

Use the property of independence

Since X and Y are independent random variables, their joint probability density function factors into the product of their marginal probability density functions: \(f_{X,Y}(x,y) = f_X(x) \cdot f_Y(y)\). Consequently, for any y with \(f_Y(y) > 0\), the conditional probability density function of X given Y = y is \(f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} = \frac{f_X(x) \cdot f_Y(y)}{f_Y(y)} = f_X(x)\).
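As a numerical sanity check of this step (my own illustration, not part of the textbook solution), we can pick a concrete independent pair, say \(X \sim \text{Exp}(1)\) and \(Y \sim \text{Exp}(2)\), and confirm that the ratio defining the conditional density collapses to \(f_X(x)\) no matter which y we condition on:

```python
import math

# Hypothetical concrete choice: X ~ Exp(1), Y ~ Exp(2), independent.

def f_X(x):          # marginal density of X ~ Exp(1)
    return math.exp(-x)

def f_Y(y):          # marginal density of Y ~ Exp(2)
    return 2.0 * math.exp(-2.0 * y)

def f_XY(x, y):      # joint density; the product form holds by independence
    return f_X(x) * f_Y(y)

def f_X_given_Y(x, y):   # conditional density via the ratio definition
    return f_XY(x, y) / f_Y(y)

# The ratio equals f_X(x) regardless of the conditioning value y:
for y in (0.1, 1.0, 5.0):
    assert abs(f_X_given_Y(0.7, y) - f_X(0.7)) < 1e-12
```

The loop passes for every y tested, mirroring the algebraic cancellation of \(f_Y(y)\) in the derivation above.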
03

Substitute the conditional PDF

Now that we have the conditional probability density function for independent random variables X and Y, we can substitute it back into the definition of the conditional expectation: \(E[X \mid Y=y] = \int_{-\infty}^{\infty} x \cdot f_{X \mid Y}(x \mid y) dx = \int_{-\infty}^{\infty} x \cdot f_X(x) dx\)
04

Recognize the expected value of X

The integral in the last expression represents the expected value of the random variable X: \(E[X] = \int_{-\infty}^{\infty} x \cdot f_X(x) dx\)
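To make the integral concrete (again my own illustration, with a hypothetical choice of density), we can approximate \(\int_{-\infty}^{\infty} x \cdot f_X(x)\,dx\) numerically for \(X \sim \text{Exp}(1)\), whose mean is exactly 1:

```python
import math

# Hypothetical concrete choice: X ~ Exp(1), so E[X] = 1 exactly.
def f_X(x):
    return math.exp(-x)

def expected_value(f, lo=0.0, hi=50.0, n=200_000):
    # Simple midpoint rule; the upper limit 50 makes the truncated
    # exponential tail negligible.
    h = (hi - lo) / n
    return sum((lo + (i + 0.5) * h) * f(lo + (i + 0.5) * h)
               for i in range(n)) * h

print(round(expected_value(f_X), 4))   # ≈ 1.0
```

The quadrature recovers \(E[X] = 1\) to within the discretization error, matching the closed-form integral.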
05

Make the conclusion

Since the integral in Step 3 equals the expected value of X, we conclude that \(E[X \mid Y=y] = E[X]\) for all y, as required. In other words, for independent random variables X and Y, conditioning on the value of Y does not change the expected value of X.
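The conclusion can also be checked by simulation (a sketch of my own, not from the solution): sample many independent pairs, then compare the overall mean of X with its mean over only those samples where Y lands near a chosen value y. Here \(X \sim \text{Uniform}(0, 2)\) (so \(E[X] = 1\)) and \(Y \sim \text{Exp}(1)\) are hypothetical choices:

```python
import random

# Monte Carlo check that conditioning on Y ≈ y leaves E[X] unchanged
# when X and Y are independent.
random.seed(0)
n = 500_000
pairs = [(random.uniform(0.0, 2.0), random.expovariate(1.0))
         for _ in range(n)]

overall_mean = sum(x for x, _ in pairs) / n

# Condition on Y falling in a narrow window around y = 0.5.
window = [x for x, y in pairs if abs(y - 0.5) < 0.05]
conditional_mean = sum(window) / len(window)

print(round(overall_mean, 2), round(conditional_mean, 2))   # both ≈ 1.0
```

Repeating with any other window center y gives the same conditional mean up to sampling noise, which is exactly the statement \(E[X \mid Y=y] = E[X]\) for all y.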


Most popular questions from this chapter

Let \(Y\) be a gamma random variable with parameters \((s, \alpha)\). That is, its density is $$ f_{Y}(y)=C e^{-\alpha y} y^{s-1}, \quad y>0 $$ where \(C\) is a constant that does not depend on \(y\). Suppose also that the conditional distribution of \(X\) given that \(Y=y\) is Poisson with mean \(y\). That is, $$ P\{X=i \mid Y=y\}=e^{-y} y^{i} / i!, \quad i \geqslant 0 $$ Show that the conditional distribution of \(Y\) given that \(X=i\) is the gamma distribution with parameters \((s+i, \alpha+1)\).

Find the expected number of flips of a coin, which comes up heads with probability \(p\), that are necessary to obtain the pattern \(h, t, h, h, t, h, t, h\).

A coin, having probability \(p\) of landing heads, is continually flipped until at least one head and one tail have been flipped. (a) Find the expected number of flips needed. (b) Find the expected number of flips that land on heads. (c) Find the expected number of flips that land on tails. (d) Repeat part (a) in the case where flipping is continued until a total of at least two heads and one tail have been flipped.

An individual traveling on the real line is trying to reach the origin. However, the larger the desired step, the greater is the variance in the result of that step. Specifically, whenever the person is at location \(x\), he next moves to a location having mean 0 and variance \(\beta x^{2}\). Let \(X_{n}\) denote the position of the individual after having taken \(n\) steps. Supposing that \(X_{0}=x_{0}\), find (a) \(E\left[X_{n}\right]\); (b) \(\operatorname{Var}\left(X_{n}\right)\)

Data indicate that the number of traffic accidents in Berkeley on a rainy day is a Poisson random variable with mean 9, whereas on a dry day it is a Poisson random variable with mean 3. Let \(X\) denote the number of traffic accidents tomorrow. If it will rain tomorrow with probability \(0.6\), find (a) \(E[X]\); (b) \(P\{X=0\}\); (c) \(\operatorname{Var}(X)\).
