Problem 16
A professor decides that the method proposed for computing the maximum list size is much too complicated. He proposes the following solution: If we let \(X_{i}\) be the size of list \(i\), then what we want to compute is \(E\left(\max _{i}\left(X_{i}\right)\right)\). This means $$ E\left(\max _{i}\left(X_{i}\right)\right)=\max _{i}\left(E\left(X_{i}\right)\right)=\max _{i}(1)=1 . $$ What is the flaw in his solution?
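If it helps to see concretely that the two quantities in question can disagree, here is a small Python sketch (an illustration only; the function names are ad hoc, not from the text). It hashes \(n\) keys uniformly at random into \(n\) lists, so each \(E(X_i)=1\), and estimates \(E\left(\max_i X_i\right)\) by simulation.

```python
import random

# Simulation sketch: hash n keys uniformly at random into n lists, so that
# E(X_i) = 1 for every list i, then estimate E(max_i X_i) by averaging over
# many trials.  Function names here are ad hoc, not from the text.
def sample_max_list_size(n):
    counts = [0] * n
    for _ in range(n):
        counts[random.randrange(n)] += 1
    return max(counts)

def estimate_expected_max(n, trials=10000):
    return sum(sample_max_list_size(n) for _ in range(trials)) / trials

if __name__ == "__main__":
    for n in (10, 100):
        print(n, estimate_expected_max(n))  # noticeably larger than 1
```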
Problem 17
Two nickels, two dimes, and two quarters are in a cup. You draw three coins, one after the other, without replacement. What is the expected amount of money you draw on the first draw? On the second draw? What is the expected value of the total amount of money you draw? Does this expected value change if you draw the three coins all at once?
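A quick way to check an answer here is to simulate the draws. The Python sketch below (values in cents; the trial count is an arbitrary choice) estimates the expected value of each draw and of the total.

```python
import random

# Simulation sketch for the coin-drawing problem, with coin values in cents.
CUP = [5, 5, 10, 10, 25, 25]

def simulate(trials=100000):
    totals = [0.0, 0.0, 0.0]           # running sums for draws 1, 2, 3
    for _ in range(trials):
        draw = random.sample(CUP, 3)   # three coins, without replacement
        for i, coin in enumerate(draw):
            totals[i] += coin
    per_draw = [t / trials for t in totals]
    return per_draw, sum(per_draw)

if __name__ == "__main__":
    per_draw, total = simulate()
    print("estimated expected value of each draw (cents):", per_draw)
    print("estimated expected total (cents):", total)
```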
Problem 18
Evaluate the sum $$ \sum_{i=0}^{10} i\binom{10}{i}(.9)^{i}(.1)^{10-i}, $$ which arose in computing the expected number of right answers a person would have on a 10-question test with probability \(.9\) of answering each question correctly. First, use the binomial theorem and calculus to show that $$ 10(.1+x)^{9}=\sum_{i=0}^{10} i\binom{10}{i}(.1)^{10-i} x^{i-1}. $$ Substituting \(x=.9\) almost gives the sum you want on the right side of the equation, except that in every term of the sum, the power on \(.9\) is one too small. Use some simple algebra to fix this and then explain why the expected number of right answers is 9.
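The sum itself is easy to check numerically; the short Python snippet below (a sanity check only, using the standard-library `math.comb`) evaluates it directly.

```python
from math import comb

# Directly evaluate sum_{i=0}^{10} i * C(10, i) * 0.9**i * 0.1**(10 - i).
total = sum(i * comb(10, i) * 0.9**i * 0.1**(10 - i) for i in range(11))
print(total)  # prints 9.0 up to floating-point roundoff
```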
Problem 19
Prove as tight upper and lower bounds as you can for \(\sum_{i=1}^{k}(1 / i)\). For this purpose, it is useful to remember the definition of the natural logarithm as an integral involving \(1 / x\) and to draw rectangles and other geometric figures above and below the curve.
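As a numerical preview (not a proof), the short Python loop below tabulates the harmonic sum next to \(\ln k\) for a few values of \(k\); the integral definition of the logarithm is what turns this observed behavior into proved bounds.

```python
from math import log

# Tabulate H_k = sum_{i=1}^{k} 1/i next to ln(k) for a few values of k.
# This only previews the growth rate; the problem asks for proved bounds.
for k in (10, 100, 1000, 10**6):
    h = sum(1 / i for i in range(1, k + 1))
    print(f"k={k:>8}  H_k={h:.4f}  ln(k)={log(k):.4f}  difference={h - log(k):.4f}")
```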
Problem 20
Give an example of two random variables \(X\) and \(Y\) such that \(E(X Y) \neq E(X) E(Y)\). Here \(X Y\) is the random variable with \((X Y)(s)=X(s) Y(s)\).
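The following Python harness (a checker only, not an answer) computes \(E(XY)\) and \(E(X)E(Y)\) for whatever finite example you construct; the sample space, probabilities, and values shown are placeholders.

```python
# Checker sketch: given outcome probabilities p and two random variables X and Y
# listed by their values on those outcomes, compare E(XY) with E(X)E(Y).
def expectation(values, p):
    return sum(v * q for v, q in zip(values, p))

def compare(p, X, Y):
    e_xy = expectation([x * y for x, y in zip(X, Y)], p)
    e_x_e_y = expectation(X, p) * expectation(Y, p)
    return e_xy, e_x_e_y

if __name__ == "__main__":
    # Placeholder example on a two-outcome sample space; replace X and Y with
    # your own construction.  Here X is constant, so the two numbers agree;
    # the exercise asks for a case where they do not.
    p = [0.5, 0.5]
    X = [1, 1]
    Y = [0, 1]
    print(compare(p, X, Y))
```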
Problem 21
Use calculus and the sum of a geometric series to show that if \(-1<x<1\), then $$ \sum_{j=0}^{\infty} j x^{j}=\frac{x}{(1-x)^{2}}, $$ as in Equation 5.29.
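A numerical check (no substitute for the calculus argument) is easy: compare long partial sums of \(\sum_j j x^j\) with \(x/(1-x)^2\) for a few values of \(x\), as in the Python sketch below; the cutoff of 2000 terms is an arbitrary choice.

```python
# Compare a long partial sum of j * x**j with the closed form x / (1 - x)**2.
def partial_sum(x, terms=2000):
    return sum(j * x**j for j in range(terms))

for x in (-0.5, 0.1, 0.9):
    print(x, partial_sum(x), x / (1 - x) ** 2)
```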
Problem 22
Another way to bound the deviation from the expectation is known as Markov's inequality, which says that if \(X\) is a random variable taking only nonnegative values, then $$ P(X>k E(X)) \leq \frac{1}{k} $$ for any \(k \geq 1\). Prove this inequality.
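Before proving the inequality, it can be reassuring to see it hold empirically. The Python sketch below (an illustration only) uses an exponential random variable with mean 1 as an assumed stand-in for \(X\), so \(E(X)=1\), and estimates the tail probabilities.

```python
import random

# Empirical illustration of Markov's inequality for a nonnegative X.
# Here X is exponential with mean 1 (an arbitrary choice for this check).
def estimate_tail(k, trials=200000):
    exceed = sum(1 for _ in range(trials) if random.expovariate(1.0) > k * 1.0)
    return exceed / trials

for k in (1, 2, 5, 10):
    print(f"k={k}:  estimated P(X > k E(X)) = {estimate_tail(k):.4f}   1/k = {1/k:.4f}")
```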
Problem 23
Give an example of a random variable on the sample space \(\{S, F S, F F S, \ldots, F^{i} S, \ldots\}\) with an infinite expected value, using a geometric distribution for the probabilities of the outcomes \(F^{i} S\).
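The Python sketch below is a scratch tool, not a solution: with \(P(F^{i}S) = p(1-p)^{i}\), it computes partial sums of \(\sum_i X(F^{i}S)\, p(1-p)^{i}\) for a candidate \(X\) that you supply. The placeholder choice shown has a finite expectation, so the exercise is to replace it.

```python
# Partial-sum explorer: with P(F^i S) = p * (1 - p)**i, the expected value of X is
#     sum over i of  X(F^i S) * p * (1 - p)**i.
# Supply a candidate X (as a function of i, the number of F's before the S) and
# watch whether the partial sums settle down or keep growing.
def partial_expectation(X, p=0.5, terms=60):
    return sum(X(i) * p * (1 - p) ** i for i in range(terms))

if __name__ == "__main__":
    X = lambda i: i + 1   # placeholder: number of trials; this one has finite expectation
    for cutoff in (10, 20, 40, 60):
        print(cutoff, partial_expectation(X, terms=cutoff))
```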