Chapter 3: Problem 91
Find the expected number of flips of a coin, which comes up heads with probability \(p\), that are necessary to obtain the pattern \(h, t, h, h, t, h, t, h\).
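The answer can be checked numerically. Below is a minimal Monte Carlo sketch in Python (the function name and the choice \(p=0.6\) are illustrative, not from the text). The `exact` line uses the standard pattern-overlap argument: for this pattern the prefixes \(h\) and \(h,t,h\) and the full pattern are also suffixes, which gives \(E[N]=1/p+1/(p^{2}q)+1/(p^{5}q^{3})\) with \(q=1-p\).

```python
import random

def flips_until_pattern(p, pattern):
    """Flip a p-coin until `pattern` (a string of 'h'/'t') first appears."""
    window, n = "", 0
    while True:
        n += 1
        window = (window + ("h" if random.random() < p else "t"))[-len(pattern):]
        if window == pattern:
            return n

p, q = 0.6, 0.4                                  # illustrative value of p
exact = 1/p + 1/(p**2 * q) + 1/(p**5 * q**3)     # overlap formula for hthhthth
trials = 50_000
est = sum(flips_until_pattern(p, "hthhthth") for _ in range(trials)) / trials
print(f"simulated {est:.1f} vs exact {exact:.1f}")
```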
If \(R_{i}\) denotes the random amount that is earned in period \(i\), then \(\sum_{i=1}^{\infty} \beta^{i-1} R_{i}\), where \(0<\beta<1\) is a specified constant, is called the total discounted reward with discount factor \(\beta\). Let \(T\) be a geometric random variable with parameter \(1-\beta\) that is independent of the \(R_{i}\). Show that the expected total discounted reward equals the expected total (undiscounted) reward earned by time \(T\); that is, show that $$ E\left[\sum_{i=1}^{\infty} \beta^{i-1} R_{i}\right]=E\left[\sum_{i=1}^{T} R_{i}\right] $$
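A hedged sketch of why the identity holds (assuming the interchange of expectation and infinite sum is justified, e.g. for nonnegative or suitably bounded rewards): since \(T\) is geometric with parameter \(1-\beta\), \(P\{T \geqslant i\}=\beta^{i-1}\), and since \(T\) is independent of the \(R_{i}\), $$ E\left[\sum_{i=1}^{T} R_{i}\right]=E\left[\sum_{i=1}^{\infty} R_{i} \mathbf{1}\{T \geqslant i\}\right]=\sum_{i=1}^{\infty} E\left[R_{i}\right] P\{T \geqslant i\}=\sum_{i=1}^{\infty} \beta^{i-1} E\left[R_{i}\right]. $$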
Let \(X_{i}, i \geqslant 0\), be independent and identically distributed random variables with probability mass function $$ p(j)=P\left\{X_{i}=j\right\}, \quad j=1, \ldots, m, \qquad \sum_{j=1}^{m} p(j)=1. $$ Find \(E[N]\), where \(N=\min \left\{n>0: X_{n}=X_{0}\right\}\).
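A quick sanity check, as a Python sketch with a hypothetical pmf: conditioning on \(X_{0}=j\) makes \(N\) geometric with success probability \(p(j)\), so \(E[N \mid X_{0}=j]=1/p(j)\) and hence \(E[N]=\sum_{j} p(j) \cdot 1/p(j)=m\).

```python
import random

def sample_N(pmf):
    """Draw X_0 from pmf over {1,...,m}, then count draws until X_n == X_0."""
    vals = range(1, len(pmf) + 1)
    x0 = random.choices(vals, weights=pmf)[0]
    n = 0
    while True:
        n += 1
        if random.choices(vals, weights=pmf)[0] == x0:
            return n

pmf = [0.5, 0.3, 0.2]   # hypothetical p(1), p(2), p(3); here m = 3
trials = 100_000
print(sum(sample_N(pmf) for _ in range(trials)) / trials)   # close to m = 3
```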
Let \(X_{1}, \ldots, X_{n}\) be independent random variables having a common distribution function that is specified up to an unknown parameter \(\theta\). Let \(T=T(\mathbf{X})\) be a function of the data \(\mathbf{X}=\left(X_{1}, \ldots, X_{n}\right)\). If the conditional distribution of \(X_{1}, \ldots, X_{n}\) given \(T(\mathbf{X})\) does not depend on \(\theta\), then \(T(\mathbf{X})\) is said to be a sufficient statistic for \(\theta\). In the following cases, show that \(T(\mathbf{X})=\sum_{i=1}^{n} X_{i}\) is a sufficient statistic for \(\theta\). (a) The \(X_{i}\) are normal with mean \(\theta\) and variance \(1\). (b) The density of \(X_{i}\) is \(f(x)=\theta e^{-\theta x},\ x>0\). (c) The mass function of \(X_{i}\) is \(p(x)=\theta^{x}(1-\theta)^{1-x},\ x=0,1,\ 0<\theta<1\). (d) The \(X_{i}\) are Poisson random variables with mean \(\theta\).
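As a hedged illustration for case (a) only (one standard route is the factorization criterion; the problem itself asks about the conditional distribution, so treat this as a sketch): the joint density of the \(X_{i}\) is $$ f_{\theta}\left(x_{1}, \ldots, x_{n}\right)=(2 \pi)^{-n / 2} \exp \left(-\frac{1}{2} \sum_{i=1}^{n} x_{i}^{2}\right) \exp \left(\theta \sum_{i=1}^{n} x_{i}-\frac{n \theta^{2}}{2}\right), $$ which involves \(\theta\) only through \(t=\sum_{i} x_{i}\), so the conditional distribution of the data given \(T(\mathbf{X})=t\) is free of \(\theta\).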
You are invited to a party. Suppose that the times at which invitees arrive are independent uniform \((0,1)\) random variables. Suppose that, aside from yourself, the number of other people who are invited is a Poisson random variable with mean \(10\). (a) Find the expected number of people who arrive before you. (b) Find the probability that you are the \(n\)th person to arrive.
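A minimal simulation sketch in Python (it assumes, as the problem seems to intend, that your own arrival time is also uniform \((0,1)\) and independent of everything else). By symmetry each of the Poisson-many guests independently precedes you with probability \(1/2\), so the simulated mean should be near \(10 \cdot \frac{1}{2}=5\), matching part (a).

```python
import math, random

def poisson(lam):
    """Knuth's product-of-uniforms Poisson sampler (fine for small means)."""
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

def guests_before_me(mean=10):
    my_time = random.random()   # assumed uniform(0,1) arrival time for you
    return sum(random.random() < my_time for _ in range(poisson(mean)))

trials = 100_000
print(sum(guests_before_me() for _ in range(trials)) / trials)   # close to 5
```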
Suppose \(p(x, y, z)\), the joint probability mass function of the random variables \(X\), \(Y\), and \(Z\), is given by $$ \begin{array}{ll} p(1,1,1)=\frac{1}{8}, & p(2,1,1)=\frac{1}{4} \\ p(1,1,2)=\frac{1}{8}, & p(2,1,2)=\frac{3}{16} \\ p(1,2,1)=\frac{1}{16}, & p(2,2,1)=0 \\ p(1,2,2)=0, & p(2,2,2)=\frac{1}{4} \end{array} $$ What is \(E[X \mid Y=2]\)? What is \(E[X \mid Y=2, Z=1]\)?
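Both conditional expectations can be read off the table; a small exact-arithmetic check (a Python sketch, helper name hypothetical) gives \(E[X \mid Y=2]=9/5\) and \(E[X \mid Y=2, Z=1]=1\).

```python
from fractions import Fraction as F

# joint pmf p(x, y, z) from the problem
p = {
    (1, 1, 1): F(1, 8),  (2, 1, 1): F(1, 4),
    (1, 1, 2): F(1, 8),  (2, 1, 2): F(3, 16),
    (1, 2, 1): F(1, 16), (2, 2, 1): F(0),
    (1, 2, 2): F(0),     (2, 2, 2): F(1, 4),
}

def cond_exp_X(event):
    """E[X | event], where event is a predicate on (x, y, z)."""
    mass = sum(pr for k, pr in p.items() if event(*k))
    return sum(k[0] * pr for k, pr in p.items() if event(*k)) / mass

print(cond_exp_X(lambda x, y, z: y == 2))             # 9/5
print(cond_exp_X(lambda x, y, z: y == 2 and z == 1))  # 1
```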