
A coin, having probability \(p\) of landing heads, is continually flipped until at least one head and one tail have been flipped. (a) Find the expected number of flips needed. (b) Find the expected number of flips that land on heads. (c) Find the expected number of flips that land on tails. (d) Repeat part (a) in the case where flipping is continued until a total of at least two heads and one tail have been flipped.

Short Answer

The short answers to each part of the question are: (a) The expected number of flips until at least one head and one tail have appeared is \(E[N] = 1 + \frac{p}{1-p} + \frac{1-p}{p}\). (b) The expected number of flips that land heads is \(\frac{p}{1-p} + (1-p)\). (c) The expected number of flips that land tails is \(\frac{1-p}{p} + p\). (d) The expected number of flips until at least two heads and one tail have appeared is \((1-p)\left(1 + \frac{2}{p}\right) + p\left(1 + E[N]\right) = 1 + \frac{2(1-p)}{p} + p\,E[N]\), with \(E[N]\) as in part (a).

Step by step solution

01

(a) Expected number of flips needed

Let \(N\) denote the number of flips needed, and condition on the outcome of the first flip.

If the first flip is heads (probability \(p\)), flipping continues until the first tail appears. The number of additional flips is geometric with success probability \(1-p\), so its expectation is \(\frac{1}{1-p}\). Likewise, if the first flip is tails (probability \(1-p\)), we wait for the first head, which takes \(\frac{1}{p}\) additional flips on average.

Therefore \(E[N] = p\left(1 + \frac{1}{1-p}\right) + (1-p)\left(1 + \frac{1}{p}\right) = 1 + \frac{p}{1-p} + \frac{1-p}{p}\).

For a fair coin (\(p = \tfrac{1}{2}\)) this gives \(E[N] = 1 + 1 + 1 = 3\): one flip, then a geometric wait with mean 2 for the opposite face.
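The conditioning argument is easy to verify numerically. A minimal Monte Carlo sketch in Python, checking \(E[N] = 1 + \frac{p}{1-p} + \frac{1-p}{p}\) (the seed, \(p\), and trial count are arbitrary illustrative choices):

```python
import random

def flips_until_head_and_tail(p, rng):
    """Flip a p-coin until at least one head and one tail have appeared;
    return the number of flips used."""
    heads = tails = flips = 0
    while heads == 0 or tails == 0:
        flips += 1
        if rng.random() < p:
            heads += 1
        else:
            tails += 1
    return flips

def expected_flips(p):
    """Closed form from conditioning on the first flip:
    E[N] = 1 + p/(1-p) + (1-p)/p."""
    return 1 + p / (1 - p) + (1 - p) / p

rng = random.Random(42)
p, trials = 0.3, 200_000
estimate = sum(flips_until_head_and_tail(p, rng) for _ in range(trials)) / trials
print(f"simulated: {estimate:.3f}, exact: {expected_flips(p):.3f}")
```

For \(p = 0.3\) both numbers come out near \(3.76\); for a fair coin the formula gives exactly 3.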
02

(b) Expected number of flips that land heads

Condition again on the first flip. If it is tails (probability \(1-p\)), every subsequent flip is a tail except the final head, so exactly one head is flipped. If it is heads (probability \(p\)), heads accumulate until the first tail appears, and the number of heads is geometric with mean \(\frac{1}{1-p}\). Hence Expected number of flips that land heads \(= p \cdot \frac{1}{1-p} + (1-p) \cdot 1 = \frac{p}{1-p} + (1-p)\). (Simply multiplying \(E[N]\) by \(p\) is not valid here, because the stopping time depends on the outcomes of the flips.)
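The head count can be checked by the same kind of simulation, tallying heads rather than total flips (function names, seed, and parameters are illustrative):

```python
import random

def heads_until_head_and_tail(p, rng):
    """Flip a p-coin until at least one head and one tail have appeared;
    return how many of those flips were heads."""
    heads = tails = 0
    while heads == 0 or tails == 0:
        if rng.random() < p:
            heads += 1
        else:
            tails += 1
    return heads

def expected_heads(p):
    """E[heads] = p * 1/(1-p) + (1-p) * 1."""
    return p / (1 - p) + (1 - p)

rng = random.Random(7)
p, trials = 0.3, 200_000
estimate = sum(heads_until_head_and_tail(p, rng) for _ in range(trials)) / trials
print(f"simulated: {estimate:.3f}, exact: {expected_heads(p):.3f}")
```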
03

(c) Expected number of flips that land tails

By the symmetric argument, with the roles of heads and tails exchanged: Expected number of flips that land tails \(= (1-p) \cdot \frac{1}{p} + p \cdot 1 = \frac{1-p}{p} + p\). As a consistency check, \(E[\text{heads}] + E[\text{tails}] = 1 + \frac{p}{1-p} + \frac{1-p}{p} = E[N]\), the answer to part (a).
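Parts (a)-(c) must be consistent, since the head and tail counts sum to the total number of flips. A quick deterministic check of this identity over several values of \(p\) (the grid of test values is an arbitrary choice):

```python
def expected_flips(p):
    # Part (a): E[N] = 1 + p/(1-p) + (1-p)/p
    return 1 + p / (1 - p) + (1 - p) / p

def expected_heads(p):
    # Part (b): E[heads] = p/(1-p) + (1-p)
    return p / (1 - p) + (1 - p)

def expected_tails(p):
    # Part (c): E[tails] = (1-p)/p + p
    return (1 - p) / p + p

for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    total = expected_heads(p) + expected_tails(p)
    assert abs(total - expected_flips(p)) < 1e-12
print("E[heads] + E[tails] = E[N] for all tested p")
```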
04

(d) Expected flips until two heads and one tail

Condition on the first flip. If it is tails (probability \(1-p\)), two heads are still needed; each head takes \(\frac{1}{p}\) flips on average, for an expected total of \(1 + \frac{2}{p}\). If it is heads (probability \(p\)), one more head and one tail are still needed, and the expected number of additional flips is exactly the part (a) quantity \(E[N]\), since we are again waiting for at least one head and one tail to appear. Hence Expected number of flips \(= (1-p)\left(1 + \frac{2}{p}\right) + p\left(1 + E[N]\right) = 1 + \frac{2(1-p)}{p} + p\left(1 + \frac{p}{1-p} + \frac{1-p}{p}\right)\). For \(p = \tfrac{1}{2}\) this gives \(1 + 2 + \tfrac{3}{2} = 4.5\) flips.
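The part (d) answer can be validated in the same simulation style, now stopping only once at least two heads and one tail have appeared (seed and trial count are arbitrary):

```python
import random

def flips_until_two_heads_one_tail(p, rng):
    """Flip a p-coin until at least two heads and one tail have appeared;
    return the number of flips used."""
    heads = tails = flips = 0
    while heads < 2 or tails < 1:
        flips += 1
        if rng.random() < p:
            heads += 1
        else:
            tails += 1
    return flips

def expected_flips_2h1t(p):
    """E = (1-p)(1 + 2/p) + p(1 + E[N]), with E[N] from part (a)."""
    e_n = 1 + p / (1 - p) + (1 - p) / p
    return (1 - p) * (1 + 2 / p) + p * (1 + e_n)

rng = random.Random(123)
p, trials = 0.5, 200_000
estimate = sum(flips_until_two_heads_one_tail(p, rng) for _ in range(trials)) / trials
print(f"simulated: {estimate:.3f}, exact: {expected_flips_2h1t(p):.3f}")
```

For \(p = \tfrac{1}{2}\) the closed form evaluates to 4.5, and the simulated mean lands very close to it.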


