Find the expected number of flips of a coin, which comes up heads with probability \(p\), that are necessary to obtain the pattern \(h, t, h, h, t, h, t, h\).

Short Answer

The expected number of flips needed to obtain the pattern \(h, t, h, h, t, h, t, h\) is \(\frac{1}{p}+\frac{1}{p^{2} q}+\frac{1}{p^{5} q^{3}}\), where \(p\) is the probability of flipping a head and \(q=1-p\) is the probability of flipping a tail.

Step by step solution

01

Identify how the pattern overlaps with itself

The eight flips of the pattern cannot be treated as independent geometric waiting times, because a partially completed attempt can double as the start of the next attempt. What matters is the set of self-overlaps of \(h, t, h, h, t, h, t, h\): the lengths \(k\) for which the last \(k\) flips of the pattern equal the first \(k\). Checking each \(k=1, \ldots, 7\) shows that the only proper overlaps are \(k=1\) (the single flip \(h\)) and \(k=3\) (the block \(h, t, h\)); for example, the suffix of length 3 is \(h, t, h\), which is exactly the prefix of length 3. A small script confirming the overlap lengths appears below.
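To verify the overlap lengths mechanically, here is a minimal Python sketch (encoding the pattern as the string "hthhthth", with heads as "h" and tails as "t", purely for illustration):

def overlaps(pattern):
    # Lengths k for which the suffix of length k equals the prefix of length k.
    return [k for k in range(1, len(pattern)) if pattern[-k:] == pattern[:k]]

print(overlaps("hthhthth"))  # prints [1, 3]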
02

Set up the recursion for pattern waiting times

Let \(T(B)\) denote the number of flips until the pattern \(B\) first appears, and let \(P(B)\) be the probability that a fixed block of flips equals \(B\). If \(A\) is the longest proper suffix of \(B\) that is also a prefix of \(B\), the standard renewal argument for patterns gives $$ E[T(B)]=E[T(A)]+\frac{1}{P(B)} $$ Intuitively, successive occurrences of \(B\) (counted with overlap) are on average \(1 / P(B)\) flips apart, and once the sequence ends with \(A\), the expected number of additional flips needed to complete \(B\) is exactly this mean recurrence time.
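Unrolling the recursion gives a convenient closed form: \(E[T]\) is the sum of \(1/P\) over every self-overlap length of the pattern \(x_{1}, \ldots, x_{n}\), including the full length \(k=n\): $$ E[T]=\sum_{k:\,\left(x_{1}, \ldots, x_{k}\right)=\left(x_{n-k+1}, \ldots, x_{n}\right)} \frac{1}{P\left(x_{1}, \ldots, x_{k}\right)} $$ For the present pattern the overlap lengths are \(k=1,3,8\), which previews the three terms in the final answer.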
03

Apply the recursion to the overlap \(h, t, h\)

The maximal proper overlap of the full pattern is \(A=h, t, h\), and the maximal proper overlap of \(h, t, h\) is in turn the single flip \(h\). The number of flips to obtain one head is geometric with mean \(1 / p\), and \(P(h, t, h)=p^{2} q\), so $$ E[T(h, t, h)]=E[T(h)]+\frac{1}{P(h, t, h)}=\frac{1}{p}+\frac{1}{p^{2} q} $$
04

Calculate the total expected number of flips

The full pattern contains five heads and three tails, so \(P(h, t, h, h, t, h, t, h)=p^{5} q^{3}\). Applying the recursion once more gives $$ E[T]=E[T(h, t, h)]+\frac{1}{p^{5} q^{3}}=\frac{1}{p}+\frac{1}{p^{2} q}+\frac{1}{p^{5} q^{3}} $$ As a check, for a fair coin \(\left(p=q=\frac{1}{2}\right)\) this evaluates to \(2+8+256=266\) expected flips. A simulation sketch for checking the formula numerically follows.
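The formula can also be sanity-checked by Monte Carlo simulation. The following is a minimal sketch using only Python's standard library; the function name, string encoding of the pattern, and trial count are illustrative choices, not part of the textbook solution:

import random

def flips_until_pattern(p, pattern="hthhthth"):
    # Flip a coin with P(heads) = p until the pattern appears; return the flip count.
    recent, count = "", 0
    while not recent.endswith(pattern):
        recent = (recent + ("h" if random.random() < p else "t"))[-len(pattern):]
        count += 1
    return count

p = 0.5
q = 1 - p
trials = 10_000
estimate = sum(flips_until_pattern(p) for _ in range(trials)) / trials
exact = 1 / p + 1 / (p**2 * q) + 1 / (p**5 * q**3)
print(estimate, exact)  # for p = 0.5, both should be close to 266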

Most popular questions from this chapter

If \(R_{i}\) denotes the random amount that is earned in period \(i\), then \(\sum_{i=1}^{\infty} \beta^{i-1} R_{i}\), where \(0<\beta<1\) is a specified constant, is called the total discounted reward with discount factor \(\beta\). Let \(T\) be a geometric random variable with parameter \(1-\beta\) that is independent of the \(R_{i}\). Show that the expected total discounted reward is equal to the expected total (undiscounted) reward earned by time \(T\). That is, show that $$ E\left[\sum_{i=1}^{\infty} \beta^{i-1} R_{i}\right]=E\left[\sum_{i=1}^{T} R_{i}\right] $$

Let \(X_{i}, i \geqslant 0\), be independent and identically distributed random variables with probability mass function $$ p(j)=P\left\{X_{i}=j\right\}, \quad j=1, \ldots, m, \quad \sum_{j=1}^{m} p(j)=1 $$ Find \(E[N]\), where \(N=\min \left\{n>0: X_{n}=X_{0}\right\}\).

Let \(X_{1}, \ldots, X_{n}\) be independent random variables having a common distribution function that is specified up to an unknown parameter \(\theta\). Let \(T=T(\mathbf{X})\) be a function of the data \(\mathbf{X}=\left(X_{1}, \ldots, X_{n}\right)\). If the conditional distribution of \(X_{1}, \ldots, X_{n}\) given \(T(\mathbf{X})\) does not depend on \(\theta\), then \(T(\mathbf{X})\) is said to be a sufficient statistic for \(\theta\). In the following cases, show that \(T(\mathbf{X})=\sum_{i=1}^{n} X_{i}\) is a sufficient statistic for \(\theta\). (a) The \(X_{i}\) are normal with mean \(\theta\) and variance \(1\). (b) The density of \(X_{i}\) is \(f(x)=\theta e^{-\theta x},\ x>0\). (c) The mass function of \(X_{i}\) is \(p(x)=\theta^{x}(1-\theta)^{1-x},\ x=0,1,\ 0<\theta<1\). (d) The \(X_{i}\) are Poisson random variables with mean \(\theta\).

You are invited to a party. Suppose that the times at which the invitees arrive are independent uniform \((0,1)\) random variables, and that, aside from yourself, the number of other people invited is a Poisson random variable with mean \(10\). (a) Find the expected number of people who arrive before you. (b) Find the probability that you are the \(n\)th person to arrive.

Suppose \(p(x, y, z)\), the joint probability mass function of the random variables \(X\), \(Y\), and \(Z\), is given by $$ \begin{array}{ll} p(1,1,1)=\frac{1}{8}, & p(2,1,1)=\frac{1}{4} \\ p(1,1,2)=\frac{1}{8}, & p(2,1,2)=\frac{3}{16} \\ p(1,2,1)=\frac{1}{16}, & p(2,2,1)=0 \\ p(1,2,2)=0, & p(2,2,2)=\frac{1}{4} \end{array} $$ What is \(E[X \mid Y=2]\)? What is \(E[X \mid Y=2, Z=1]\)?
