
Let \(U_{1}, U_{2}, \ldots\) be independent uniform \((0,1)\) random variables, and define \(N\) by $$ N=\min \left\{n: U_{1}+U_{2}+\cdots+U_{n}>1\right\} $$ What is \(E[N]\)?

Short Answer

The expected value of \(N\) is \(e \approx 2.718\). The key fact is that \(P(N>n)=P\left(U_{1}+U_{2}+\cdots+U_{n} \leq 1\right)=1 / n !\), so summing the tail probabilities gives \(E[N]=\sum_{n=0}^{\infty} 1 / n !=e\). In other words, on average just under three uniform \((0,1)\) random variables are needed before their sum exceeds 1.

Step by step solution

01

Tail Probability of N

Rather than attacking \(P(N=n)\) directly, it is easier to work with the tail probabilities of \(N\). The event \(\{N>n\}\) occurs exactly when the first \(n\) uniform random variables have not yet pushed the sum past 1: $$ P(N>n)=P\left(U_{1}+U_{2}+\cdots+U_{n} \leq 1\right) $$ Once \(P(N>n)\) is known for every \(n\), both the distribution of \(N\), via \(P(N=n)=P(N>n-1)-P(N>n)\), and its expected value follow.
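As a quick sanity check of this identity for small \(n\): \(P(N>1)=P\left(U_{1} \leq 1\right)=1\), since a single uniform \((0,1)\) variable never exceeds 1 on its own, and \(P(N>2)=P\left(U_{1}+U_{2} \leq 1\right)=\frac{1}{2}\), the area of the triangle \(\left\{u_{1}, u_{2}>0,\ u_{1}+u_{2} \leq 1\right\}\) inside the unit square. Consequently \(P(N=1)=0\) and \(P(N=2)=1-\frac{1}{2}=\frac{1}{2}\).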
02

Joint PDF

To compute \(P\left(U_{1}+U_{2}+\cdots+U_{n} \leq 1\right)\), we use the joint probability density function (PDF) of the uniform random variables \(U_{1}, U_{2}, \ldots, U_{n}\). Since these variables are independent and identically distributed, their joint PDF is the product of their individual PDFs: $$ f_{U_{1}, U_{2}, \ldots, U_{n}}\left(u_{1}, u_{2}, \ldots, u_{n}\right)=\prod_{i=1}^{n} f_{U_{i}}\left(u_{i}\right) $$ Each \(U_{i}\) has PDF equal to 1 on the interval \((0,1)\), so the joint PDF equals 1 on the \(n\)-dimensional unit cube \(0<u_{i}<1\). Therefore \(P\left(U_{1}+U_{2}+\cdots+U_{n} \leq 1\right)\) is simply the volume of the simplex \(\left\{u_{i}>0,\ u_{1}+u_{2}+\cdots+u_{n} \leq 1\right\}\), which lies entirely inside that cube.
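Written out explicitly, this volume is the iterated integral that Step 3 evaluates (the ordering of the integration variables is just one convenient choice): $$ P\left(U_{1}+\cdots+U_{n} \leq 1\right)=\int_{0}^{1} \int_{0}^{1-u_{1}} \cdots \int_{0}^{1-u_{1}-\cdots-u_{n-1}} d u_{n} \cdots d u_{2}\, d u_{1} $$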
03

Probability that the Sum Is at Most 1

Evaluating the integral over the simplex (or arguing by induction on \(n\), as sketched below) gives $$ P\left(U_{1}+U_{2}+\cdots+U_{n} \leq 1\right)=\frac{1}{n !} $$ Hence \(P(N>n)=1 / n !\) for every \(n \geq 0\), and the distribution of \(N\) follows: $$ P(N=n)=P(N>n-1)-P(N>n)=\frac{1}{(n-1) !}-\frac{1}{n !}=\frac{n-1}{n !}, \qquad n \geq 1 $$
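One way to justify the \(1 / n !\) formula is induction on \(n\), using the slightly stronger statement \(P\left(U_{1}+\cdots+U_{n} \leq t\right)=t^{n} / n !\) for \(0 \leq t \leq 1\). The case \(n=1\) is immediate, and conditioning on the value of \(U_{n}\) gives the induction step: $$ P\left(U_{1}+\cdots+U_{n} \leq t\right)=\int_{0}^{t} P\left(U_{1}+\cdots+U_{n-1} \leq t-u\right) d u=\int_{0}^{t} \frac{(t-u)^{n-1}}{(n-1) !} d u=\frac{t^{n}}{n !} $$ Setting \(t=1\) recovers \(P\left(U_{1}+\cdots+U_{n} \leq 1\right)=1 / n !\).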
04

Expected Value of N

Since \(N\) takes positive integer values, its expectation can be computed by summing the tail probabilities found in Step 3: $$ E[N]=\sum_{n=0}^{\infty} P(N>n)=\sum_{n=0}^{\infty} \frac{1}{n !}=e $$ The same answer follows from the definition \(E[N]=\sum_{n} n P(N=n)\): $$ \sum_{n=1}^{\infty} n \cdot \frac{n-1}{n !}=\sum_{n=2}^{\infty} \frac{1}{(n-2) !}=\sum_{k=0}^{\infty} \frac{1}{k !}=e $$ So \(E[N]=e \approx 2.718\): on average, just under three uniform \((0,1)\) random variables are needed before their sum first exceeds 1. This is a classical result, and although the answer may seem surprising at first, it is easy to confirm by simulation.
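For a numerical check, the sketch below runs a simple Monte Carlo simulation of \(N\); this code is a quick illustration rather than part of the textbook solution, and the function name sample_N and the trial count are arbitrary choices.

```python
import random

def sample_N():
    """Draw Uniform(0,1) values until their running sum exceeds 1; return how many were needed."""
    total = 0.0
    count = 0
    while total <= 1.0:
        total += random.random()  # one Uniform(0,1) draw
        count += 1
    return count

trials = 200_000
estimate = sum(sample_N() for _ in range(trials)) / trials
print(estimate)  # typically prints a value close to e = 2.71828...
```

With a couple hundred thousand trials the sample mean typically lands very close to \(e \approx 2.718\), in agreement with the exact answer.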
