
Let \(N\) balls be thrown independently into \(n\) urns, each ball having probability \(1/n\) of falling into any particular urn. Let \(Z_{N, n}\) be the number of empty urns after completing these tosses, and let \(P_{N, n}(k)=\operatorname{Pr}\left(Z_{N, n}=k\right)\). Define \(\varphi_{N, n}(t)=\sum_{k=0}^{n} P_{N, n}(k) e^{i k t}\).

(a) Show that
$$ P_{N+1, n}(k)=\left(1-\frac{k}{n}\right) P_{N, n}(k)+\frac{k+1}{n} P_{N, n}(k+1), \quad \text { for } k=0,1, \ldots, n. $$

(b) Show that
$$ P_{N, n}(k)=\left(1-\frac{1}{n}\right)^{N} P_{N, n-1}(k-1)+\sum_{i=1}^{N}\binom{N}{i} \frac{1}{n^{i}}\left(1-\frac{1}{n}\right)^{N-i} P_{N-i,\, n-1}(k). $$

(c) Define \(G_{n}(t, z)=\sum_{N=0}^{\infty} \varphi_{N, n}(t) \frac{n^{N}}{N !} z^{N}\). Using part (b), show that \(G_{n}(t, z)=G_{n-1}(t, z)\left(e^{i t}+e^{z}-1\right)\), and conclude that
$$ G_{n}(t, z)=\left(e^{i t}+e^{z}-1\right)^{n}, \quad n=0,1,2, \ldots $$

Short Answer

In summary, we derived the following results.

(a) Recurrence relation for \(P_{N+1,n}(k)\):
$$ P_{N+1,n}(k)=\left(1-\frac{k}{n}\right) P_{N,n}(k)+\frac{k+1}{n} P_{N,n}(k+1), \quad k=0,1,\ldots,n. $$

(b) Decomposition of \(P_{N, n}(k)\):
$$ P_{N, n}(k)=\left(1-\frac{1}{n}\right)^{N} P_{N, n-1}(k-1)+\sum_{i=1}^{N}\binom{N}{i} \frac{1}{n^{i}}\left(1-\frac{1}{n}\right)^{N-i} P_{N-i,\, n-1}(k). $$

(c) Generating function:
$$ G_{n}(t, z)=\left(e^{i t}+e^{z}-1\right)^{n}, \quad n=0,1,2,\ldots $$

Step by step solution

01

Part (a) Recurrence Relation for \(P_{N+1,n}(k)\)

- Consider the configuration after \(N\) balls have been thrown and then throw the \((N+1)\)st ball, conditioning on the number of empty urns at that moment.
- To end with \(k\) empty urns after \(N+1\) balls, one of two mutually exclusive events must occur: either there were already \(k\) empty urns and the new ball lands in one of the \(n-k\) occupied urns, which happens with probability \(\frac{n-k}{n}=1-\frac{k}{n}\); or there were \(k+1\) empty urns and the new ball lands in one of those \(k+1\) empty urns, which happens with probability \(\frac{k+1}{n}\) and reduces the number of empty urns to \(k\).
- Combining the two cases expresses \(P_{N+1, n}(k)\) in terms of the previous configuration \(P_{N, n}\):
$$ P_{N+1,n}(k)=\left(1-\frac{k}{n}\right) P_{N,n}(k)+\frac{k+1}{n} P_{N,n}(k+1), \quad k=0,1,\ldots,n. $$
- A quick numerical sanity check of this recurrence is sketched below.
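The sketch below is illustrative and not part of the original solution; the helper names `P_recurrence` and `P_bruteforce` are my own. It builds the distribution of empty urns from the recurrence of part (a) and compares it with brute-force enumeration of all \(n^{N}\) equally likely placements for a few small cases.

```python
from itertools import product
from fractions import Fraction

def P_recurrence(N, n):
    """Return [P_{N,n}(0), ..., P_{N,n}(n)] built from the recurrence in part (a)."""
    # With no balls thrown, all n urns are empty: P_{0,n}(n) = 1.
    P = [Fraction(0)] * (n + 1)
    P[n] = Fraction(1)
    for _ in range(N):
        new = [Fraction(0)] * (n + 1)
        for k in range(n + 1):
            new[k] = (1 - Fraction(k, n)) * P[k]
            if k + 1 <= n:
                new[k] += Fraction(k + 1, n) * P[k + 1]
        P = new
    return P

def P_bruteforce(N, n):
    """Exact distribution of the number of empty urns by enumerating all n**N placements."""
    counts = [0] * (n + 1)
    for placement in product(range(n), repeat=N):
        counts[n - len(set(placement))] += 1
    return [Fraction(c, n ** N) for c in counts]

for N, n in [(3, 2), (4, 3), (5, 3)]:
    assert P_recurrence(N, n) == P_bruteforce(N, n)
print("Recurrence (a) matches brute-force enumeration.")
```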
02

Part (b) Expression for \(P_{N, n}(k)\)

- We need to express \(P_{N, n}(k)\) in terms of the probabilities \(P_{\cdot, n-1}(\cdot)\) for one fewer urn. Fix one particular urn, say urn \(n\), and condition on the number \(i\) of balls that fall into it. Since each ball independently lands in urn \(n\) with probability \(1/n\), $$ \operatorname{Pr}(i \text{ balls in urn } n)=\binom{N}{i} \frac{1}{n^{i}}\left(1-\frac{1}{n}\right)^{N-i}. $$
- Given that \(i\) balls land in urn \(n\), the remaining \(N-i\) balls are independently and uniformly distributed over the other \(n-1\) urns, so the number of empty urns among those \(n-1\) urns is distributed like \(Z_{N-i,\, n-1}\).
- If \(i=0\), which has probability \(\left(1-\frac{1}{n}\right)^{N}\), urn \(n\) itself is empty, so there are \(k\) empty urns in total exactly when \(k-1\) of the other \(n-1\) urns are empty; this case contributes \(\left(1-\frac{1}{n}\right)^{N} P_{N, n-1}(k-1)\).
- If \(i \geq 1\), urn \(n\) is occupied, so all \(k\) empty urns must come from the other \(n-1\) urns; each such \(i\) contributes \(\binom{N}{i} \frac{1}{n^{i}}\left(1-\frac{1}{n}\right)^{N-i} P_{N-i,\, n-1}(k)\).
- Summing over \(i = 1, 2, \ldots, N\) and adding the \(i=0\) case gives the complete decomposition (verified numerically in the sketch below):
$$ P_{N, n}(k)=\left(1-\frac{1}{n}\right)^{N} P_{N, n-1}(k-1)+\sum_{i=1}^{N}\binom{N}{i} \frac{1}{n^{i}}\left(1-\frac{1}{n}\right)^{N-i} P_{N-i,\, n-1}(k). $$
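As a sanity check on this decomposition, here is a self-contained sketch (illustrative only; the helper names `P` and `rhs_part_b` are hypothetical) that computes \(P_{N,n}(k)\) by brute-force enumeration and verifies the identity of part (b) for a few small cases.

```python
from itertools import product
from fractions import Fraction
from math import comb

def P(N, n, k):
    """Exact P_{N,n}(k) by enumerating all n**N placements (0 outside 0 <= k <= n)."""
    if not 0 <= k <= n:
        return Fraction(0)
    hits = sum(1 for placement in product(range(n), repeat=N)
               if n - len(set(placement)) == k)
    return Fraction(hits, n ** N)

def rhs_part_b(N, n, k):
    """Right-hand side of the decomposition in part (b), conditioning on urn n."""
    q = Fraction(n - 1, n)                       # probability a given ball misses urn n
    total = q ** N * P(N, n - 1, k - 1)          # i = 0: urn n stays empty
    for i in range(1, N + 1):                    # i >= 1: urn n is occupied
        total += comb(N, i) * Fraction(1, n) ** i * q ** (N - i) * P(N - i, n - 1, k)
    return total

for N, n in [(4, 3), (5, 3)]:
    assert all(P(N, n, k) == rhs_part_b(N, n, k) for k in range(n + 1))
print("Identity (b) verified for the test cases.")
```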
03

Part (c) Generating Function \(G_{n}(t, z)\)

- We are given the generating function definition: $$ G_{n}(t, z)=\sum_{N=0}^{\infty} \varphi_{N,n}(t) \frac{n^{N}}{N !} z^{N}. $$
- Multiply the identity of part (b) by \(e^{ikt}\) and sum over \(k\) to obtain $$ \varphi_{N,n}(t)=\left(1-\frac{1}{n}\right)^{N} e^{it}\,\varphi_{N,n-1}(t)+\sum_{i=1}^{N}\binom{N}{i}\frac{1}{n^{i}}\left(1-\frac{1}{n}\right)^{N-i}\varphi_{N-i,\,n-1}(t). $$
- Insert this into the definition of \(G_{n}\). The first term gives \(\sum_{N}\frac{[(n-1)z]^{N}}{N!}e^{it}\varphi_{N,n-1}(t)=e^{it}G_{n-1}(t,z)\), while in the second term the substitution \(m=N-i\) factors the double sum as \(\left(\sum_{i\geq 1}\frac{z^{i}}{i!}\right)\left(\sum_{m\geq 0}\frac{[(n-1)z]^{m}}{m!}\varphi_{m,n-1}(t)\right)=\left(e^{z}-1\right)G_{n-1}(t,z)\).
- Hence $$ G_{n}(t, z) = G_{n-1}(t, z)\left(e^{it}+e^{z}-1\right). $$
- Since \(G_{0}(t,z)=\varphi_{0,0}(t)=1\), iterating this relation \(n\) times yields $$ G_{n}(t, z)=\left(e^{i t}+e^{z}-1\right)^{n}, \quad n=0,1,2,\ldots $$
- A numerical check of this closed form is sketched below.
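To check the conclusion numerically, expand \(\left(e^{it}+e^{z}-1\right)^{n}\) by the binomial theorem and match the coefficient of \(e^{ikt}z^{N}\) with \(\frac{n^{N}}{N!}P_{N,n}(k)\) from the definition of \(G_{n}\); using \([z^{N}](e^{z}-1)^{m}=\frac{1}{N!}\sum_{j=0}^{m}\binom{m}{j}(-1)^{m-j}j^{N}\), this gives the classical formula \(P_{N,n}(k)=\binom{n}{k}n^{-N}\sum_{j=0}^{n-k}\binom{n-k}{j}(-1)^{n-k-j}j^{N}\), where the inner sum counts surjections of the \(N\) balls onto the \(n-k\) occupied urns. The sketch below (illustrative, not from the text) compares this closed form with brute-force enumeration.

```python
from itertools import product
from fractions import Fraction
from math import comb

def P_bruteforce(N, n, k):
    """Exact P_{N,n}(k) by enumerating all n**N equally likely placements."""
    hits = sum(1 for placement in product(range(n), repeat=N)
               if n - len(set(placement)) == k)
    return Fraction(hits, n ** N)

def P_from_generating_function(N, n, k):
    """P_{N,n}(k) read off from the coefficients of (e^{it} + e^z - 1)^n."""
    m = n - k
    surjections = sum(comb(m, j) * (-1) ** (m - j) * j ** N for j in range(m + 1))
    return comb(n, k) * Fraction(surjections, n ** N)

for N, n in [(4, 3), (6, 4)]:
    assert all(P_bruteforce(N, n, k) == P_from_generating_function(N, n, k)
               for k in range(n + 1))
print("Coefficients of (e^{it} + e^z - 1)^n reproduce the empty-urn distribution.")
```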


