
Using the central limit theorem for suitable Poisson random variables, prove that $$ \lim _{n \rightarrow \infty} e^{-n} \sum_{k=0}^{n} \frac{n^{k}}{k !}=\frac{1}{2} $$

Short Answer

Expert verified
In summary, given a sequence of i.i.d. Poisson random variables \(X_k\), each with parameter \(\lambda = 1\), we define \(Y_n = \sum_{k=1}^{n} X_k\). Applying the Central Limit Theorem to the standardized sum, we evaluate \(P(Y_n \leq n)\) and take the limit as \(n \rightarrow \infty\), obtaining \(\frac{1}{2}\). Since \(Y_n\) is Poisson with parameter \(n\), this proves the given expression \(\lim _{n \rightarrow \infty} e^{-n} \sum_{k=0}^{n} \frac{n^{k}}{k!}=\frac{1}{2}\).

Step by step solution

01

Define the Poisson Random Variables

Let \(X_1, X_2, \ldots\) be a sequence of i.i.d. Poisson random variables, each with parameter \(\lambda = 1\). The probability mass function of each \(X_k\) is $$ P(X_k = j) = \frac{e^{-1}}{j!}, \quad j = 0, 1, 2, \ldots $$ Now define the random variable \(Y_n\) as the sum of the first \(n\) of these random variables: $$ Y_n = \sum_{k=1}^{n} X_k $$ Since each \(X_k\) has mean and variance equal to \(\lambda = 1\), we have \(E[Y_n] = n\) and \(\operatorname{Var}(Y_n) = n\).
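As a quick sanity check (not part of the proof), the additivity of independent Poisson variables used here can be verified numerically: convolving two Poisson(1) PMFs reproduces the Poisson(2) PMF. A minimal Python sketch, using only the standard library:

```python
import math

def poisson_pmf(lam, j):
    """PMF of a Poisson(lam) random variable at the point j."""
    return math.exp(-lam) * lam**j / math.factorial(j)

def conv_pmf(k):
    """P(X1 + X2 = k) by convolution: sum_j P(X1 = j) * P(X2 = k - j)."""
    return sum(poisson_pmf(1, j) * poisson_pmf(1, k - j) for j in range(k + 1))

# The convolution of two Poisson(1) PMFs matches the Poisson(2) PMF.
for k in range(10):
    assert abs(conv_pmf(k) - poisson_pmf(2, k)) < 1e-12
```

By induction the same argument gives that \(Y_n\) is Poisson with parameter \(n\), the fact used in Step 4.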
02

Apply the Central Limit Theorem

By the Central Limit Theorem, as \(n \rightarrow \infty\), the standardized sum \((Y_n - E[Y_n])/\sqrt{\operatorname{Var}(Y_n)} = (Y_n - n)/\sqrt{n}\) converges in distribution to a standard normal random variable \(Z \sim N(0, 1)\). That is, $$ \lim_{n \rightarrow \infty} P\left(\frac{Y_n - n}{\sqrt{n}} \leq z \right) = \Phi(z) $$ where \(\Phi(z)\) is the cumulative distribution function of the standard normal distribution.
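The convergence can be seen concretely by comparing the exact Poisson(\(n\)) CDF at \(n + z\sqrt{n}\) with \(\Phi(z)\) for a moderate \(n\). A minimal sketch in Python (the function names are ours, not from the text); the PMF terms are computed in log space to avoid overflow of \(n^k/k!\):

```python
import math

def poisson_cdf(n, m):
    """P(Y_n <= m) for Y_n ~ Poisson(n), each term computed in log space."""
    return sum(math.exp(k * math.log(n) - math.lgamma(k + 1) - n)
               for k in range(m + 1))

def std_normal_cdf(z):
    """Phi(z), the standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, z = 400, 1.0
exact = poisson_cdf(n, int(n + z * math.sqrt(n)))  # P((Y_n - n)/sqrt(n) <= z)
limit = std_normal_cdf(z)                          # Phi(1), about 0.8413
print(exact, limit)  # the two values agree to about two decimal places
```

For \(n = 400\) the exact probability already sits within a couple of hundredths of \(\Phi(1)\), consistent with the CLT.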
03

Calculate the Probability

Now, let's calculate the probability of interest, \(P(Y_n \leq n)\). Since \(E[Y_n] = n\), standardizing gives: $$ P(Y_n \leq n) = P\left( \frac{Y_n - n}{\sqrt{n}} \leq 0 \right) $$ Applying the Central Limit Theorem with \(z = 0\): $$ \lim_{n \rightarrow \infty} P(Y_n \leq n) = \lim_{n \rightarrow \infty} P\left( \frac{Y_n - n}{\sqrt{n}} \leq 0 \right) = \Phi(0) $$ By the symmetry of the standard normal distribution about zero, \(\Phi(0) = \frac{1}{2}\), so: $$ \lim_{n \rightarrow \infty} P(Y_n \leq n) = \frac{1}{2} $$
04

Establish the Given Expression

Finally, we connect this limit to the given expression: $$ \lim _{n \rightarrow \infty} e^{-n} \sum_{k=0}^{n} \frac{n^{k}}{k!}=\frac{1}{2} $$ Recall that \(Y_n\) is the sum of \(n\) independent Poisson random variables, each with parameter \(\lambda = 1\). Since a sum of independent Poisson random variables is again Poisson with the parameters added, \(Y_n\) follows a Poisson distribution with parameter \(n\), and its probability mass function is: $$ P(Y_n=k) = \frac{e^{-n}n^k}{k!} $$ Therefore: $$ P(Y_n \leq n) = \sum_{k=0}^{n} \frac{e^{-n}n^k}{k!} = e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!} $$ Combining this with the limit from Step 3 yields the given expression, completing the proof.
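As a numerical sanity check of the final identity, the partial sums \(e^{-n}\sum_{k=0}^{n} n^k/k!\) can be evaluated directly; they approach \(\frac{1}{2}\) slowly, with the error decaying on the order of \(1/\sqrt{n}\). A sketch assuming nothing beyond the Python standard library:

```python
import math

def poisson_half_sum(n):
    """e^{-n} * sum_{k=0}^{n} n^k / k!, i.e. P(Y_n <= n) for Y_n ~ Poisson(n).
    Each term is computed in log space to avoid overflow of n^k / k!."""
    return sum(math.exp(k * math.log(n) - math.lgamma(k + 1) - n)
               for k in range(n + 1))

for n in (10, 100, 1000):
    print(n, poisson_half_sum(n))  # decreases toward 1/2 as n grows
```

The values remain slightly above \(\frac{1}{2}\) for finite \(n\), which is consistent with the limit being approached from above rather than contradicting it.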


