Chapter 1: Problem 18
Using the central limit theorem for suitable Poisson random variables, prove that $$ \lim _{n \rightarrow \infty} e^{-n} \sum_{k=0}^{n} \frac{n^{k}}{k !}=\frac{1}{2} $$
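A hedged outline of the intended argument (the particular choice of Poisson variables is my reading of "suitable," not spelled out in the problem): if \(X_{1}, \ldots, X_{n}\) are independent Poisson random variables each with mean \(1\), their sum \(S_{n}=X_{1}+\cdots+X_{n}\) is Poisson with mean and variance \(n\), and the expression in question is exactly \(\Pr\left\{S_{n} \leq n\right\}\). The central limit theorem then gives $$ e^{-n} \sum_{k=0}^{n} \frac{n^{k}}{k !}=\Pr\left\{S_{n} \leq n\right\}=\Pr\left\{\frac{S_{n}-n}{\sqrt{n}} \leq 0\right\} \longrightarrow \Phi(0)=\frac{1}{2} \quad \text { as } n \rightarrow \infty, $$ where \(\Phi\) denotes the standard normal distribution function.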
(a) Suppose \(X\) is distributed according to a Poisson distribution with parameter \(\lambda\). The parameter \(\lambda\) is itself a random variable whose distribution law is exponential with mean \(1 / c\). Find the distribution of \(X\). (b) What if \(\lambda\) follows a gamma distribution of order \(\alpha\) with scale parameter \(c\), i.e., the density of \(\lambda\) is \(c^{\alpha+1} \frac{\lambda^{\alpha}}{\Gamma(\alpha+1)} e^{-\lambda c}\) for \(\lambda>0\) and \(0\) for \(\lambda \leq 0\)?
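A hedged sketch of part (a), integrating out \(\lambda\) against its exponential density \(c e^{-c \lambda}\) (this outline is mine, not taken from the text): $$ \Pr\{X=k\}=\int_{0}^{\infty} \frac{e^{-\lambda} \lambda^{k}}{k !}\, c e^{-c \lambda}\, d \lambda=\frac{c}{k !} \int_{0}^{\infty} \lambda^{k} e^{-(1+c) \lambda}\, d \lambda=\frac{c}{1+c}\left(\frac{1}{1+c}\right)^{k}, \quad k=0,1,2, \ldots, $$ so \(X\) is geometric; the same integration in part (b) should yield a negative binomial law.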
Suppose that a lot consists of \(m, n_{1}, \ldots, n_{r}\) items belonging to the \(0\)th, \(1\)st, \(\ldots\), \(r\)th classes respectively. The items are drawn one by one without replacement until \(k\) items of the \(0\)th class are observed. Show that the joint distribution of the observed frequencies \(X_{1}, \ldots, X_{r}\) of the \(1\)st, \(\ldots\), \(r\)th classes is $$ \Pr\left\{X_{1}=x_{1}, \ldots, X_{r}=x_{r}\right\}=\left\{\binom{m}{k-1} \prod_{i=1}^{r}\binom{n_{i}}{x_{i}} \Big/ \binom{m+n}{k+y-1}\right\} \cdot \frac{m-(k-1)}{m+n-(k+y-1)}, $$ where $$ y=\sum_{i=1}^{r} x_{i} \quad \text { and } \quad n=\sum_{i=1}^{r} n_{i} . $$
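One hedged way to read the right-hand side (my interpretation, not the book's derivation): the bracketed factor is the multivariate hypergeometric probability that the first \(k+y-1\) draws contain exactly \(k-1\) items of the \(0\)th class and \(x_{i}\) items of the \(i\)th class, and the final factor is the conditional probability that draw \(k+y\) produces a \(0\)th-class item: $$ \Pr\left\{X_{1}=x_{1}, \ldots, X_{r}=x_{r}\right\}=\underbrace{\frac{\binom{m}{k-1} \prod_{i=1}^{r}\binom{n_{i}}{x_{i}}}{\binom{m+n}{k+y-1}}}_{\text {first } k+y-1 \text { draws}} \cdot \underbrace{\frac{m-(k-1)}{m+n-(k+y-1)}}_{\text {draw } k+y \text { is of class } 0} . $$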
For each given \(p\), let \(X\) have a binomial distribution with parameters \(p\) and \(N\). Suppose \(N\) is itself binomially distributed with parameters \(q\) and \(M\), \(M \geq N\). (a) Show analytically that \(X\) has a binomial distribution with parameters \(pq\) and \(M\). (b) Give a probabilistic argument for this result.
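A hedged sketch of the analytic route via probability generating functions (an approach of my choosing; the problem does not prescribe it): conditioning on \(N\), $$ E\left[s^{X}\right]=E\left[(1-p+p s)^{N}\right]=\left(1-q+q(1-p+p s)\right)^{M}=(1-p q+p q s)^{M}, $$ which is the generating function of a binomial distribution with parameters \(pq\) and \(M\). A probabilistic reading (again my gloss) is that each of the \(M\) trials is independently retained with probability \(q\) and then succeeds with probability \(p\), hence succeeds outright with probability \(pq\).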
Let \(N\) balls be thrown independently into \(n\) urns, each ball having probability \(1 / n\) of falling into any particular urn. Let \(Z_{N, n}\) be the number of empty urns after completing these tosses, and let \(P_{N, n}(k)=\operatorname{Pr}\left\{Z_{N, n}=k\right\}\). Define \(\varphi_{N, n}(t)=\sum_{k=0}^{n} P_{N, n}(k) e^{i k t}\). (a) Show that $$ P_{N+1, n}(k)=\left(1-\frac{k}{n}\right) P_{N, n}(k)+\frac{k+1}{n} P_{N, n}(k+1), \quad \text { for } k=0,1, \ldots, n . $$ (b) Show that $$ P_{N, n}(k)=\left(1-\frac{1}{n}\right)^{N} P_{N, n-1}(k-1)+\sum_{i=1}^{N}\binom{N}{i} \frac{1}{n^{i}}\left(1-\frac{1}{n}\right)^{N-i} P_{N-i, n-1}(k) . $$ (c) Define \(G_{n}(t, z)=\sum_{N=0}^{\infty} \varphi_{N, n}(t) \frac{n^{N}}{N !} z^{N}\). Using part (b), show that \(G_{n}(t, z)=G_{n-1}(t, z)\left(e^{i t}+e^{z}-1\right)\), and conclude that $$ G_{n}(t, z)=\left(e^{i t}+e^{z}-1\right)^{n}, \quad n=0,1,2, \ldots $$
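As a quick sanity check of the closed form (my own check, not part of the problem): with a single urn, \(Z_{0,1}=1\) and \(Z_{N, 1}=0\) for all \(N \geq 1\), so \(\varphi_{0,1}(t)=e^{i t}\) and \(\varphi_{N, 1}(t)=1\) for \(N \geq 1\), giving $$ G_{1}(t, z)=e^{i t}+\sum_{N=1}^{\infty} \frac{z^{N}}{N !}=e^{i t}+e^{z}-1, $$ in agreement with \(\left(e^{i t}+e^{z}-1\right)^{n}\) at \(n=1\).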
Let \(X\) be a nonnegative integer-valued random variable with probability generating function \(f(s)=\sum_{n=0}^{\infty} a_{n} s^{n}\). After observing \(X\), conduct \(X\) binomial trials with probability \(p\) of success. Let \(Y\) denote the resulting number of successes. (a) Determine the probability generating function of \(Y\). (b) Determine the probability generating function of \(X\) given that \(Y=X\).
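A hedged sketch for part (a) (my outline): conditioning on \(X\), the conditional generating function of \(Y\) is \((1-p+p s)^{X}\), so $$ E\left[s^{Y}\right]=E\left[(1-p+p s)^{X}\right]=f(1-p+p s) . $$ For part (b), since \(\Pr\{Y=X \mid X=n\}=p^{n}\), the conditional law of \(X\) given \(Y=X\) has generating function \(f(p s) / f(p)\).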