
(a) Let \(X\) and \(Y\) be independent random variables such that
$$
\operatorname{Pr}\{X=i\}=f(i), \quad \operatorname{Pr}\{Y=i\}=g(i), \qquad f(i)>0, \quad g(i)>0, \quad i=0,1,2, \ldots
$$
and
$$
\sum_{i=0}^{\infty} f(i)=\sum_{i=0}^{\infty} g(i)=1 .
$$
Suppose
$$
\operatorname{Pr}\{X=k \mid X+Y=l\}=
\begin{cases}
\dbinom{l}{k} p^{k}(1-p)^{l-k}, & 0 \leq k \leq l, \\
0, & k>l .
\end{cases}
$$
Prove that
$$
f(i)=e^{-\theta \alpha} \frac{(\theta \alpha)^{i}}{i !}, \quad g(i)=e^{-\theta} \frac{\theta^{i}}{i !}, \quad i=0,1,2, \ldots,
$$
where \(\alpha=p /(1-p)\) and \(\theta>0\) is arbitrary.

(b) Show that \(p\) is determined by the condition
$$
G\!\left(\frac{1}{1-p}\right)=\frac{1}{f(0)} .
$$
Hint: Let \(F(s)=\sum f(i) s^{i}\) and \(G(s)=\sum g(i) s^{i}\). Establish first the relation
$$
F(u)\, G(v)=F(p u+(1-p) v)\, G(p u+(1-p) v) .
$$

Short Answer

Writing the joint probability \(\operatorname{Pr}\{X=k,\, X+Y=l\}\) in two ways, as \(f(k)g(l-k)\) by independence and as \(\operatorname{Pr}\{X+Y=l\}\binom{l}{k}p^{k}(1-p)^{l-k}\) by the assumed conditional law, yields a basic identity. Specializing it to \(k=0\) and \(k=l\) eliminates \(\operatorname{Pr}\{X+Y=l\}\) and leaves the functional equation \(g(k)g(l-k)=\binom{l}{k}g(0)g(l)\), whose only solutions summing to 1 are the Poisson probabilities \(g(i)=e^{-\theta}\theta^{i}/i!\); these in turn force \(f(i)=e^{-\theta\alpha}(\theta\alpha)^{i}/i!\) with \(\alpha=p/(1-p)\). For (b), setting \(u=0\) and \(v=1/(1-p)\) in the generating-function relation \(F(u)G(v)=F(pu+(1-p)v)\,G(pu+(1-p)v)\) gives \(f(0)\,G(1/(1-p))=1\), i.e. \(G(1/(1-p))=1/f(0)\), which determines \(p\).

Step by step solution

01

Finding the joint probability of X=k and Y=l-k

Given the conditional probability \(\operatorname{Pr}\{X=k \mid X+Y=l\}\), note that the events \(\{X=k,\, Y=l-k\}\) and \(\{X=k,\, X+Y=l\}\) coincide. The multiplication rule therefore gives
$$
\operatorname{Pr}\{X=k,\, Y=l-k\}=\operatorname{Pr}\{X+Y=l\}\,\operatorname{Pr}\{X=k \mid X+Y=l\},
$$
while independence of \(X\) and \(Y\) gives \(\operatorname{Pr}\{X=k,\, Y=l-k\}=f(k)\,g(l-k)\). Writing \(h(l)=\operatorname{Pr}\{X+Y=l\}=\sum_{j=0}^{l} f(j)\,g(l-j)\) and inserting the assumed binomial form of the conditional probability, we obtain the basic identity
$$
f(k)\,g(l-k)=h(l)\binom{l}{k} p^{k}(1-p)^{l-k}, \qquad 0 \leq k \leq l .
$$
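As a quick numerical sanity check (not part of the proof), the identity can be tested in the forward direction: if \(f\) and \(g\) are the Poisson probabilities asserted in part (a), the conditional law of \(X\) given \(X+Y=l\) must come out exactly Binomial\((l, p)\). A minimal sketch using only the standard library; theta, p, and l are arbitrary test values.

```python
# Sanity check: with X ~ Poisson(theta*alpha) and Y ~ Poisson(theta),
# the conditional law of X given X + Y = l is Binomial(l, p),
# where alpha = p / (1 - p).  Test values below are arbitrary.
from math import comb, exp, factorial

theta, p, l = 1.7, 0.3, 9                 # arbitrary parameters for the check
alpha = p / (1 - p)

def poisson(lam, i):
    return exp(-lam) * lam**i / factorial(i)

f = lambda i: poisson(theta * alpha, i)   # Pr{X = i}
g = lambda i: poisson(theta, i)           # Pr{Y = i}

h_l = sum(f(j) * g(l - j) for j in range(l + 1))   # Pr{X + Y = l}
for k in range(l + 1):
    lhs = f(k) * g(l - k) / h_l                    # Pr{X = k | X + Y = l}
    rhs = comb(l, k) * p**k * (1 - p)**(l - k)     # Binomial(l, p) pmf
    assert abs(lhs - rhs) < 1e-10, (k, lhs, rhs)
print("conditional law matches Binomial(l, p)")
```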
02

Expressions for f(i) and g(i)

Specialize the basic identity from Step 1. Setting \(k=0\) gives
$$
f(0)\,g(l)=h(l)(1-p)^{l}, \quad \text{so} \quad h(l)=\frac{f(0)\,g(l)}{(1-p)^{l}} .
$$
Setting \(k=l\) gives \(f(l)\,g(0)=h(l)\,p^{l}\), and substituting the expression for \(h(l)\),
$$
f(l)=\frac{f(0)}{g(0)}\, g(l)\,\alpha^{l}, \qquad \alpha=\frac{p}{1-p} .
$$
Substituting both expressions back into the basic identity and cancelling the common factor \(f(0)\,\alpha^{k}\) leaves a condition on \(g\) alone:
$$
g(k)\,g(l-k)=\binom{l}{k} g(0)\,g(l), \qquad 0 \leq k \leq l .
$$
To solve it, put \(c(i)=i!\,g(i)/g(0)\); the condition becomes \(c(k)\,c(l-k)=c(l)\) with \(c(0)=1\), so by induction \(c(i)=c(1)^{i}=\theta^{i}\), where \(\theta=g(1)/g(0)>0\). Hence \(g(i)=g(0)\,\theta^{i}/i!\), and the normalization \(\sum_{i} g(i)=g(0)\,e^{\theta}=1\) forces \(g(0)=e^{-\theta}\), i.e.
$$
g(i)=e^{-\theta} \frac{\theta^{i}}{i !} .
$$
Then \(f(i)=\dfrac{f(0)}{g(0)}\, g(i)\,\alpha^{i}=f(0)\,\dfrac{(\theta\alpha)^{i}}{i!}\), and \(\sum_{i} f(i)=f(0)\,e^{\theta\alpha}=1\) forces \(f(0)=e^{-\theta\alpha}\), giving
$$
f(i)=e^{-\theta \alpha} \frac{(\theta \alpha)^{i}}{i !} .
$$
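A quick check of the functional equation (again not part of the proof): the Poisson probabilities do satisfy \(g(k)g(l-k)=\binom{l}{k}g(0)g(l)\) exactly. A minimal sketch, with \(\theta\) an arbitrary test value:

```python
# Sanity check: the Poisson probabilities g(i) = e^{-theta} theta^i / i!
# satisfy the functional equation g(k) g(l-k) = C(l, k) g(0) g(l).
from math import comb, exp, factorial

theta = 2.4                               # arbitrary
g = lambda i: exp(-theta) * theta**i / factorial(i)

for l in range(12):
    for k in range(l + 1):
        lhs = g(k) * g(l - k)
        rhs = comb(l, k) * g(0) * g(l)
        assert abs(lhs - rhs) < 1e-12, (l, k)
print("functional equation holds for Poisson g")
```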
03

Finding the value of p

It remains to establish the generating-function relation from the hint and use it to characterize \(p\). With \(F(s)=\sum f(i) s^{i}\), \(G(s)=\sum g(i) s^{i}\), and \(H(s)=\sum h(l) s^{l}\), independence of \(X\) and \(Y\) gives \(H(s)=F(s)\,G(s)\). Now compute \(F(u)G(v)\) from the basic identity of Step 1:
$$
F(u)\, G(v)=\sum_{l=0}^{\infty} \sum_{k=0}^{l} f(k)\, g(l-k)\, u^{k} v^{l-k}
=\sum_{l=0}^{\infty} h(l) \sum_{k=0}^{l}\binom{l}{k}(p u)^{k}((1-p) v)^{l-k}
=\sum_{l=0}^{\infty} h(l)(p u+(1-p) v)^{l},
$$
by the binomial theorem. Hence
$$
F(u)\,G(v)=H(p u+(1-p) v)=F(p u+(1-p) v)\,G(p u+(1-p) v) .
$$
Now choose \(u=0\) and \(v=1 /(1-p)\), so that \(p u+(1-p) v=1\). Since \(F(0)=f(0)\) and \(F(1)=G(1)=1\), the relation reduces to
$$
f(0)\, G\!\left(\frac{1}{1-p}\right)=1, \quad \text{i.e.} \quad G\!\left(\frac{1}{1-p}\right)=\frac{1}{f(0)} .
$$
Because \(G\) has positive coefficients, \(p \mapsto G(1/(1-p))\) is continuous and strictly increasing on \((0,1)\), so this equation determines \(p\) uniquely, as claimed.
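Numerically, the condition of part (b) is easy to confirm for the Poisson solution, where \(G(s)=e^{\theta(s-1)}\) in closed form. A minimal sketch with arbitrary \(\theta\) and \(p\):

```python
# Sanity check: for g(i) = e^{-theta} theta^i / i! one has
# G(s) = e^{theta (s - 1)}, so G(1/(1-p)) = e^{theta p/(1-p)} = e^{theta alpha},
# which equals 1 / f(0) = e^{theta alpha}.  Arbitrary test values below.
from math import exp

theta, p = 1.3, 0.45                       # arbitrary
alpha = p / (1 - p)

G = lambda s: exp(theta * (s - 1))         # Poisson generating function
f0 = exp(-theta * alpha)                   # f(0) for Poisson(theta * alpha)

assert abs(G(1 / (1 - p)) - 1 / f0) < 1e-12
print("G(1/(1-p)) = 1/f(0) confirmed")
```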


Most popular questions from this chapter

Suppose that a lot consists of \(m, n_{1}, \ldots, n_{r}\) items belonging to the \(0\)th, \(1\)st, \(\ldots\), \(r\)th classes respectively. The items are drawn one by one without replacement until \(k\) items of the \(0\)th class are observed. Show that the joint distribution of the observed frequencies \(X_{1}, \ldots, X_{r}\) of the \(1\)st, \(\ldots\), \(r\)th classes is
$$
\operatorname{Pr}\left\{X_{1}=x_{1}, \ldots, X_{r}=x_{r}\right\}=\left\{\binom{m}{k-1} \prod_{i=1}^{r}\binom{n_{i}}{x_{i}} \Big/ \binom{m+n}{k+y-1}\right\} \cdot \frac{m-(k-1)}{m+n-(k+y-1)}
$$
where
$$
y=\sum_{i=1}^{r} x_{i} \quad \text{and} \quad n=\sum_{i=1}^{r} n_{i} .
$$
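A Monte Carlo sketch of this formula, with arbitrary small parameters; `formula` and `draw_once` are helpers written solely for this check:

```python
# Monte Carlo check of the stated negative-hypergeometric formula.
import random
from math import comb

m, ns, k = 4, [3, 2], 2            # class-0 count, class-1..r counts, stopping rule
x = (1, 1)                          # outcome whose probability we estimate

def formula(m, ns, k, x):
    y, n = sum(x), sum(ns)
    num = comb(m, k - 1)
    for ni, xi in zip(ns, x):
        num *= comb(ni, xi)
    return num / comb(m + n, k + y - 1) * (m - (k - 1)) / (m + n - (k + y - 1))

def draw_once():
    lot = [0] * m + [c for c, ni in enumerate(ns, start=1) for _ in range(ni)]
    random.shuffle(lot)
    counts, zeros = [0] * len(ns), 0
    for item in lot:
        if item == 0:
            zeros += 1
            if zeros == k:
                return tuple(counts)
        else:
            counts[item - 1] += 1

trials = 200_000
hits = sum(draw_once() == x for _ in range(trials))
print(f"empirical {hits / trials:.4f}  vs  formula {formula(m, ns, k, x):.4f}")
```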

Consider an infinite number of urns into which we toss balls independently, in such a way that a ball falls into the \(k\)th urn with probability \(1 / 2^{k}\), \(k=1,2,3, \ldots\). For each positive integer \(N\), let \(Z_{N}\) be the number of urns which contain at least one ball after a total of \(N\) balls have been tossed. Show that
$$
E\left(Z_{N}\right)=\sum_{k=1}^{\infty}\left[1-\left(1-1 / 2^{k}\right)^{N}\right]
$$
and that there exist constants \(C_{1}>0\) and \(C_{2}>0\) such that
$$
C_{1} \log N \leq E\left(Z_{N}\right) \leq C_{2} \log N \quad \text{for all } N .
$$
Hint: Verify and use the facts:
$$
E\left(Z_{N}\right) \geq \sum_{k=1}^{\log _{2} N}\left[1-\left(1-\frac{1}{2^{k}}\right)^{N}\right] \geq C \log _{2} N
$$
and
$$
1-\left(1-\frac{1}{2^{k}}\right)^{N} \leq \frac{N}{2^{k}} \quad \text{and} \quad N \sum_{k=\log _{2} N}^{\infty} \frac{1}{2^{k}} \leq C_{2} .
$$
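The series for \(E(Z_N)\) and its logarithmic growth can be checked numerically; the sketch below truncates the series (justified by the hint's bound \(1-(1-2^{-k})^{N} \leq N/2^{k}\)) and compares it with a simulation. All parameters are arbitrary test values:

```python
# Numerical sketch: compare the exact series for E(Z_N) with a simulation.
import random

def expected_zn(n, kmax=200):
    # truncated series; remaining terms are bounded by n / 2^kmax
    return sum(1 - (1 - 0.5**k) ** n for k in range(1, kmax + 1))

def simulate_zn(n, trials=2000):
    total = 0
    for _ in range(trials):
        occupied = set()
        for _ in range(n):
            # sample the urn index: geometric, Pr{urn k} = 1 / 2^k
            k = 1
            while random.random() >= 0.5:
                k += 1
            occupied.add(k)
        total += len(occupied)
    return total / trials

for n in (10, 100, 1000):
    print(n, round(expected_zn(n), 3), round(simulate_zn(n), 3))
```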

Let \(A_{0}, A_{1}, \ldots, A_{r}\) be \(r+1\) events which can occur as outcomes of an experiment. Let \(p_{i}\) be the probability of the occurrence of \(A_{i}\) \((i=0,1,2, \ldots, r)\). Suppose we perform independent trials until the event \(A_{0}\) occurs \(k\) times. Let \(X_{i}\) be the number of occurrences of the event \(A_{i}\). Show that
$$
\operatorname{Pr}\left\{X_{1}=x_{1}, \ldots, X_{r}=x_{r} ;\ A_{0} \text{ occurs for the } k \text{th time at the } \left(k+\sum_{i=1}^{r} x_{i}\right) \text{th trial}\right\}
=\frac{\Gamma\left(k+\sum_{i=1}^{r} x_{i}\right)}{\Gamma(k) \prod_{i=1}^{r} x_{i} !}\, p_{0}^{k} \prod_{i=1}^{r} p_{i}^{x_{i}} .
$$
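A Monte Carlo sketch of this negative multinomial probability, with arbitrary small parameters; `formula` and `run_once` are helpers written solely for this check:

```python
# Monte Carlo check of the negative-multinomial formula above;
# note gamma(k + y) / gamma(k) = (k + y - 1)! / (k - 1)! for integer k.
import random
from math import gamma, factorial

p = [0.5, 0.3, 0.2]                # p_0, p_1, p_2; must sum to 1
k, x = 2, (1, 2)                   # stop after k occurrences of A_0; target counts

def formula(p, k, x):
    y = sum(x)
    val = gamma(k + y) / gamma(k) * p[0] ** k
    for pi, xi in zip(p[1:], x):
        val *= pi ** xi / factorial(xi)
    return val

def run_once():
    counts, zeros = [0] * (len(p) - 1), 0
    while zeros < k:
        i = random.choices(range(len(p)), weights=p)[0]
        if i == 0:
            zeros += 1
        else:
            counts[i - 1] += 1
    return tuple(counts)

trials = 200_000
hits = sum(run_once() == x for _ in range(trials))
print(f"empirical {hits / trials:.4f}  vs  formula {formula(p, k, x):.4f}")
```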

Let \(X\) be a nonnegative random variable with cumulative distribution function \(F(x)=\operatorname{Pr}\{X \leq x\}\). Show
$$
E[X]=\int_{0}^{\infty}[1-F(x)]\, d x .
$$
Hint: Write \(E[X]=\int_{0}^{\infty} x\, d F(x)=\int_{0}^{\infty}\left(\int_{0}^{x} d y\right) d F(x)\) and interchange the order of integration.
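A quick numeric sketch for one concrete case, taking \(X\) exponential with rate \(\lambda\) so that \(E[X]=1/\lambda\) is known in closed form; \(\lambda\), the step size, and the truncation point are arbitrary choices:

```python
# Numeric check of E[X] = integral of [1 - F(x)] dx for X ~ Exponential(lam),
# using a simple truncated Riemann sum.
from math import exp

lam = 0.7                              # X ~ Exponential(lam), so E[X] = 1/lam
survival = lambda x: exp(-lam * x)     # 1 - F(x)

dx, upper = 1e-4, 60.0                 # step size and truncation point
integral = sum(survival(i * dx) for i in range(int(upper / dx))) * dx

print(f"integral {integral:.4f}  vs  E[X] = {1 / lam:.4f}")
```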

For each given \(p\) let \(X\) have a binomial distribution with parameters \(p\) and \(N\). Suppose \(N\) is itself binomially distributed with parameters \(q\) and \(M\), \(M \geq N\). (a) Show analytically that \(X\) has a binomial distribution with parameters \(pq\) and \(M\). (b) Give a probabilistic argument for this result.
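Part (a) can be confirmed by exact enumeration rather than simulation, since both sides are finite sums; \(M\), \(p\), and \(q\) below are arbitrary test values:

```python
# Exact check (no simulation): mixing Binomial(N, p) over N ~ Binomial(M, q)
# yields Binomial(M, p*q).
from math import comb

M, p, q = 7, 0.35, 0.6

def binom_pmf(n, theta, k):
    return comb(n, k) * theta**k * (1 - theta) ** (n - k)

for k in range(M + 1):
    mixed = sum(binom_pmf(M, q, n) * binom_pmf(n, p, k) for n in range(k, M + 1))
    direct = binom_pmf(M, p * q, k)
    assert abs(mixed - direct) < 1e-12, (k, mixed, direct)
print("mixture equals Binomial(M, p*q)")
```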
