Chapter 2: Problem 8
Suppose the distribution function of \(X\) is given by $$ F(b)=\left\{\begin{array}{ll} 0, & b<0 \\ \frac{1}{2}, & 0 \leq b<1 \\ 1, & 1 \leq b<\infty \end{array}\right. $$ What is the probability mass function of \(X\)?
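A brief sketch of the solution: since \(F\) is a step function, \(X\) is discrete, and its probability mass function is read off from the jump sizes of \(F\), namely \(p(a)=F(a)-F(a^{-})\). Here \(F\) jumps only at \(0\) and \(1\): $$ p(0)=F(0)-F(0^{-})=\frac{1}{2}-0=\frac{1}{2}, \qquad p(1)=F(1)-F(1^{-})=1-\frac{1}{2}=\frac{1}{2}. $$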
Let \(X_{1}, X_{2}, \ldots\) be a sequence of independent identically distributed continuous random variables. We say that a record occurs at time \(n\) if \(X_{n}>\max \left(X_{1}, \ldots, X_{n-1}\right)\). That is, \(X_{n}\) is a record if it is larger than each of \(X_{1}, \ldots, X_{n-1}\). Show (a) \(P\{\text{a record occurs at time } n\}=1 / n\); (b) \(E[\text{number of records by time } n]=\sum_{i=1}^{n} 1 / i\); (c) \(\operatorname{Var}(\text{number of records by time } n)=\sum_{i=1}^{n}(i-1) / i^{2}\); (d) Let \(N=\min \{n: n>1 \text{ and a record occurs at time } n\}\). Show \(E[N]=\infty\). Hint: For (b) and (c), represent the number of records as the sum of indicator (that is, Bernoulli) random variables.
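A sketch of one standard argument, following the hint: let \(I_{n}\) be the indicator that a record occurs at time \(n\). Since the variables are i.i.d. and continuous (so ties have probability zero), each of \(X_{1},\ldots,X_{n}\) is equally likely to be the largest, giving \(P\{I_{n}=1\}=1/n\), which is (a). The number of records by time \(n\) is \(\sum_{i=1}^{n} I_{i}\), and the \(I_{i}\) can be shown to be independent (whether \(X_{n}\) is the largest of the first \(n\) values does not depend on the relative order of \(X_{1},\ldots,X_{n-1}\)), so $$ E\Big[\sum_{i=1}^{n} I_{i}\Big]=\sum_{i=1}^{n}\frac{1}{i}, \qquad \operatorname{Var}\Big(\sum_{i=1}^{n} I_{i}\Big)=\sum_{i=1}^{n}\frac{1}{i}\Big(1-\frac{1}{i}\Big)=\sum_{i=1}^{n}\frac{i-1}{i^{2}}, $$ giving (b) and (c). For (d), note that \(N>n\) exactly when no record occurs at times \(2,\ldots,n\), i.e. when \(X_{1}\) is the largest of \(X_{1},\ldots,X_{n}\), so \(P\{N>n\}=1/n\) for \(n\geq 2\); hence \(E[N]=\sum_{n\geq 0} P\{N>n\}\geq \sum_{n\geq 2} \frac{1}{n}=\infty\).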
Let \(c\) be a constant. Show that (a) \(\operatorname{Var}(c X)=c^{2} \operatorname{Var}(X)\) (b) \(\operatorname{Var}(c+X)=\operatorname{Var}(X)\).
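Both identities follow directly from the definition of variance; a one-line sketch of each: $$ \operatorname{Var}(cX)=E\big[c^{2}X^{2}\big]-\big(cE[X]\big)^{2}=c^{2}\left(E[X^{2}]-(E[X])^{2}\right)=c^{2}\operatorname{Var}(X), $$ and, since \(E[c+X]=c+E[X]\), $$ \operatorname{Var}(c+X)=E\big[(c+X-E[c+X])^{2}\big]=E\big[(X-E[X])^{2}\big]=\operatorname{Var}(X). $$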
Suppose that the joint probability mass function of \(X\) and \(Y\) is $$ P(X=i, Y=j)=\binom{j}{i} e^{-2 \lambda} \lambda^{j} / j !, \quad 0 \leq i \leq j $$ (a) Find the probability mass function of \(Y\). (b) Find the probability mass function of \(X\). (c) Find the probability mass function of \(Y-X\).
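A sketch of the computation: the marginals come from summing the joint mass function over the other variable. For (a), using \(\sum_{i=0}^{j}\binom{j}{i}=2^{j}\), $$ P\{Y=j\}=\sum_{i=0}^{j}\binom{j}{i}\frac{e^{-2\lambda}\lambda^{j}}{j!}=\frac{e^{-2\lambda}(2\lambda)^{j}}{j!}, $$ so \(Y\) is Poisson with mean \(2\lambda\). For (b), summing over \(j\geq i\) and writing \(\binom{j}{i}/j!=1/\big(i!(j-i)!\big)\), $$ P\{X=i\}=\sum_{j=i}^{\infty}\frac{e^{-2\lambda}\lambda^{j}}{i!(j-i)!}=\frac{e^{-2\lambda}\lambda^{i}}{i!}\sum_{j=i}^{\infty}\frac{\lambda^{j-i}}{(j-i)!}=\frac{e^{-\lambda}\lambda^{i}}{i!}, $$ so \(X\) is Poisson with mean \(\lambda\); the symmetric computation for (c) shows \(Y-X\) is also Poisson with mean \(\lambda\).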
Show that $$ \lim _{n \rightarrow \infty} e^{-n} \sum_{k=0}^{n} \frac{n^{k}}{k !}=\frac{1}{2} $$ Hint: Let \(X_{n}\) be Poisson with mean \(n\). Use the central limit theorem to show that \(P\left\{X_{n} \leq n\right\} \rightarrow \frac{1}{2}\).
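A sketch following the hint: if \(X_{n}\) is Poisson with mean \(n\), then the left-hand side is exactly \(P\{X_{n}\leq n\}\). Moreover \(X_{n}\) has the same distribution as a sum of \(n\) independent Poisson random variables each with mean and variance \(1\), so the central limit theorem gives $$ P\{X_{n}\leq n\}=P\left\{\frac{X_{n}-n}{\sqrt{n}}\leq 0\right\}\rightarrow \Phi(0)=\frac{1}{2}, $$ where \(\Phi\) is the standard normal distribution function.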
A total of \(r\) keys are to be put, one at a time, in \(k\) boxes, with each key independently being put in box \(i\) with probability \(p_{i}\), where \(\sum_{i=1}^{k} p_{i}=1\). Each time a key is put in a nonempty box, we say that a collision occurs. Find the expected number of collisions.
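A sketch of one standard approach: each key either starts a new nonempty box or causes a collision, so the number of collisions equals \(r\) minus the number of nonempty boxes after all keys are placed. Since box \(i\) ends up empty with probability \((1-p_{i})^{r}\), linearity of expectation gives $$ E[\text{collisions}]=r-\sum_{i=1}^{k}\left(1-(1-p_{i})^{r}\right)=r-k+\sum_{i=1}^{k}(1-p_{i})^{r}. $$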