Chapter 3: Problem 4
Let the independent random variables \(X_{1}, X_{2}, \ldots, X_{40}\) be iid with the common pdf \(f(x)=3 x^{2}\), \(0<x<1\), zero elsewhere.
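For numerical work with this pdf, note that its cdf on \((0,1)\) is \(F(x)=x^{3}\), so samples can be drawn by the inverse-transform method as \(X=U^{1/3}\) with \(U\) uniform on \((0,1)\). The sketch below is only an illustration of the setup, not a solution; the seed and the check against \(E(X)=3/4\) are incidental choices.

```python
# Sketch: draw the 40 iid variables with pdf f(x) = 3x^2 on (0, 1) via the
# inverse-transform method, since F(x) = x^3 implies X = U**(1/3) for U ~ U(0,1).
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(size=40)
x = u ** (1 / 3)          # one realization of X_1, ..., X_40

print(x.mean())           # should be near E(X) = 3/4
```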
Show, for \(k=1,2, \ldots, n\), that $$ \int_{p}^{1} \frac{n !}{(k-1) !(n-k) !} z^{k-1}(1-z)^{n-k} \, d z=\sum_{x=0}^{k-1}\binom{n}{x} p^{x}(1-p)^{n-x} . $$ This demonstrates the relationship between the cdfs of the beta and binomial distributions.
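A quick numerical check of this identity is possible with SciPy: the left side is the survival function of a \(\mathrm{Beta}(k,\, n-k+1)\) distribution evaluated at \(p\), and the right side is the binomial cdf at \(k-1\). This is only a spot check; the values of \(n\), \(k\), and \(p\) below are arbitrary illustration choices, not taken from the exercise.

```python
# Sketch: numerically verify the beta-binomial cdf identity for a few
# arbitrary (n, k, p) triples.
from scipy.stats import beta, binom

for n, k, p in [(10, 3, 0.4), (25, 7, 0.2), (40, 20, 0.55)]:
    lhs = beta.sf(p, k, n - k + 1)      # integral of the beta density from p to 1
    rhs = binom.cdf(k - 1, n, p)        # sum of the binomial pmf for x = 0, ..., k-1
    print(f"n={n}, k={k}, p={p}: lhs={lhs:.10f}, rhs={rhs:.10f}")
```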
One way of estimating the number of fish in a lake is the following capture-recapture sampling scheme. Suppose there are \(N\) fish in the lake, where \(N\) is unknown. A specified number of fish \(T\) are captured, tagged, and released back into the lake. Then, at a specified later time and for a specified positive integer \(r\), fish are captured until the \(r\)th tagged fish is caught. The random variable of interest is \(Y\), the number of nontagged fish caught. (a) What is the distribution of \(Y\)? Identify all parameters. (b) What are \(E(Y)\) and \(\operatorname{Var}(Y)\)? (c) The method-of-moments estimate of \(N\) is obtained by setting \(Y\) equal to the expression for \(E(Y)\) and solving this equation for \(N\). Call the solution \(\hat{N}\). Determine \(\hat{N}\). (d) Determine the mean and variance of \(\hat{N}\).
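As a sanity check on parts (a) and (b), a short Monte Carlo simulation can estimate \(E(Y)\) and compare it with the negative hypergeometric mean \(r(N-T)/(T+1)\). This is a rough sketch under assumed values of \(N\), \(T\), and \(r\) chosen purely for illustration; they are not part of the exercise.

```python
# Sketch: simulate the capture-recapture scheme and compare the empirical
# mean of Y with the negative hypergeometric mean r(N - T)/(T + 1).
import numpy as np

rng = np.random.default_rng(0)
N, T, r = 1000, 100, 10           # total fish, tagged fish, tagged fish to recapture (illustrative)
n_sims = 20_000

lake = np.zeros(N, dtype=int)
lake[:T] = 1                      # 1 = tagged, 0 = untagged

ys = np.empty(n_sims)
for i in range(n_sims):
    order = rng.permutation(lake)            # random capture order, without replacement
    rth_tag = np.flatnonzero(order)[r - 1]   # position of the r-th tagged fish
    ys[i] = rth_tag - (r - 1)                # untagged fish caught before it

print("empirical E(Y):        ", ys.mean())
print("formula r(N-T)/(T+1):  ", r * (N - T) / (T + 1))
```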
Let $$ p\left(x_{1}, x_{2}\right)=\binom{x_{1}}{x_{2}}\left(\frac{1}{2}\right)^{x_{1}}\left(\frac{x_{1}}{15}\right), \quad x_{2}=0,1, \ldots, x_{1}, \quad x_{1}=1,2,3,4,5, $$ zero elsewhere, be the joint pmf of \(X_{1}\) and \(X_{2}\). Determine (a) \(E\left(X_{2}\right)\), (b) \(u\left(x_{1}\right)=E\left(X_{2} \mid x_{1}\right)\), and (c) \(E\left[u\left(X_{1}\right)\right]\). Compare the answers of parts (a) and (c). Hint: Note that \(E\left(X_{2}\right)=\sum_{x_{1}=1}^{5} \sum_{x_{2}=0}^{x_{1}} x_{2} p\left(x_{1}, x_{2}\right)\).
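Because the support is finite, all three expectations can be checked by brute-force summation. The sketch below simply evaluates the double sums suggested by the hint; it is a numerical check, not the intended analytic derivation.

```python
# Sketch: evaluate E(X2), u(x1) = E(X2 | x1), and E[u(X1)] by summing the
# joint pmf p(x1, x2) = C(x1, x2) (1/2)**x1 (x1/15) over its finite support.
from math import comb

def p(x1, x2):
    return comb(x1, x2) * (0.5 ** x1) * (x1 / 15)

# (a) E(X2) via the double sum in the hint
E_X2 = sum(x2 * p(x1, x2) for x1 in range(1, 6) for x2 in range(x1 + 1))

# (b) u(x1) = E(X2 | x1), using the marginal p1(x1) = sum over x2 of p(x1, x2)
def u(x1):
    p1 = sum(p(x1, x2) for x2 in range(x1 + 1))
    return sum(x2 * p(x1, x2) for x2 in range(x1 + 1)) / p1

# (c) E[u(X1)] against the marginal pmf of X1
E_u = sum(u(x1) * sum(p(x1, x2) for x2 in range(x1 + 1)) for x1 in range(1, 6))

print(E_X2, [u(x1) for x1 in range(1, 6)], E_u)
```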
Let \(X\) and \(Y\) have the joint pmf \(p(x, y)=e^{-2} /[x !(y-x) !], y=0,1,2, \ldots\), \(x=0,1, \ldots, y\), zero elsewhere. (a) Find the mgf \(M\left(t_{1}, t_{2}\right)\) of this joint distribution. (b) Compute the means, the variances, and the correlation coefficient of \(X\) and \(Y\). (c) Determine the conditional mean \(E(X \mid y)\). Hint: Note that $$ \sum_{x=0}^{y}\left[\exp \left(t_{1} x\right)\right] y ! /[x !(y-x) !]=\left[1+\exp \left(t_{1}\right)\right]^{y} $$ Why?
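Because this joint pmf decays very quickly in \(y\), the means, variances, and correlation coefficient of part (b) can be approximated numerically by truncating the sums at a moderate value of \(y\). The cutoff below is an arbitrary assumption made for the sketch; it is not part of the exercise.

```python
# Sketch: approximate the means, variances, and correlation of (X, Y) for the
# joint pmf p(x, y) = e^{-2} / (x! (y - x)!), 0 <= x <= y, by truncating y.
from math import exp, factorial, sqrt

Y_MAX = 60  # truncation point; the omitted tail mass is negligible here

pairs = [(x, y, exp(-2) / (factorial(x) * factorial(y - x)))
         for y in range(Y_MAX + 1) for x in range(y + 1)]

EX  = sum(x * p for x, y, p in pairs)
EY  = sum(y * p for x, y, p in pairs)
EX2 = sum(x * x * p for x, y, p in pairs)
EY2 = sum(y * y * p for x, y, p in pairs)
EXY = sum(x * y * p for x, y, p in pairs)

var_x, var_y = EX2 - EX**2, EY2 - EY**2
rho = (EXY - EX * EY) / sqrt(var_x * var_y)
print(EX, EY, var_x, var_y, rho)
```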
Over the years, the percentage of candidates passing an entrance exam to a prestigious law school has been \(20 \%\). At one of the testing centers, a group of 50 candidates takes the exam and 20 pass. Is this odd? Answer on the basis of \(P(X \geq 20)\), where \(X\) is the number that pass in a group of 50 when the probability of a pass is \(0.2\).
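The tail probability in question can be computed directly from the binomial distribution; one quick way, sketched below, is SciPy's survival function for \(X \sim \operatorname{Binomial}(50, 0.2)\).

```python
# Sketch: P(X >= 20) for X ~ Binomial(n = 50, p = 0.2), i.e. the chance of
# seeing 20 or more passes if the historical 20% pass rate still holds.
from scipy.stats import binom

tail = binom.sf(19, 50, 0.2)   # sf(19) = P(X > 19) = P(X >= 20)
print(f"P(X >= 20) = {tail:.6f}")
```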