Problem 1
Show how to use inversion to generate Bernoulli random variables. If \(U_{1}, \ldots, U_{m}\) are independent \(U(0,1)\) variables and \(0<\pi<1\), what is the distribution of \(\sum_{j=1}^{m} I\left(U_{j} \leq \pi\right)\)?
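A minimal Python sketch of the construction (the function names and the value \(\pi=0.3\) are mine): since \(\Pr(U \leq \pi)=\pi\) for \(U \sim U(0,1)\), the indicator \(I(U \leq \pi)\) is Bernoulli(\(\pi\)), and summing \(m\) independent indicators gives the distribution the question asks for.

```python
import random

def bernoulli(pi, u=None):
    """Inversion for Bernoulli(pi): I(U <= pi) equals 1 with probability pi."""
    if u is None:
        u = random.random()
    return 1 if u <= pi else 0

def indicator_sum(m, pi):
    """Sum of m independent indicators I(U_j <= pi)."""
    return sum(bernoulli(pi) for _ in range(m))

random.seed(1)
n = 100_000
mean = sum(bernoulli(0.3) for _ in range(n)) / n  # should be close to pi = 0.3
r = indicator_sum(10, 0.5)                        # one draw, takes values 0..10
```

A quick sanity check on the empirical mean confirms the indicator has expectation \(\pi\).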
Problem 1
Suppose that \(Y_{1}, \ldots, Y_{4}\) are independent normal variables, each with variance \(\sigma^{2}\), but with means \(\mu+\alpha+\beta+\gamma\), \(\mu+\alpha-\beta-\gamma\), \(\mu-\alpha+\beta-\gamma\), \(\mu-\alpha-\beta+\gamma\). Let \(Z^{\mathrm{T}}=\frac{1}{4}\left(Y_{1}+Y_{2}+Y_{3}+Y_{4}, Y_{1}+Y_{2}-Y_{3}-Y_{4}, Y_{1}-Y_{2}+Y_{3}-Y_{4}, Y_{1}-Y_{2}-Y_{3}+Y_{4}\right)\). Calculate the mean vector and covariance matrix of \(Z\), and give the joint distribution of \(Z_{1}\) and \(V=Z_{2}^{2}+Z_{3}^{2}+Z_{4}^{2}\) when \(\alpha=\beta=\gamma=0\). What is then the distribution of \(Z_{1} /(V / 3)^{1 / 2}\)?
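The linear algebra can be checked numerically. Writing \(Z = AY\) with \(A = H/4\), where \(H\) is the \(4\times 4\) Hadamard-type matrix of \(\pm 1\)s below (my notation), the orthogonality \(HH^{\mathrm{T}} = 4I_{4}\) gives \(\operatorname{cov}(Z) = \sigma^{2}AA^{\mathrm{T}} = (\sigma^{2}/4)I_{4}\). A sketch with illustrative parameter values (the values of \(\mu, \alpha, \beta, \gamma\) are mine):

```python
import numpy as np

# Rows of H are mutually orthogonal, each with squared norm 4, so H @ H.T = 4*I.
H = np.array([[1, 1, 1, 1],
              [1, 1, -1, -1],
              [1, -1, 1, -1],
              [1, -1, -1, 1]], dtype=float)
A = H / 4  # Z = A @ Y

# cov(Z) = sigma^2 * A @ A.T; the matrix factor should be I/4.
cov_factor = A @ A.T

# Mean of Z for illustrative parameter values.
mu, alpha, beta, gamma = 2.0, 0.5, -1.0, 0.25
mean_Y = np.array([mu + alpha + beta + gamma,
                   mu + alpha - beta - gamma,
                   mu - alpha + beta - gamma,
                   mu - alpha - beta + gamma])
mean_Z = A @ mean_Y  # should equal (mu, alpha, beta, gamma)
```

The check confirms that the components of \(Z\) are independent normals with variance \(\sigma^{2}/4\) and means \(\mu, \alpha, \beta, \gamma\).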
Problem 2
Let \(Y_{1}, \ldots, Y_{n}\) be defined by \(Y_{j}=\mu+\sigma X_{j}\), where \(X_{1}, \ldots, X_{n}\) is a random sample from a known density \(g\) with distribution function \(G\). If \(M=m(Y)\) and \(S=s(Y)\) are location and scale statistics based on \(Y_{1}, \ldots, Y_{n}\), that is, they have the properties that \(m(Y)=\mu+\sigma m(X)\) and \(s(Y)=\sigma s(X)\) for all \(X_{1}, \ldots, X_{n}\), \(\sigma>0\) and real \(\mu\), show that \(Z(\mu)=n^{1 / 2}(M-\mu) / S\) is a pivot. Suppose now that \(n\) is odd and large, that \(g\) is the standard normal density, that \(M\) is the median of \(Y_{1}, \ldots, Y_{n}\), and that \(S\) is their interquartile range (IQR). Show that \(S / 1.35 \stackrel{P}{\longrightarrow} \sigma\), and hence that as \(n \rightarrow \infty\), \(Z(\mu) \stackrel{D}{\longrightarrow} N\left(0, \tau^{2}\right)\) for known \(\tau>0\). Hence give the form of a \(95 \%\) confidence interval for \(\mu\). Compare this interval with the one based on \(Z(\mu)\) with \(M=\bar{Y}\) and \(S^{2}\) the sample variance, using the data for day 4 in Table \(2.1\).
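The constants involved can be computed directly (a sketch using standard normal facts; the numerical values below are my calculation, not from the text): the IQR of \(N(\mu, \sigma^{2})\) is \(2\sigma\Phi^{-1}(0.75) \approx 1.349\sigma\), and under normality \(n^{1/2}(M-\mu) \stackrel{D}{\longrightarrow} N(0, (\pi/2)\sigma^{2})\) for the sample median, so Slutsky's theorem gives \(\tau^{2} = (\pi/2)/\{2\Phi^{-1}(0.75)\}^{2}\).

```python
from math import pi
from statistics import NormalDist

# IQR of N(mu, sigma^2) is 2*sigma*Phi^{-1}(3/4) ~ 1.349*sigma,
# which is why S/1.35 is a consistent estimator of sigma.
iqr_const = 2 * NormalDist().inv_cdf(0.75)

# Asymptotic variance of the sample median is (pi/2)*sigma^2/n under
# normality; dividing by the probability limit of (S/sigma)^2 gives tau^2.
tau_sq = (pi / 2) / iqr_const ** 2
```

With these constants the interval takes the form \(M \pm 1.96\,\tau S/n^{1/2}\).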
Problem 2
\(W_{i}, X_{i}, Y_{i}\), and \(Z_{i}\), \(i=1,2\), are eight independent normal random variables with common variance \(\sigma^{2}\) and expectations \(\mu_{W}, \mu_{X}, \mu_{Y}\) and \(\mu_{Z}\). Find the joint distribution of the random variables $$ \begin{aligned} T_{1} &=\frac{1}{2}\left(W_{1}+W_{2}\right)-\mu_{W}, \quad T_{2}=\frac{1}{2}\left(X_{1}+X_{2}\right)-\mu_{X}, \\ T_{3} &=\frac{1}{2}\left(Y_{1}+Y_{2}\right)-\mu_{Y}, \quad T_{4}=\frac{1}{2}\left(Z_{1}+Z_{2}\right)-\mu_{Z}, \\ T_{5} &=W_{1}-W_{2}, \quad T_{6}=X_{1}-X_{2}, \quad T_{7}=Y_{1}-Y_{2}, \quad T_{8}=Z_{1}-Z_{2}. \end{aligned} $$ Hence obtain the distribution of $$ U=4\, \frac{T_{1}^{2}+T_{2}^{2}+T_{3}^{2}+T_{4}^{2}}{T_{5}^{2}+T_{6}^{2}+T_{7}^{2}+T_{8}^{2}}. $$ Show that the random variables \(U /(1+U)\) and \(1 /(1+U)\) are identically distributed, without finding their probability density functions. Find their common density function and hence determine \(\operatorname{Pr}(U \leq 2)\).
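A Monte Carlo sketch of the final probability (the seed, sample size, and \(\sigma=1\) are mine). Note that \(T_{1}, \ldots, T_{4}\) are \(N(0, \sigma^{2}/2)\) and \(T_{5}, \ldots, T_{8}\) are \(N(0, 2\sigma^{2})\), all independent, so \(U\) is a ratio of two independent \(\chi^{2}_{4}\) variables, each divided by its degrees of freedom; carrying the algebra through gives \(\Pr(U \leq 2) = 20/27 \approx 0.741\), which the simulation should reproduce.

```python
import random
from math import sqrt

random.seed(2)
sigma, n, count = 1.0, 100_000, 0
for _ in range(n):
    # T_1..T_4 ~ N(0, sigma^2/2); T_5..T_8 ~ N(0, 2*sigma^2), independent.
    num = sum(random.gauss(0, sigma / sqrt(2)) ** 2 for _ in range(4))
    den = sum(random.gauss(0, sigma * sqrt(2)) ** 2 for _ in range(4))
    if 4 * num / den <= 2:
        count += 1
p_hat = count / n  # Monte Carlo estimate of Pr(U <= 2)
```

The estimate should be close to \(20/27\), consistent with the common Beta\((2,2)\) density \(6x(1-x)\) of \(U/(1+U)\) and \(1/(1+U)\).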
Problem 3
The Cholesky decomposition of a \(p \times p\) symmetric positive definite matrix \(\Omega\) is the unique lower triangular \(p \times p\) matrix \(L\), with positive diagonal elements, such that \(L L^{\mathrm{T}}=\Omega\). Find the distribution of \(\mu+L Z\), where \(Z\) is a vector containing a standard normal random sample \(Z_{1}, \ldots, Z_{p}\), and hence give an algorithm to generate from the multivariate normal distribution.
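A sketch of the resulting algorithm in Python with NumPy (the example \(\mu\) and \(\Omega\) are mine): factor \(\Omega = LL^{\mathrm{T}}\), draw \(Z \sim N(0, I_{p})\), and return \(\mu + LZ\), which is \(N(\mu, \Omega)\).

```python
import numpy as np

def mvn_sample(mu, omega, rng):
    """Draw one N(mu, omega) variate via the Cholesky factor.

    If omega = L @ L.T and Z ~ N(0, I_p), then mu + L @ Z ~ N(mu, omega).
    """
    L = np.linalg.cholesky(omega)     # lower triangular, L @ L.T == omega
    z = rng.standard_normal(len(mu))  # standard normal sample Z_1, ..., Z_p
    return mu + L @ z

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
omega = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
draws = np.array([mvn_sample(mu, omega, rng) for _ in range(20_000)])
sample_mean = draws.mean(axis=0)  # should be near mu
sample_cov = np.cov(draws.T)      # should be near omega
```

The empirical mean and covariance of the draws should match \(\mu\) and \(\Omega\).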
Problem 4
If inversion can be used to generate a variable \(Y\) with distribution function \(F\), discuss how to generate values from \(F\) conditioned on the events (a) \(Y \leq y_{U}\), (b) \(y_{L} < Y \leq y_{U}\).
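A sketch of the conditioned inversion (the function names and the Exp(1) example are mine): instead of inverting a \(U(0,1)\) variable, rescale it to be uniform on \((F(y_{L}), F(y_{U})]\) before applying \(F^{-1}\), so the result always lands in the conditioning event.

```python
import random
from math import exp, log

def trunc_inversion(F, F_inv, y_L=None, y_U=None, u=None):
    """Sample Y ~ F conditioned on y_L < Y <= y_U by inversion.

    F_inv(F(y_L) + U*(F(y_U) - F(y_L))) has the conditional distribution,
    because the rescaled uniform lies in (F(y_L), F(y_U)].  Omitting a
    bound leaves that side unrestricted (case (a) sets only y_U).
    """
    if u is None:
        u = random.random()
    lo = F(y_L) if y_L is not None else 0.0
    hi = F(y_U) if y_U is not None else 1.0
    return F_inv(lo + u * (hi - lo))

# Illustration with the Exp(1) distribution (my choice): F(y) = 1 - e^{-y}.
F = lambda y: 1.0 - exp(-y)
F_inv = lambda p: -log(1.0 - p)

random.seed(3)
y = trunc_inversion(F, F_inv, y_L=1.0, y_U=2.0)  # always lies in (1, 2]
```

Only one uniform variable is used per draw, and no rejection is needed.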
Problem 5
If \(Z\) is standard normal, then \(Y=\exp (\mu+\sigma Z)\) is said to have the log-normal distribution. Show that \(\mathrm{E}\left(Y^{r}\right)=\exp (r \mu) M_{Z}(r \sigma)\), where \(M_{Z}\) is the moment-generating function of \(Z\), and hence give expressions for the mean and variance of \(Y\). Show that although all its moments are finite, \(Y\) does not have a moment-generating function.
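For reference, the moment calculation can be written out as follows (a standard derivation, using \(M_{Z}(t) = e^{t^{2}/2}\) for the standard normal):

```latex
\mathrm{E}(Y^{r}) = \mathrm{E}\!\left\{e^{r\mu + r\sigma Z}\right\}
 = e^{r\mu}\,\mathrm{E}\!\left(e^{r\sigma Z}\right)
 = e^{r\mu} M_{Z}(r\sigma)
 = \exp\!\left(r\mu + \tfrac{1}{2}r^{2}\sigma^{2}\right),
\qquad
\mathrm{E}(Y) = e^{\mu + \sigma^{2}/2},
\qquad
\operatorname{var}(Y) = e^{2\mu + \sigma^{2}}\!\left(e^{\sigma^{2}} - 1\right).
```

By contrast \(\mathrm{E}(e^{tY})\) diverges for every \(t > 0\), since \(e^{ty}\) grows faster than any power of \(y\) while the log-normal density decays only like a power of \(y\) times \(e^{-(\log y)^{2}/(2\sigma^{2})}\).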
Problem 5
One way to construct a confidence interval for a real parameter \(\theta\) is to take the whole real line \((-\infty, \infty)\) with probability \(1-2 \alpha\), and otherwise to take the empty set \(\emptyset\). Show that this procedure has exact coverage \(1-2 \alpha\). Is it a good procedure?
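A small simulation makes the point concrete (the value of \(\theta\), the seed, and the function name are mine): the interval covers any \(\theta\) exactly when the whole line is reported, so coverage is \(1-2\alpha\) regardless of \(\theta\), even though the interval conveys no information about it.

```python
import random

def silly_interval(alpha, u=None):
    """Return (-inf, inf) with probability 1 - 2*alpha, else None (empty set)."""
    if u is None:
        u = random.random()
    return (float("-inf"), float("inf")) if u <= 1 - 2 * alpha else None

random.seed(4)
theta, alpha, n = 3.7, 0.025, 100_000
covered = 0
for _ in range(n):
    ci = silly_interval(alpha)
    if ci is not None and ci[0] < theta < ci[1]:
        covered += 1
coverage = covered / n  # close to 1 - 2*alpha = 0.95, whatever theta is
```

The simulation illustrates why exact coverage alone does not make a procedure good.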
Problem 5
If \(X\) has density \(\lambda e^{-\lambda x}\), \(x>0\), show that \(\operatorname{Pr}(r-1 \leq X \leq r)=e^{-\lambda(r-1)}\left(1-e^{-\lambda}\right)\). If \(Y\) has geometric density \(\operatorname{Pr}(Y=r)=\pi(1-\pi)^{r-1}\), for \(r=1,2, \ldots\) and \(0<\pi<1\), show that \(Y \stackrel{D}{=}\lceil\log U / \log (1-\pi)\rceil\), where \(U\) is a \(U(0,1)\) variable and \(\lceil\cdot\rceil\) denotes rounding up. Hence give an algorithm to generate geometric variables.
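The algorithm is one line once the identity is established; a Python sketch (the function name and \(\pi=0.3\) are mine):

```python
import random
from math import ceil, log

def geometric(pi, u=None):
    """Generate Y with Pr(Y = r) = pi * (1 - pi)**(r - 1) by inversion.

    Pr(Y >= r) = (1 - pi)**(r - 1) = Pr(U < (1 - pi)**(r - 1)), which gives
    Y =D ceil(log(U) / log(1 - pi)) for U ~ U(0,1).
    """
    if u is None:
        u = random.random()
    return ceil(log(u) / log(1.0 - pi))

random.seed(5)
pi, n = 0.3, 100_000
mean = sum(geometric(pi) for _ in range(n)) / n  # close to E(Y) = 1/pi
```

Each draw uses a single uniform variable; the empirical mean should be near \(1/\pi\).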
Problem 6
A binomial variable \(R\) has mean \(m \pi\) and variance \(m \pi(1-\pi)\). Find the variance function of \(Y=R / m\), and hence obtain the variance-stabilizing transform for \(R\).
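The calculation can be sketched as follows (a standard delta-method argument, written out here for reference): choose \(g\) so that \(g'(\pi)\) is proportional to the reciprocal square root of the variance function,

```latex
\operatorname{var}(Y) = \frac{\pi(1-\pi)}{m} = V(\pi), \qquad
g'(\pi) \propto V(\pi)^{-1/2} \;\Longrightarrow\;
g(\pi) = \int \frac{d\pi}{\sqrt{\pi(1-\pi)}} = 2\arcsin\sqrt{\pi},
```

so that \(\arcsin\{(R/m)^{1/2}\}\) has approximate variance \(1/(4m)\), free of \(\pi\).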