Chapter 9: Problem 4
Show that the square of a noncentral \(T\) random variable is a noncentral \(F\) random variable.
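The identity behind this exercise is that if \(T = W/\sqrt{V/r}\) with \(W \sim N(\delta, 1)\) independent of \(V \sim \chi^2(r)\), then \(T^2 = W^2/(V/r)\), and \(W^2\) has a noncentral chi-square distribution with 1 degree of freedom and noncentrality \(\delta^2\), so \(T^2\) is noncentral \(F(1, r)\) with noncentrality \(\delta^2\). As a sanity check (not a proof), the claim can be verified numerically by comparing CDFs, using illustrative values for the degrees of freedom and noncentrality:

```python
# Numerical check: if T is noncentral t with df degrees of freedom and
# noncentrality delta, then T^2 should be noncentral F(1, df) with
# noncentrality delta^2.  For x > 0,
#   P(T^2 <= x) = P(-sqrt(x) <= T <= sqrt(x)),
# so the two CDFs must agree on a grid of x values.
import numpy as np
from scipy.stats import nct, ncf

df, delta = 7, 1.5                      # illustrative values
xs = np.linspace(0.1, 10.0, 50)

cdf_T2 = nct.cdf(np.sqrt(xs), df, delta) - nct.cdf(-np.sqrt(xs), df, delta)
cdf_F = ncf.cdf(xs, 1, df, delta**2)

assert np.allclose(cdf_T2, cdf_F, atol=1e-8)
```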
Let \(\mathbf{A}=\left[a_{i j}\right]\) be a real symmetric matrix. Prove that \(\sum_{i} \sum_{j} a_{i j}^{2}\) is equal to the sum of the squares of the eigenvalues of \(\mathbf{A}\). Hint: If \(\boldsymbol{\Gamma}\) is an orthogonal matrix, show that \(\sum_{j} \sum_{i} a_{i j}^{2}=\operatorname{tr}\left(\mathbf{A}^{2}\right)=\operatorname{tr}\left(\boldsymbol{\Gamma}^{\prime} \mathbf{A}^{2} \boldsymbol{\Gamma}\right)=\operatorname{tr}\left[\left(\boldsymbol{\Gamma}^{\prime} \mathbf{A} \boldsymbol{\Gamma}\right)\left(\boldsymbol{\Gamma}^{\prime} \mathbf{A} \boldsymbol{\Gamma}\right)\right]\).
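The chain of equalities in the hint can be illustrated numerically before proving it: the sum of squared entries of a symmetric matrix (its squared Frobenius norm) equals \(\operatorname{tr}(\mathbf{A}^2)\), which is invariant under the orthogonal similarity that diagonalizes \(\mathbf{A}\). A quick check with a random symmetric matrix:

```python
# Numerical illustration of the hint: for a real symmetric A,
#   sum_{i,j} a_{ij}^2 = tr(A^2) = sum of the squared eigenvalues of A.
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                       # symmetrize

frobenius_sq = np.sum(A**2)             # sum of squared entries
trace_A2 = np.trace(A @ A)              # tr(A^2)
eig_sq_sum = np.sum(np.linalg.eigvalsh(A)**2)

assert np.isclose(frobenius_sq, trace_A2)
assert np.isclose(frobenius_sq, eig_sq_sum)
```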
Let the independent normal random variables \(Y_{1}, Y_{2}, \ldots, Y_{n}\) have, respectively, the probability density functions \(N\left(\mu, \gamma^{2} x_{i}^{2}\right), i=1,2, \ldots, n\), where the given constants \(x_{1}, x_{2}, \ldots, x_{n}\) are not all equal and none of them is zero. Discuss the test of the hypothesis \(H_{0}: \gamma=1, \mu\) unspecified, against all alternatives \(H_{1}: \gamma \neq 1, \mu\) unspecified.
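One way to work this exercise (a sketch of our reading, not the textbook's worked solution): with \(\sigma_i = \gamma|x_i|\), the MLE of \(\mu\) for any fixed \(\gamma\) is the weighted mean with weights \(1/x_i^2\), so the likelihood ratio \(\Lambda\) depends on the data only through \(V = \sum (Y_i - \hat{\mu})^2/x_i^2\), with \(\Lambda = (V/n)^{n/2} e^{(n-V)/2}\). Under \(H_0\), \(V\) has a \(\chi^2(n-1)\) distribution, and the test rejects when \(V\) is too large or too small. The closed form can be verified against brute-force likelihood maximization on made-up data:

```python
# Hedged sketch of the LRT for H0: gamma = 1 (illustrative data, not from
# the text).  We check that log Lambda, computed directly from the two
# maximized log-likelihoods, matches the closed form
#   log Lambda = (n/2)*log(V/n) + (n - V)/2,
# where V = sum((y_i - mu_hat)^2 / x_i^2).
import numpy as np

x = np.array([1.0, 2.0, -1.5, 3.0, 0.5])
y = np.array([0.8, 2.5, -0.3, 1.9, 0.4])      # illustrative data
n = len(x)

w = 1.0 / x**2
mu_hat = np.sum(w * y) / np.sum(w)            # weighted-mean MLE of mu
V = np.sum((y - mu_hat)**2 / x**2)
gamma_hat2 = V / n                            # unrestricted MLE of gamma^2

def loglik(mu, gamma2):
    # log-likelihood for Y_i ~ N(mu, gamma^2 * x_i^2)
    return np.sum(-0.5 * np.log(2 * np.pi * gamma2 * x**2)
                  - (y - mu)**2 / (2 * gamma2 * x**2))

log_Lambda = loglik(mu_hat, 1.0) - loglik(mu_hat, gamma_hat2)
closed_form = (n / 2) * np.log(V / n) + (n - V) / 2
assert np.isclose(log_Lambda, closed_form)
```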
Assume that the sample \(\left(x_{1}, Y_{1}\right), \ldots,\left(x_{n}, Y_{n}\right)\) follows the linear model (9.6.1). Suppose \(Y_{0}\) is a future observation at \(x=x_{0}-\bar{x}\) and we want to determine a predictive interval for it. Assume that the model (9.6.1) holds for \(Y_{0}\); i.e., \(Y_{0}\) has a \(N\left(\alpha+\beta\left(x_{0}-\bar{x}\right), \sigma^{2}\right)\) distribution. We will use \(\hat{\eta}_{0}\) of Exercise 9.6.4 as our prediction of \(Y_{0}\).
(a) Obtain the distribution of \(Y_{0}-\hat{\eta}_{0}\). Use the fact that the future observation \(Y_{0}\) is independent of the sample \(\left(x_{1}, Y_{1}\right), \ldots,\left(x_{n}, Y_{n}\right)\).
(b) Determine a \(t\)-statistic with numerator \(Y_{0}-\hat{\eta}_{0}\).
(c) Now, beginning with \(1-\alpha=P\left[-t_{\alpha / 2, n-2}<T<t_{\alpha / 2, n-2}\right]\), where \(T\) is the \(t\)-statistic of part (b), determine a \((1-\alpha) 100 \%\) predictive interval for \(Y_{0}\).
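A sketch of the interval that results from parts (a)-(c), assuming the standard simple-regression formulas apply to model (9.6.1): \(Y_0 - \hat{\eta}_0 \sim N\bigl(0,\; \sigma^2[1 + 1/n + (x_0-\bar{x})^2/S_{xx}]\bigr)\) with \(S_{xx} = \sum(x_i-\bar{x})^2\), so dividing by \(\hat{\sigma} = \sqrt{\mathrm{SSE}/(n-2)}\) gives a \(t(n-2)\) statistic and the interval \(\hat{\eta}_0 \pm t_{\alpha/2,n-2}\,\hat{\sigma}\sqrt{1 + 1/n + (x_0-\bar{x})^2/S_{xx}}\). The data below are illustrative, not from the text:

```python
# Hedged sketch of the predictive interval for a future Y0 at x0, using
# the centered model y = alpha + beta*(x - xbar) + e (illustrative data).
import numpy as np
from scipy.stats import t as t_dist

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])       # illustrative data
n, x0, alpha = len(x), 3.5, 0.05

xbar = x.mean()
Sxx = np.sum((x - xbar)**2)
beta_hat = np.sum((x - xbar) * y) / Sxx
alpha_hat = y.mean()                          # intercept in the centered model
eta0_hat = alpha_hat + beta_hat * (x0 - xbar)

resid = y - (alpha_hat + beta_hat * (x - xbar))
sigma_hat = np.sqrt(np.sum(resid**2) / (n - 2))

se = sigma_hat * np.sqrt(1 + 1/n + (x0 - xbar)**2 / Sxx)
tcrit = t_dist.ppf(1 - alpha / 2, n - 2)
lower, upper = eta0_hat - tcrit * se, eta0_hat + tcrit * se
```

The interval is centered at \(\hat{\eta}_0\) and is wider than the usual confidence interval for the mean response, because the variance carries the extra "1" for the new observation's own error.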
Let \(X_{1}, X_{2}, X_{3}, X_{4}\) be a random sample of size \(n=4\) from the normal distribution \(N(0,1)\). Show that \(\sum_{i=1}^{4}\left(X_{i}-\bar{X}\right)^{2}\) equals $$ \frac{\left(X_{1}-X_{2}\right)^{2}}{2}+\frac{\left[X_{3}-\left(X_{1}+X_{2}\right) / 2\right]^{2}}{3 / 2}+\frac{\left[X_{4}-\left(X_{1}+X_{2}+X_{3}\right) / 3\right]^{2}}{4 / 3} $$ and argue that these three terms are independent, each with a chi-square distribution with 1 degree of freedom.
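The algebraic identity holds for any four numbers, not just normal samples (the normality is what the independence and chi-square claims rest on). A quick numerical check of the identity with arbitrary values:

```python
# Numerical check of the algebraic identity only; the independence and
# chi-square(1) claims are the part of the exercise left to prove.
import numpy as np

x1, x2, x3, x4 = 0.3, -1.1, 2.0, 0.7          # arbitrary illustrative values
xbar = (x1 + x2 + x3 + x4) / 4

lhs = sum((xi - xbar)**2 for xi in (x1, x2, x3, x4))
rhs = ((x1 - x2)**2 / 2
       + (x3 - (x1 + x2) / 2)**2 / (3 / 2)
       + (x4 - (x1 + x2 + x3) / 3)**2 / (4 / 3))

assert np.isclose(lhs, rhs)
```

The three terms are the squared Helmert contrasts of the sample, which is one standard route to proving the independence claim.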
Let \(Y_{1}, Y_{2}, \ldots, Y_{n}\) be \(n\) independent normal variables with common unknown variance \(\sigma^{2}\). Let \(Y_{i}\) have mean \(\beta x_{i}, i=1,2, \ldots, n\), where \(x_{1}, x_{2}, \ldots, x_{n}\) are known but not all the same and \(\beta\) is an unknown constant. Find the likelihood ratio test for \(H_{0}: \beta=0\) against all alternatives. Show that this likelihood ratio test can be based on a statistic that has a well-known distribution.
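A hedged sketch of where this exercise leads (our reading, with illustrative data): the MLE of \(\beta\) is \(\hat{\beta} = \sum x_i Y_i / \sum x_i^2\), and the likelihood ratio \(\Lambda = (\mathrm{SSE}_{\text{full}}/\mathrm{SSE}_0)^{n/2}\) is a monotone function of \(T = \hat{\beta}\sqrt{\sum x_i^2}\big/\sqrt{\mathrm{SSE}_{\text{full}}/(n-1)}\), which has a \(t(n-1)\) distribution under \(H_0:\beta=0\). The link \(\Lambda^{-2/n} = 1 + T^2/(n-1)\) follows from \(\mathrm{SSE}_0 = \mathrm{SSE}_{\text{full}} + \hat{\beta}^2\sum x_i^2\) and can be checked numerically:

```python
# Hedged sketch: verify Lambda^{-2/n} = SSE_0/SSE_full = 1 + T^2/(n-1)
# on illustrative data, where T is the t-statistic the LRT reduces to.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([0.9, 2.3, 2.7, 4.4])            # illustrative data
n = len(x)

beta_hat = np.sum(x * y) / np.sum(x**2)       # least-squares/ML slope
sse_full = np.sum((y - beta_hat * x)**2)      # SSE under the full model
sse_null = np.sum(y**2)                       # SSE under H0: beta = 0

T = beta_hat * np.sqrt(np.sum(x**2)) / np.sqrt(sse_full / (n - 1))
assert np.isclose(sse_null / sse_full, 1 + T**2 / (n - 1))
```

Since \(\Lambda\) is small exactly when \(|T|\) is large, the likelihood ratio test is equivalent to the two-sided \(t\)-test based on \(T\).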