Chapter 9: Problem 4
Show that the square of a noncentral \(T\) random variable is a noncentral \(F\) random variable.
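One way to see the claim concretely: if \(T=(Z+\delta)/\sqrt{V/r}\) with \(Z \sim N(0,1)\) and \(V \sim \chi^{2}(r)\) independent, then \(T^{2}=\left[(Z+\delta)^{2}/1\right]/\left[V/r\right]\) is the ratio of a noncentral \(\chi^{2}(1,\delta^{2})\) over its one degree of freedom to an independent central \(\chi^{2}(r)\) over its \(r\) degrees of freedom, which is the definition of a noncentral \(F(1, r)\) with noncentrality \(\delta^{2}\). Below is a minimal simulation sketch of this fact; the sample size, seed, and parameter values \(r=7\), \(\delta=1.5\) are arbitrary choices, not part of the exercise.

```python
# Simulation sketch: the square of a noncentral T(r, delta) variable should
# match a noncentral F(1, r) with noncentrality delta^2.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
r, delta = 7, 1.5                         # degrees of freedom, noncentrality

# Build T directly from its definition: T = (Z + delta) / sqrt(V / r)
z = rng.standard_normal(100_000)
v = rng.chisquare(r, size=100_000)
t_squared = ((z + delta) / np.sqrt(v / r)) ** 2

# Compare the empirical distribution of T^2 against noncentral F(1, r, delta^2)
ks = stats.kstest(t_squared, stats.ncf(1, r, delta**2).cdf)
print(f"KS statistic = {ks.statistic:.4f}")   # small => distributions agree
```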
Let the independent random variables \(Y_{1}, \ldots, Y_{n}\) have the joint pdf $$ L\left(\alpha, \beta, \sigma^{2}\right)=\left(\frac{1}{2 \pi \sigma^{2}}\right)^{n / 2} \exp \left\{-\frac{1}{2 \sigma^{2}} \sum_{1}^{n}\left[y_{i}-\alpha-\beta\left(x_{i}-\bar{x}\right)\right]^{2}\right\} $$ where the given numbers \(x_{1}, x_{2}, \ldots, x_{n}\) are not all equal. Let \(H_{0}: \beta=0\) (\(\alpha\) and \(\sigma^{2}\) unspecified). It is desired to use a likelihood ratio test to test \(H_{0}\) against all possible alternatives. Find \(\Lambda\) and see whether the test can be based on a familiar statistic. Hint: In the notation of this section, show that $$ \sum_{1}^{n}\left(Y_{i}-\hat{\alpha}\right)^{2}=Q_{3}+\widehat{\beta}^{2} \sum_{1}^{n}\left(x_{i}-\bar{x}\right)^{2} $$
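The identity in the hint yields \(\Lambda^{-2/n}=Q_{0}/Q_{3}=1+T^{2}/(n-2)\), where \(Q_{0}=\sum(Y_{i}-\hat{\alpha})^{2}\) and \(T\) is the usual \(t\)-statistic for testing \(\beta=0\). The following is a numerical sketch of that identity on synthetic data; the sample size, design points, coefficients, and seed are arbitrary assumptions for illustration.

```python
# Numerical sketch of the identity Lambda^(-2/n) = 1 + T^2/(n-2) for the
# likelihood ratio test of H0: beta = 0 in the simple linear model.
import numpy as np

rng = np.random.default_rng(1)
n = 25
x = np.linspace(0, 4, n)
y = 1.0 + 0.5 * (x - x.mean()) + rng.standard_normal(n)

xc = x - x.mean()
beta_hat = xc @ y / (xc @ xc)             # MLE of beta under the full model
alpha_hat = y.mean()                      # MLE of alpha under both models
q3 = np.sum((y - alpha_hat - beta_hat * xc) ** 2)   # residual SS (full model)
q0 = np.sum((y - alpha_hat) ** 2)                   # SS under H0

lam = (q3 / q0) ** (n / 2)                # likelihood ratio Lambda
t_stat = beta_hat * np.sqrt(xc @ xc) / np.sqrt(q3 / (n - 2))

print(lam ** (-2 / n), 1 + t_stat**2 / (n - 2))     # the two sides agree
```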
Suppose \(\mathbf{A}\) is a real symmetric matrix. If the eigenvalues of \(\mathbf{A}\) are only 0's and 1's, then prove that \(\mathbf{A}\) is idempotent.
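A quick numerical illustration (not a substitute for the proof) uses the spectral decomposition \(\mathbf{A}=\mathbf{P D P}^{\prime}\): when the diagonal of \(\mathbf{D}\) contains only 0's and 1's, \(\mathbf{D}^{2}=\mathbf{D}\), and hence \(\mathbf{A}^{2}=\mathbf{P D}^{2} \mathbf{P}^{\prime}=\mathbf{A}\). The matrix size and eigenvalue pattern below are arbitrary choices.

```python
# Illustration: a real symmetric matrix built from an orthonormal eigenbasis
# with eigenvalues in {0, 1} satisfies A @ A = A.
import numpy as np

rng = np.random.default_rng(2)
q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal matrix
d = np.diag([1.0, 1.0, 0.0, 0.0])                 # eigenvalues are 0's and 1's
a = q @ d @ q.T                                   # real symmetric by construction

print(np.allclose(a @ a, a))   # True: A is idempotent
```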
Given the following observations associated with a two-way classification with \(a=3\) and \(b=4\), compute the \(F\)-statistics used to test the equality of the column means \(\left(\beta_{1}=\beta_{2}=\beta_{3}=\beta_{4}=0\right)\) and the equality of the row means \(\left(\alpha_{1}=\alpha_{2}=\alpha_{3}=0\right)\), respectively. $$ \begin{array}{ccccc} \hline \text { Row/Column } & 1 & 2 & 3 & 4 \\ \hline 1 & 3.1 & 4.2 & 2.7 & 4.9 \\ 2 & 2.7 & 2.9 & 1.8 & 3.0 \\ 3 & 4.0 & 4.6 & 3.0 & 3.9 \\ \hline \end{array} $$
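For this balanced layout with one observation per cell, the column test uses \(F=\left[\mathrm{SS}_{\text{col}}/(b-1)\right]/\left[\mathrm{SS}_{\text{err}}/((a-1)(b-1))\right]\) and the row test uses \(F=\left[\mathrm{SS}_{\text{row}}/(a-1)\right]/\left[\mathrm{SS}_{\text{err}}/((a-1)(b-1))\right]\). A short computational sketch of these formulas on the data above:

```python
# Two-way classification, a = 3 rows, b = 4 columns, one observation per cell.
import numpy as np

x = np.array([[3.1, 4.2, 2.7, 4.9],
              [2.7, 2.9, 1.8, 3.0],
              [4.0, 4.6, 3.0, 3.9]])
a, b = x.shape
grand = x.mean()
row_means = x.mean(axis=1, keepdims=True)
col_means = x.mean(axis=0, keepdims=True)

ss_row = b * np.sum((row_means - grand) ** 2)
ss_col = a * np.sum((col_means - grand) ** 2)
ss_err = np.sum((x - row_means - col_means + grand) ** 2)

ms_err = ss_err / ((a - 1) * (b - 1))
f_col = (ss_col / (b - 1)) / ms_err     # F on (b-1, (a-1)(b-1)) df
f_row = (ss_row / (a - 1)) / ms_err     # F on (a-1, (a-1)(b-1)) df
print(f"F (columns) = {f_col:.3f},  F (rows) = {f_row:.3f}")
```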
Using the notation of Section 9.2, assume that the means \(\mu_{j}\) satisfy a linear function of \(j\), namely \(\mu_{j}=c+d[j-(b+1) / 2]\). Let independent random samples of size \(a\) be taken from the \(b\) normal distributions having means \(\mu_{1}, \mu_{2}, \ldots, \mu_{b}\), respectively, and common unknown variance \(\sigma^{2}\). (a) Show that the maximum likelihood estimators of \(c\) and \(d\) are, respectively, \(\hat{c}=\bar{X}_{..}\) and $$ \hat{d}=\frac{\sum_{j=1}^{b}[j-(b+1) / 2]\left(\bar{X}_{. j}-\bar{X}_{..}\right)}{\sum_{j=1}^{b}[j-(b+1) / 2]^{2}} $$ (b) Show that $$ \begin{aligned} \sum_{i=1}^{a} \sum_{j=1}^{b}\left(X_{i j}-\bar{X}_{..}\right)^{2} &= \sum_{i=1}^{a} \sum_{j=1}^{b}\left[X_{i j}-\bar{X}_{..}-\hat{d}\left(j-\frac{b+1}{2}\right)\right]^{2} \\ &\quad + \hat{d}^{2} \sum_{j=1}^{b} a\left(j-\frac{b+1}{2}\right)^{2} \end{aligned} $$ (c) Argue that the two terms in the right-hand member of Part (b), once divided by \(\sigma^{2}\), are independent random variables with \(\chi^{2}\) distributions provided that \(d=0\). (d) What \(F\)-statistic would be used to test the equality of the means, that is, \(H_{0}: d=0\)?
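As a concrete check of Parts (a) and (d), the sketch below simulates data from the stated model with \(w_{j}=j-(b+1)/2\) and computes \(\hat{c}\), \(\hat{d}\), and the statistic \(F=\hat{d}^{2}\, a \sum_{j} w_{j}^{2} /\left[Q /(a b-2)\right]\), where \(Q\) is the first term on the right side of Part (b) and \(F\) is referred to an \(F(1, ab-2)\) distribution; the values of \(a\), \(b\), \(c\), \(d\), and the seed are arbitrary assumptions.

```python
# Simulation sketch of the estimators in Part (a) and the F-statistic of
# Part (d) for the linear-in-j means model mu_j = c + d * w_j.
import numpy as np

rng = np.random.default_rng(3)
a_n, b_n = 6, 5
j = np.arange(1, b_n + 1)
w = j - (b_n + 1) / 2                           # w_j = j - (b+1)/2
mu = 2.0 + 0.3 * w                              # true means, linear in j
x = mu + rng.standard_normal((a_n, b_n))        # a_n observations per column

c_hat = x.mean()                                # c_hat = grand mean
col_means = x.mean(axis=0)
d_hat = np.sum(w * (col_means - c_hat)) / np.sum(w ** 2)

q = np.sum((x - c_hat - d_hat * w) ** 2)        # first term of Part (b)
f_stat = (d_hat ** 2 * a_n * np.sum(w ** 2)) / (q / (a_n * b_n - 2))
print(f"c_hat = {c_hat:.3f}, d_hat = {d_hat:.3f}, F = {f_stat:.2f}")
```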
Let \(\mathbf{X}^{\prime}=\left[X_{1}, X_{2}\right]\) be bivariate normal with matrix of means \(\boldsymbol{\mu}^{\prime}=\left[\mu_{1}, \mu_{2}\right]\) and positive definite covariance matrix \(\mathbf{\Sigma}\). Let $$ Q_{1}=\frac{X_{1}^{2}}{\sigma_{1}^{2}\left(1-\rho^{2}\right)}-2 \rho \frac{X_{1} X_{2}}{\sigma_{1} \sigma_{2}\left(1-\rho^{2}\right)}+\frac{X_{2}^{2}}{\sigma_{2}^{2}\left(1-\rho^{2}\right)} $$ Show that \(Q_{1}\) is \(\chi^{2}(r, \theta)\) and find \(r\) and \(\theta\). When and only when does \(Q_{1}\) have a central chi-square distribution?
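Note that \(Q_{1}=\mathbf{X}^{\prime} \boldsymbol{\Sigma}^{-1} \mathbf{X}\), so one expects \(r=2\) and \(\theta=\boldsymbol{\mu}^{\prime} \boldsymbol{\Sigma}^{-1} \boldsymbol{\mu}\), with the central case exactly when \(\boldsymbol{\mu}=\mathbf{0}\). The following simulation sketch is consistent with this; the means, variances, \(\rho\), sample size, and seed are arbitrary assumptions.

```python
# Simulation sketch: Q1 = X' Sigma^{-1} X for bivariate normal X should behave
# as noncentral chi-square with r = 2 and theta = mu' Sigma^{-1} mu.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
mu = np.array([1.0, -0.5])
s1, s2, rho = 1.0, 2.0, 0.6
cov = np.array([[s1**2, rho * s1 * s2],
                [rho * s1 * s2, s2**2]])

x = rng.multivariate_normal(mu, cov, size=100_000)
siginv = np.linalg.inv(cov)
q1 = np.einsum("ni,ij,nj->n", x, siginv, x)     # Q1 for each draw

theta = mu @ siginv @ mu                        # conjectured noncentrality
ks = stats.kstest(q1, stats.ncx2(2, theta).cdf)
print(f"theta = {theta:.3f}, KS statistic = {ks.statistic:.4f}")
```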