Chapter 9: Problem 4
Show that the square of a noncentral \(T\) random variable is a noncentral \(F\) random variable.
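A minimal sketch of the standard argument, assuming the chapter's definitions of the noncentral \(T\) and noncentral \(F\) distributions: write \(T=W/\sqrt{V/r}\), where \(W\) is \(N(\delta,1)\), \(V\) is \(\chi^{2}(r)\), and \(W\) and \(V\) are independent. Then
$$
T^{2}=\frac{W^{2}/1}{V/r}.
$$
Here \(W^{2}\) is noncentral \(\chi^{2}(1)\) with noncentrality parameter \(\delta^{2}\) (it is the square of a normal variable with unit variance and mean \(\delta\)), and it is independent of \(V\). Thus \(T^{2}\) is a noncentral chi-square divided by its one degree of freedom, over an independent central chi-square divided by its \(r\) degrees of freedom, which is by definition a noncentral \(F(1,r)\) random variable with noncentrality parameter \(\delta^{2}\).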
More problems from this chapter:

Students' scores on the mathematics portion of the ACT examination, \(x\), and on the final examination in first-semester calculus (200 points possible), \(y\), are given. (a) Calculate the least squares regression line for these data. (b) Plot the points and the least squares regression line on the same graph. (c) Find point estimates for \(\alpha, \beta\), and \(\sigma^{2}\). (d) Find 95 percent confidence intervals for \(\alpha\) and \(\beta\) under the usual assumptions.
$$
\begin{array}{cc|cc}
\hline
x & y & x & y \\
\hline
25 & 138 & 20 & 100 \\
20 & 84 & 25 & 143 \\
26 & 104 & 26 & 141 \\
26 & 112 & 28 & 161 \\
28 & 88 & 25 & 124 \\
28 & 132 & 31 & 118 \\
29 & 90 & 30 & 168 \\
32 & 183 & & \\
\hline
\end{array}
$$
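As an illustration only (not part of the exercise), here is a short Python sketch that computes the point estimates in parts (a) and (c) for these data. It uses the ordinary parametrization \(y=\alpha+\beta x\) and the maximum likelihood estimate \(\hat{\sigma}^{2}=\mathrm{SSE}/n\); the parametrization is an assumption, since the textbook may instead center \(x\) about \(\bar{x}\).

```python
import numpy as np

# Data from the exercise: ACT math score (x) and calculus final score (y).
x = np.array([25, 20, 26, 26, 28, 28, 29, 32, 20, 25, 26, 28, 25, 31, 30])
y = np.array([138, 84, 104, 112, 88, 132, 90, 183, 100, 143, 141, 161, 124, 118, 168])

n = len(x)
xbar, ybar = x.mean(), y.mean()

# Least squares slope and intercept for the model y = alpha + beta * x
# (assumed parametrization; the text may use y = alpha + beta * (x - xbar)).
beta_hat = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
alpha_hat = ybar - beta_hat * xbar

# Maximum likelihood estimate of sigma^2: SSE / n.
residuals = y - (alpha_hat + beta_hat * x)
sigma2_hat = np.sum(residuals ** 2) / n

print(alpha_hat, beta_hat, sigma2_hat)
```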
With the background of the two-way classification with \(c>1\) observations per cell, show that the maximum likelihood estimators of the parameters are
$$
\begin{aligned}
\hat{\alpha}_{i} &=\bar{X}_{i \cdot \cdot}-\bar{X}_{\cdots} \\
\hat{\beta}_{j} &=\bar{X}_{\cdot j \cdot}-\bar{X}_{\cdots} \\
\hat{\gamma}_{i j} &=\bar{X}_{i j \cdot}-\bar{X}_{i \cdot \cdot}-\bar{X}_{\cdot j \cdot}+\bar{X}_{\cdots} \\
\hat{\mu} &=\bar{X}_{\cdots}
\end{aligned}
$$
Show that these are unbiased estimators of the respective parameters. Compute the variance of each estimator.
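A sketch of one of the variance computations, assuming the \(X_{ijk}\) are independent \(N(\mu+\alpha_{i}+\beta_{j}+\gamma_{ij},\sigma^{2})\) with the usual side conditions: since \(\bar{X}_{\cdots}\) averages \(abc\) independent observations,
$$
\operatorname{Var}(\hat{\mu})=\operatorname{Var}\left(\bar{X}_{\cdots}\right)=\frac{\sigma^{2}}{abc},
$$
and, using \(\operatorname{Cov}\left(\bar{X}_{i \cdot \cdot}, \bar{X}_{\cdots}\right)=\sigma^{2}/(abc)\),
$$
\operatorname{Var}(\hat{\alpha}_{i})=\frac{\sigma^{2}}{bc}+\frac{\sigma^{2}}{abc}-\frac{2\sigma^{2}}{abc}=\frac{(a-1)\sigma^{2}}{abc}.
$$
The variances of \(\hat{\beta}_{j}\) and \(\hat{\gamma}_{ij}\) follow the same pattern.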
If \(A_{1}, A_{2}, \ldots, A_{k}\) are events, prove, by induction, Boole's inequality
$$
P\left(A_{1} \cup A_{2} \cup \cdots \cup A_{k}\right) \leq \sum_{i=1}^{k} P\left(A_{i}\right)
$$
Then show that
$$
P\left(A_{1}^{c} \cap A_{2}^{c} \cap \cdots \cap A_{k}^{c}\right) \geq 1-\sum_{i=1}^{k} P\left(A_{i}\right)
$$
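A sketch of the induction step, assuming the two-event case \(P(B \cup C) \leq P(B)+P(C)\) as the base:
$$
P\left(\bigcup_{i=1}^{k+1} A_{i}\right)=P\left(\left(\bigcup_{i=1}^{k} A_{i}\right) \cup A_{k+1}\right) \leq P\left(\bigcup_{i=1}^{k} A_{i}\right)+P\left(A_{k+1}\right) \leq \sum_{i=1}^{k+1} P\left(A_{i}\right).
$$
The second inequality then follows from De Morgan's law, since \(P\left(A_{1}^{c} \cap \cdots \cap A_{k}^{c}\right)=1-P\left(A_{1} \cup \cdots \cup A_{k}\right)\).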
Using the notation of Section 9.2, assume that the means \(\mu_{j}\) satisfy a linear function of \(j\), namely \(\mu_{j}=c+d[j-(b+1)/2]\). Let independent random samples of size \(a\) be taken from the \(b\) normal distributions having means \(\mu_{1}, \mu_{2}, \ldots, \mu_{b}\), respectively, and common unknown variance \(\sigma^{2}\). (a) Show that the maximum likelihood estimators of \(c\) and \(d\) are, respectively, \(\hat{c}=\bar{X}_{. .}\) and
$$
\hat{d}=\frac{\sum_{j=1}^{b}[j-(b+1) / 2]\left(\bar{X}_{. j}-\bar{X}_{. .}\right)}{\sum_{j=1}^{b}[j-(b+1) / 2]^{2}}
$$
(b) Show that
$$
\sum_{i=1}^{a} \sum_{j=1}^{b}\left(X_{i j}-\bar{X}_{. .}\right)^{2}=\sum_{i=1}^{a} \sum_{j=1}^{b}\left[X_{i j}-\bar{X}_{. .}-\hat{d}\left(j-\frac{b+1}{2}\right)\right]^{2}+\hat{d}^{2} \sum_{j=1}^{b} a\left(j-\frac{b+1}{2}\right)^{2}
$$
(c) Argue that the two terms in the right-hand member of Part (b), once divided by \(\sigma^{2}\), are independent random variables with \(\chi^{2}\) distributions, provided that \(d=0\). (d) What \(F\)-statistic would be used to test the equality of the means, that is, \(H_{0}: d=0\)?
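A sketch of the statistic for Part (d), assuming the degrees of freedom suggested by Part (b): the \(\hat{d}^{2}\) term carries one degree of freedom, and the residual term carries \(ab-2\) (two parameters, \(c\) and \(d\), are estimated). Then
$$
F=\frac{\hat{d}^{2} \sum_{j=1}^{b} a\left(j-\frac{b+1}{2}\right)^{2} \Big/ 1}{\sum_{i=1}^{a} \sum_{j=1}^{b}\left[X_{i j}-\bar{X}_{. .}-\hat{d}\left(j-\frac{b+1}{2}\right)\right]^{2} \Big/ (ab-2)}
$$
has an \(F(1, ab-2)\) distribution under \(H_{0}: d=0\).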
Often in regression the mean of the random variable \(Y\) is a linear function of \(p\) values \(x_{1}, x_{2}, \ldots, x_{p}\), say \(\beta_{1} x_{1}+\beta_{2} x_{2}+\cdots+\beta_{p} x_{p}\), where \(\boldsymbol{\beta}^{\prime}=\left(\beta_{1}, \beta_{2}, \ldots, \beta_{p}\right)\) are the regression coefficients. Suppose that \(n\) values, \(\boldsymbol{Y}^{\prime}=\left(Y_{1}, Y_{2}, \ldots, Y_{n}\right)\), are observed for the \(x\)-values in \(\boldsymbol{X}=\left[x_{i j}\right]\), where \(\boldsymbol{X}\) is an \(n \times p\) design matrix and its \(i\)th row is associated with \(Y_{i}, i=1,2, \ldots, n\). Assume that \(\boldsymbol{Y}\) is multivariate normal with mean \(\boldsymbol{X} \boldsymbol{\beta}\) and variance-covariance matrix \(\sigma^{2} \boldsymbol{I}\), where \(\boldsymbol{I}\) is the \(n \times n\) identity matrix. (a) Note that \(Y_{1}, Y_{2}, \ldots, Y_{n}\) are independent. Why? (b) Since \(\boldsymbol{Y}\) should approximately equal its mean \(\boldsymbol{X} \boldsymbol{\beta}\), we estimate \(\boldsymbol{\beta}\) by solving the normal equations \(\boldsymbol{X}^{\prime} \boldsymbol{Y}=\boldsymbol{X}^{\prime} \boldsymbol{X} \boldsymbol{\beta}\) for \(\boldsymbol{\beta}\). Assuming that \(\boldsymbol{X}^{\prime} \boldsymbol{X}\) is nonsingular, solve the equations to get \(\hat{\boldsymbol{\beta}}=\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime} \boldsymbol{Y}\). Show that \(\hat{\boldsymbol{\beta}}\) has a multivariate normal distribution with mean \(\boldsymbol{\beta}\) and variance-covariance matrix
$$
\sigma^{2}\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1}
$$
(c) Show that
$$
(\boldsymbol{Y}-\boldsymbol{X} \boldsymbol{\beta})^{\prime}(\boldsymbol{Y}-\boldsymbol{X} \boldsymbol{\beta})=(\hat{\boldsymbol{\beta}}-\boldsymbol{\beta})^{\prime}\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)(\hat{\boldsymbol{\beta}}-\boldsymbol{\beta})+(\boldsymbol{Y}-\boldsymbol{X} \hat{\boldsymbol{\beta}})^{\prime}(\boldsymbol{Y}-\boldsymbol{X} \hat{\boldsymbol{\beta}}),
$$
say \(Q=Q_{1}+Q_{2}\) for convenience. (d) Show that \(Q_{1} / \sigma^{2}\) is \(\chi^{2}(p)\). (e) Show that \(Q_{1}\) and \(Q_{2}\) are independent. (f) Argue that \(Q_{2} / \sigma^{2}\) is \(\chi^{2}(n-p)\). (g) Find \(c\) so that \(c Q_{1} / Q_{2}\) has an \(F\)-distribution. (h) The fact that a value \(d\) can be found so that \(P\left(c Q_{1} / Q_{2} \leq d\right)=1-\alpha\) could be used to find a \(100(1-\alpha)\) percent confidence ellipsoid for \(\boldsymbol{\beta}\). Explain.
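As an illustration only, a short Python sketch (with hypothetical simulated data, not from the text) of the quantities in parts (b), (c), and (g). Since \(Q_{1}/\sigma^{2}\) is \(\chi^{2}(p)\), \(Q_{2}/\sigma^{2}\) is \(\chi^{2}(n-p)\), and the two are independent, taking \(c=(n-p)/p\) makes \(cQ_{1}/Q_{2}\) an \(F(p, n-p)\) variable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: n = 20 observations, p = 3 regression coefficients.
n, p = 20, 3
X = rng.normal(size=(n, p))        # design matrix, assumed full rank
beta = np.array([1.0, -2.0, 0.5])  # "true" coefficients, chosen for the demo
sigma = 1.5
Y = X @ beta + rng.normal(scale=sigma, size=n)

# Solve the normal equations X'Y = X'X beta  =>  beta_hat = (X'X)^{-1} X'Y.
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ Y

# Decomposition Q = Q1 + Q2 from part (c).
Q1 = (beta_hat - beta) @ (X.T @ X) @ (beta_hat - beta)  # /sigma^2 ~ chi^2(p)
Q2 = (Y - X @ beta_hat) @ (Y - X @ beta_hat)            # /sigma^2 ~ chi^2(n-p)

# Part (g): c = (n - p)/p makes c*Q1/Q2 an F(p, n-p) variable.
F = ((n - p) / p) * Q1 / Q2
print(beta_hat, F)
```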