Chapter 9: Problem 4
Show that the square of a noncentral \(T\) random variable is a noncentral \(F\) random variable.
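The identity behind this exercise can be checked numerically before proving it: if \(T\) has a noncentral \(t\)-distribution with \(k\) degrees of freedom and noncentrality \(\delta\), then \(T^{2}\) should have a noncentral \(F\)-distribution with \(1\) and \(k\) degrees of freedom and noncentrality \(\delta^{2}\). A minimal sketch (illustrative values of \(k\) and \(\delta\) assumed, using SciPy's distributions; this is a numerical check, not a proof):

```python
# Numerical check: if T ~ noncentral t(k, delta), then T^2 should match
# a noncentral F(1, k) with noncentrality delta^2.
from scipy.stats import nct, ncf

k, delta = 7, 1.5  # illustrative values
for x in [0.5, 1.0, 2.0, 5.0]:
    # P(T^2 <= x) = P(-sqrt(x) <= T <= sqrt(x))
    lhs = nct.cdf(x**0.5, k, delta) - nct.cdf(-x**0.5, k, delta)
    rhs = ncf.cdf(x, 1, k, delta**2)
    assert abs(lhs - rhs) < 1e-8
```

The CDF comparison works because squaring folds the two tails of \(T\) onto \([0, \infty)\), which is exactly the change-of-variable step the proof formalizes.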
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample of size \(n\) from a distribution which is \(N\left(0, \sigma^{2}\right)\). Prove that \(\sum_{1}^{n} X_{i}^{2}\) and every quadratic form, which is nonidentically zero in \(X_{1}, X_{2}, \ldots, X_{n}\), are dependent.
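A seeded Monte Carlo sketch makes the claim concrete for one particular quadratic form (the choice \(Q = X_{1}^{2}\) is an assumption for illustration, not the general proof): since \(\operatorname{Cov}\left(\sum X_{i}^{2}, X_{1}^{2}\right) = 2\sigma^{4} > 0\), the two forms are correlated, and correlated variables cannot be independent.

```python
# Monte Carlo illustration (one concrete quadratic form, not the proof):
# for N(0, sigma^2) data, sum(X_i^2) and Q = X_1^2 have correlation
# 1/sqrt(n) > 0, so they are dependent.
import numpy as np

rng = np.random.default_rng(0)
n, sigma, reps = 3, 1.0, 200_000
X = rng.normal(0.0, sigma, size=(reps, n))
total = (X**2).sum(axis=1)   # sum of squares
q = X[:, 0]**2               # a nonidentically-zero quadratic form
corr = np.corrcoef(total, q)[0, 1]
# theoretical correlation is 1/sqrt(n), about 0.577 for n = 3
assert 0.4 < corr < 0.7
```

Nonzero correlation settles dependence for this one form; the exercise asks for the stronger statement that *every* such quadratic form is dependent on \(\sum X_{i}^{2}\), which the simulation cannot show.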
Let \(\mathbf{X}^{\prime}=\left[X_{1}, X_{2}\right]\) be bivariate normal with matrix of means \(\boldsymbol{\mu}^{\prime}=\left[\mu_{1}, \mu_{2}\right]\) and positive definite covariance matrix \(\mathbf{\Sigma}\). Let $$ Q_{1}=\frac{X_{1}^{2}}{\sigma_{1}^{2}\left(1-\rho^{2}\right)}-2 \rho \frac{X_{1} X_{2}}{\sigma_{1} \sigma_{2}\left(1-\rho^{2}\right)}+\frac{X_{2}^{2}}{\sigma_{2}^{2}\left(1-\rho^{2}\right)} $$ Show that \(Q_{1}\) is \(\chi^{2}(r, \theta)\) and find \(r\) and \(\theta\). When and only when does \(Q_{1}\) have a central chi-square distribution?
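Note that \(Q_{1}\) is exactly \(\mathbf{X}^{\prime} \boldsymbol{\Sigma}^{-1} \mathbf{X}\) written out for the bivariate case. A seeded simulation sketch (parameter values are assumptions for illustration) checks its mean against \(E\left[\mathbf{X}^{\prime} \boldsymbol{\Sigma}^{-1} \mathbf{X}\right] = \operatorname{tr}\left(\mathbf{I}_{2}\right) + \boldsymbol{\mu}^{\prime} \boldsymbol{\Sigma}^{-1} \boldsymbol{\mu} = 2 + \boldsymbol{\mu}^{\prime} \boldsymbol{\Sigma}^{-1} \boldsymbol{\mu}\), consistent with a noncentral chi-square on \(r = 2\) degrees of freedom:

```python
# Simulation sketch (illustrative mu, sigma1, sigma2, rho assumed):
# Q1 = X' Sigma^{-1} X has expectation 2 + mu' Sigma^{-1} mu.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([1.0, 0.5])
s1, s2, rho = 1.0, 2.0, 0.5
Sigma = np.array([[s1**2, rho*s1*s2], [rho*s1*s2, s2**2]])
X = rng.multivariate_normal(mu, Sigma, size=200_000)
x1, x2 = X[:, 0], X[:, 1]
# Q1 exactly as written in the exercise
Q1 = (x1**2/s1**2 - 2*rho*x1*x2/(s1*s2) + x2**2/s2**2) / (1 - rho**2)
theta_term = mu @ np.linalg.solve(Sigma, mu)  # mu' Sigma^{-1} mu
assert abs(Q1.mean() - (2 + theta_term)) < 0.1
```

The mean check also hints at the answer to the last question: the noncentrality vanishes, giving a central chi-square, exactly when \(\boldsymbol{\mu}^{\prime} \boldsymbol{\Sigma}^{-1} \boldsymbol{\mu} = 0\), i.e. \(\boldsymbol{\mu} = \mathbf{0}\).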
Often in regression the mean of the random variable \(Y\) is a linear function of \(p\) values \(x_{1}, x_{2}, \ldots, x_{p}\), say \(\beta_{1} x_{1}+\beta_{2} x_{2}+\cdots+\beta_{p} x_{p}\), where \(\boldsymbol{\beta}^{\prime}=\left(\beta_{1}, \beta_{2}, \ldots, \beta_{p}\right)\) are the regression coefficients. Suppose that \(n\) values, \(\boldsymbol{Y}^{\prime}=\left(Y_{1}, Y_{2}, \ldots, Y_{n}\right)\), are observed for the \(x\)-values in \(\boldsymbol{X}=\left[x_{i j}\right]\), where \(\boldsymbol{X}\) is an \(n \times p\) design matrix and its \(i\)th row is associated with \(Y_{i}, i=1,2, \ldots, n\). Assume that \(\boldsymbol{Y}\) is multivariate normal with mean \(\boldsymbol{X} \boldsymbol{\beta}\) and variance-covariance matrix \(\sigma^{2} \boldsymbol{I}\), where \(\boldsymbol{I}\) is the \(n \times n\) identity matrix. (a) Note that \(Y_{1}, Y_{2}, \ldots, Y_{n}\) are independent. Why? (b) Since \(\boldsymbol{Y}\) should approximately equal its mean \(\boldsymbol{X} \boldsymbol{\beta}\), we estimate \(\boldsymbol{\beta}\) by solving the normal equations \(\boldsymbol{X}^{\prime} \boldsymbol{Y}=\boldsymbol{X}^{\prime} \boldsymbol{X} \boldsymbol{\beta}\) for \(\boldsymbol{\beta}\). Assuming that \(\boldsymbol{X}^{\prime} \boldsymbol{X}\) is nonsingular, solve the equations to get \(\hat{\boldsymbol{\beta}}=\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime} \boldsymbol{Y}\).
Show that \(\hat{\boldsymbol{\beta}}\) has a multivariate normal distribution with mean \(\boldsymbol{\beta}\) and variance-covariance matrix $$ \sigma^{2}\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} $$ (c) Show that $$ (\boldsymbol{Y}-\boldsymbol{X} \boldsymbol{\beta})^{\prime}(\boldsymbol{Y}-\boldsymbol{X} \boldsymbol{\beta})=(\hat{\boldsymbol{\beta}}-\boldsymbol{\beta})^{\prime}\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)(\hat{\boldsymbol{\beta}}-\boldsymbol{\beta})+(\boldsymbol{Y}-\boldsymbol{X} \hat{\boldsymbol{\beta}})^{\prime}(\boldsymbol{Y}-\boldsymbol{X} \hat{\boldsymbol{\beta}}) $$ say \(Q=Q_{1}+Q_{2}\) for convenience. (d) Show that \(Q_{1} / \sigma^{2}\) is \(\chi^{2}(p)\). (e) Show that \(Q_{1}\) and \(Q_{2}\) are independent. (f) Argue that \(Q_{2} / \sigma^{2}\) is \(\chi^{2}(n-p)\). (g) Find \(c\) so that \(c Q_{1} / Q_{2}\) has an \(F\)-distribution. (h) The fact that a value \(d\) can be found so that \(P\left(c Q_{1} / Q_{2} \leq d\right)=1-\alpha\) could be used to find a \(100(1-\alpha)\) percent confidence ellipsoid for \(\boldsymbol{\beta}\). Explain.
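Parts (b) and (c) can be verified numerically on made-up data (the design matrix, coefficients, and sample sizes below are assumptions for illustration): solve the normal equations for \(\hat{\boldsymbol{\beta}}\) and check that the quadratic-form decomposition \(Q = Q_{1} + Q_{2}\) holds exactly.

```python
# Numerical sketch of parts (b) and (c) with made-up data: the normal
# equations give beta_hat, and Q decomposes exactly as Q1 + Q2.
import numpy as np

rng = np.random.default_rng(2)
n, p, sigma = 20, 3, 1.0
X = rng.normal(size=(n, p))              # n x p design matrix
beta = np.array([2.0, -1.0, 0.5])        # true coefficients (assumed)
Y = X @ beta + rng.normal(0, sigma, n)   # Y ~ N(X beta, sigma^2 I)

# normal equations X'Y = X'X beta  =>  beta_hat = (X'X)^{-1} X'Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

Q  = (Y - X @ beta) @ (Y - X @ beta)
Q1 = (beta_hat - beta) @ (X.T @ X) @ (beta_hat - beta)
Q2 = (Y - X @ beta_hat) @ (Y - X @ beta_hat)
assert abs(Q - (Q1 + Q2)) < 1e-9
```

The cross term vanishes because the residual \(\boldsymbol{Y}-\boldsymbol{X} \hat{\boldsymbol{\beta}}\) is orthogonal to the column space of \(\boldsymbol{X}\), which is the algebraic heart of part (c).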
Here \(Q_{1}\) and \(Q_{2}\) are quadratic forms in observations of a random sample from \(N(0,1)\). If \(Q_{1}\) and \(Q_{2}\) are independent and if \(Q_{1}+Q_{2}\) has a chi-square distribution, prove that \(Q_{1}\) and \(Q_{2}\) are chi-square variables.
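A concrete instance of the statement (the particular forms below are an assumed illustration, and the simulation is not the MGF argument the proof requires): take \(Q_{1}=X_{1}^{2}\) and \(Q_{2}=\left(X_{2}+X_{3}\right)^{2} / 2\), which are independent quadratic forms in \(N(0,1)\) observations whose sum is \(\chi^{2}(2)\); consistent with the result, each is itself \(\chi^{2}(1)\).

```python
# Illustration of the theorem on one concrete pair of quadratic forms:
# Q1 = X1^2 and Q2 = ((X2 + X3)^2)/2 are independent, their sum is
# chi-square(2), and each is chi-square(1).
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((300_000, 3))
Q1 = X[:, 0]**2
Q2 = (X[:, 1] + X[:, 2])**2 / 2
# chi-square(1): mean 1, variance 2; chi-square(2): mean 2
assert abs(Q1.mean() - 1) < 0.05 and abs(Q1.var() - 2) < 0.1
assert abs(Q2.mean() - 1) < 0.05 and abs((Q1 + Q2).mean() - 2) < 0.05
```

The proof itself proceeds through moment-generating functions: independence factors the MGF of \(Q_{1}+Q_{2}\), and the chi-square form of the product forces each factor to be a chi-square MGF.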
Let the independent random variables \(Y_{1}, Y_{2}, \ldots, Y_{n}\) have, respectively, the probability density functions \(N\left(\beta x_{i}, \gamma^{2} x_{i}^{2}\right), i=1,2, \ldots, n\), where the given numbers \(x_{1}, x_{2}, \ldots, x_{n}\) are not all equal and no one is zero. Find the maximum likelihood estimators of \(\beta\) and \(\gamma^{2}\).
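One route to the answer: the ratios \(Z_{i}=Y_{i} / x_{i}\) are independent \(N\left(\beta, \gamma^{2}\right)\), which suggests the candidates \(\hat{\beta}=\bar{Z}\) and \(\hat{\gamma}^{2}=n^{-1} \sum\left(Z_{i}-\bar{Z}\right)^{2}\). The sketch below (data values assumed for illustration) checks by perturbation that these candidates sit at a maximum of the log-likelihood:

```python
# Sketch with assumed data: candidate MLEs beta_hat = mean(Y_i/x_i) and
# gamma2_hat = mean((Y_i/x_i - beta_hat)^2), checked against the
# log-likelihood by one-coordinate perturbations.
import numpy as np

rng = np.random.default_rng(4)
x = np.array([1.0, -2.0, 0.5, 3.0, 1.5])   # not all equal, none zero
beta, gamma = 2.0, 0.7
Y = rng.normal(beta * x, gamma * np.abs(x))  # Y_i ~ N(beta x_i, gamma^2 x_i^2)

def loglik(b, g2):
    return -0.5 * np.sum(np.log(2*np.pi*g2*x**2) + (Y - b*x)**2 / (g2*x**2))

Z = Y / x
beta_hat = Z.mean()
gamma2_hat = ((Z - beta_hat)**2).mean()

best = loglik(beta_hat, gamma2_hat)
for db, dg in [(0.05, 0.0), (-0.05, 0.0),
               (0.0, 0.1*gamma2_hat), (0.0, -0.1*gamma2_hat)]:
    assert loglik(beta_hat + db, gamma2_hat + dg) < best
```

The perturbation check is only a sanity test; the exercise asks for the full derivation, which sets the partial derivatives of the log-likelihood in \(\beta\) and \(\gamma^{2}\) to zero.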