Chapter 9: Problem 13
Fit by the method of least squares the plane \(z=a+b x+c y\) to the five points \((x, y, z):(-1,-2,5),(0,-2,4),(0,0,4),(1,0,2),(2,1,0)\).
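A sketch of the standard route: minimizing \(\sum_{i=1}^{5}\left(a+b x_{i}+c y_{i}-z_{i}\right)^{2}\) with respect to \(a\), \(b\), and \(c\) gives the normal equations. With \(\sum x_{i}=2\), \(\sum y_{i}=-3\), \(\sum z_{i}=15\), \(\sum x_{i}^{2}=6\), \(\sum y_{i}^{2}=9\), \(\sum x_{i} y_{i}=4\), \(\sum x_{i} z_{i}=-3\), and \(\sum y_{i} z_{i}=-18\), they read
\[
\begin{aligned} 5 a+2 b-3 c &=15, \\ 2 a+6 b+4 c &=-3, \\ -3 a+4 b+9 c &=-18, \end{aligned}
\]
whose solution is \(a=48/13\), \(b=-45/26\), \(c=0\). The NumPy sketch below is our own numerical check (the array and variable names are ours, not the text's); it computes the same least-squares minimizer that solving the normal equations by hand produces.

```python
import numpy as np

# The five data points (x, y, z) from the problem statement.
pts = np.array([(-1, -2, 5), (0, -2, 4), (0, 0, 4), (1, 0, 2), (2, 1, 0)],
               dtype=float)
x, y, z = pts.T

# Design matrix with columns [1, x, y], so M @ (a, b, c) models z.
M = np.column_stack([np.ones_like(x), x, y])

# Least-squares fit: minimizes ||M @ coef - z||^2, the same minimizer
# the normal equations M'M coef = M'z yield.
coef, *_ = np.linalg.lstsq(M, z, rcond=None)
print(coef)  # approximately [ 3.6923, -1.7308, 0.0 ], i.e., (48/13, -45/26, 0)
```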
Related exercises:
Suppose \(\mathbf{A}\) is a real symmetric matrix. If the eigenvalues of \(\mathbf{A}\) are only \(0\)'s and \(1\)'s, prove that \(\mathbf{A}\) is idempotent.
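One way to see it (a sketch, using the spectral theorem for real symmetric matrices): write \(\mathbf{A}=\boldsymbol{\Gamma} \boldsymbol{\Lambda} \boldsymbol{\Gamma}^{\prime}\) with \(\boldsymbol{\Gamma}\) orthogonal and \(\boldsymbol{\Lambda}\) diagonal. Each diagonal entry of \(\boldsymbol{\Lambda}\) is \(0\) or \(1\), so \(\boldsymbol{\Lambda}^{2}=\boldsymbol{\Lambda}\) and
\[
\mathbf{A}^{2}=\boldsymbol{\Gamma} \boldsymbol{\Lambda} \boldsymbol{\Gamma}^{\prime} \boldsymbol{\Gamma} \boldsymbol{\Lambda} \boldsymbol{\Gamma}^{\prime}=\boldsymbol{\Gamma} \boldsymbol{\Lambda}^{2} \boldsymbol{\Gamma}^{\prime}=\boldsymbol{\Gamma} \boldsymbol{\Lambda} \boldsymbol{\Gamma}^{\prime}=\mathbf{A}.
\]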
Let \(\boldsymbol{X}^{\prime}=\left[X_{1}, X_{2}, \ldots, X_{n}\right]\), where \(X_{1}, X_{2}, \ldots, X_{n}\) are observations of a random sample from a distribution which is \(N\left(0, \sigma^{2}\right)\). Let \(\boldsymbol{b}^{\prime}=\left[b_{1}, b_{2}, \ldots, b_{n}\right]\) be a real nonzero vector, and let \(\boldsymbol{A}\) be a real symmetric matrix of order \(n\). Prove that the linear form \(\boldsymbol{b}^{\prime} \boldsymbol{X}\) and the quadratic form \(\boldsymbol{X}^{\prime} \boldsymbol{A} \boldsymbol{X}\) are independent if and only if \(\boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0}\). Use this fact to prove that \(\boldsymbol{b}^{\prime} \boldsymbol{X}\) and \(\boldsymbol{X}^{\prime} \boldsymbol{A} \boldsymbol{X}\) are independent if and only if the two quadratic forms \(\left(\boldsymbol{b}^{\prime} \boldsymbol{X}\right)^{2}=\boldsymbol{X}^{\prime} \boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{X}\) and \(\boldsymbol{X}^{\prime} \boldsymbol{A} \boldsymbol{X}\) are independent.
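For the second part, a sketch of the key equivalence: by Craig's theorem (for \(N\left(0, \sigma^{2}\right)\) samples, the quadratic forms \(\boldsymbol{X}^{\prime} \boldsymbol{B} \boldsymbol{X}\) and \(\boldsymbol{X}^{\prime} \boldsymbol{A} \boldsymbol{X}\) are independent if and only if \(\boldsymbol{B} \boldsymbol{A}=\mathbf{0}\)), independence of \(\boldsymbol{X}^{\prime} \boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{X}\) and \(\boldsymbol{X}^{\prime} \boldsymbol{A} \boldsymbol{X}\) is equivalent to \(\boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0}\). Since \(\boldsymbol{b} \neq \mathbf{0}\),
\[
\boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0} \Longrightarrow \boldsymbol{b}^{\prime}\left(\boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{A}\right)=\left(\boldsymbol{b}^{\prime} \boldsymbol{b}\right) \boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0} \Longrightarrow \boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0},
\]
because \(\boldsymbol{b}^{\prime} \boldsymbol{b}>0\); the converse implication \(\boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0} \Rightarrow \boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0}\) is immediate.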
Let the independent normal random variables \(Y_{1}, Y_{2}, \ldots, Y_{n}\) have, respectively, the probability density functions \(N\left(\mu, \gamma^{2} x_{i}^{2}\right), i=1,2, \ldots, n\), where the given numbers \(x_{1}, x_{2}, \ldots, x_{n}\) are not all equal and none of them is zero. Discuss the test of the hypothesis \(H_{0}: \gamma=1, \mu\) unspecified, against all alternatives \(H_{1}: \gamma \neq 1, \mu\) unspecified.
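One route (our sketch of a likelihood ratio test, not a full solution): in both the full model and under \(H_{0}\), the mle of \(\mu\) is the weighted mean \(\hat{\mu}=\sum\left(y_{i} / x_{i}^{2}\right) / \sum\left(1 / x_{i}^{2}\right)\), and writing \(Q=\sum_{i=1}^{n}\left(y_{i}-\hat{\mu}\right)^{2} / x_{i}^{2}\), the full-model mle of \(\gamma^{2}\) is \(Q / n\). The likelihood ratio then simplifies to
\[
\Lambda=\left(\frac{Q}{n}\right)^{n / 2} e^{(n-Q) / 2},
\]
which is maximal at \(Q=n\) and small when \(Q\) is far from \(n\) on either side. Under \(H_{0}\), \(Q\) has a \(\chi^{2}(n-1)\) distribution, so one rejects when \(Q \leq c_{1}\) or \(Q \geq c_{2}\), with the constants chosen to give the desired size.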
Let \(\boldsymbol{A}_{1}, \boldsymbol{A}_{2}, \ldots, \boldsymbol{A}_{k}\) be the matrices of \(k>2\) quadratic forms \(Q_{1}, Q_{2}, \ldots, Q_{k}\) in the observations of a random sample of size \(n\) from a distribution which is \(N\left(0, \sigma^{2}\right)\). Prove that the pairwise independence of these forms implies that they are mutually independent. Hint: Show that \(\boldsymbol{A}_{i} \boldsymbol{A}_{j}=\mathbf{0}, i \neq j\), permits \(E\left[\exp \left(t_{1} Q_{1}+t_{2} Q_{2}+\cdots+t_{k} Q_{k}\right)\right]\) to be written as a product of the mgfs of \(Q_{1}, Q_{2}, \ldots, Q_{k}\).
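Following the hint, the shape of the argument (sketched): for a \(N\left(0, \sigma^{2}\right)\) sample \(\boldsymbol{X}\), the joint mgf of the forms is \(\left|\boldsymbol{I}-2 \sigma^{2}\left(t_{1} \boldsymbol{A}_{1}+\cdots+t_{k} \boldsymbol{A}_{k}\right)\right|^{-1 / 2}\). Pairwise independence gives \(\boldsymbol{A}_{i} \boldsymbol{A}_{j}=\mathbf{0}\) for all \(i \neq j\), so every cross term in the expansion of the product below vanishes:
\[
\prod_{i=1}^{k}\left(\boldsymbol{I}-2 \sigma^{2} t_{i} \boldsymbol{A}_{i}\right)=\boldsymbol{I}-2 \sigma^{2} \sum_{i=1}^{k} t_{i} \boldsymbol{A}_{i}.
\]
Taking determinants then factors the joint mgf into \(\prod_{i=1}^{k}\left|\boldsymbol{I}-2 \sigma^{2} t_{i} \boldsymbol{A}_{i}\right|^{-1 / 2}\), the product of the individual mgfs, which is mutual independence.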
Suppose \(\boldsymbol{Y}\) is an \(n \times 1\) random vector, \(\boldsymbol{X}\) is an \(n \times p\) matrix of known constants of rank \(p\), and \(\boldsymbol{\beta}\) is a \(p \times 1\) vector of regression coefficients. Let \(\boldsymbol{Y}\) have a \(N\left(\boldsymbol{X} \boldsymbol{\beta}, \sigma^{2} \boldsymbol{I}\right)\) distribution. Discuss the joint pdf of \(\hat{\boldsymbol{\beta}}=\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime} \boldsymbol{Y}\) and \(\boldsymbol{Y}^{\prime}\left[\boldsymbol{I}-\boldsymbol{X}\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime}\right] \boldsymbol{Y} / \sigma^{2}\).
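A sketch of where the discussion leads: \(\hat{\boldsymbol{\beta}}\) is a linear function of \(\boldsymbol{Y}\) and hence is \(N\left(\boldsymbol{\beta}, \sigma^{2}\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1}\right)\), while the quadratic form is \(\chi^{2}(n-p)\), since \(\boldsymbol{I}-\boldsymbol{X}\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime}\) is symmetric, idempotent, of rank \(n-p\), and annihilates \(\boldsymbol{X} \boldsymbol{\beta}\). The two are independent because
\[
\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime}\left[\boldsymbol{I}-\boldsymbol{X}\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime}\right]=\mathbf{0},
\]
so the joint pdf is the product of a \(N\left(\boldsymbol{\beta}, \sigma^{2}\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1}\right)\) pdf and a \(\chi^{2}(n-p)\) pdf.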