Chapter 9: Problem 12
Fit \(y=a+x\) to the data $$ \begin{array}{l|lll} \mathrm{x} & 0 & 1 & 2 \\ \hline \mathrm{y} & 1 & 3 & 4 \end{array} $$ by the method of least squares.
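A minimal worked sketch, using only the data above: since the slope is fixed at 1, least squares minimizes \(S(a)=\sum_{i=1}^{3}\left(y_{i}-a-x_{i}\right)^{2}\) over the single parameter \(a\). Setting \(S^{\prime}(a)=-2 \sum_{i=1}^{3}\left(y_{i}-a-x_{i}\right)=0\) gives $$ a=\frac{1}{3} \sum_{i=1}^{3}\left(y_{i}-x_{i}\right)=\frac{(1-0)+(3-1)+(4-2)}{3}=\frac{5}{3}, $$ so the fitted line is \(y=\frac{5}{3}+x\).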
Let \(\boldsymbol{X}^{\prime}=\left[X_{1}, X_{2}, \ldots, X_{n}\right]\), where \(X_{1}, X_{2}, \ldots, X_{n}\) are observations of a random sample from a distribution which is \(N\left(0, \sigma^{2}\right)\). Let \(\boldsymbol{b}^{\prime}=\left[b_{1}, b_{2}, \ldots, b_{n}\right]\) be a real nonzero vector, and let \(\boldsymbol{A}\) be a real symmetric matrix of order \(n\). Prove that the linear form \(\boldsymbol{b}^{\prime} \boldsymbol{X}\) and the quadratic form \(\boldsymbol{X}^{\prime} \boldsymbol{A} \boldsymbol{X}\) are independent if and only if \(\boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0}\). Use this fact to prove that \(\boldsymbol{b}^{\prime} \boldsymbol{X}\) and \(\boldsymbol{X}^{\prime} \boldsymbol{A} \boldsymbol{X}\) are independent if and only if the two quadratic forms, \(\left(\boldsymbol{b}^{\prime} \boldsymbol{X}\right)^{2}=\boldsymbol{X}^{\prime} \boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{X}\) and \(\boldsymbol{X}^{\prime} \boldsymbol{A} \boldsymbol{X}\), are independent.
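One possible route for the second claim, sketched under the assumption that Craig's theorem for two quadratic forms applies (independence of \(\boldsymbol{X}^{\prime} \boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{X}\) and \(\boldsymbol{X}^{\prime} \boldsymbol{A} \boldsymbol{X}\) holds if and only if \(\boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0}\)): since \(\boldsymbol{b} \neq \mathbf{0}\), $$ \boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0} \Longrightarrow \boldsymbol{b}^{\prime}\left(\boldsymbol{b} \boldsymbol{b}^{\prime} \boldsymbol{A}\right)=\left(\boldsymbol{b}^{\prime} \boldsymbol{b}\right) \boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0} \Longrightarrow \boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0}, $$ and the converse implication is immediate, so the criterion for the two quadratic forms reduces to exactly the condition \(\boldsymbol{b}^{\prime} \boldsymbol{A}=\mathbf{0}\) established in the first part.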
Suppose \(\mathbf{A}\) is a real symmetric matrix. If the eigenvalues of \(\mathbf{A}\) are only 0's and 1's, prove that \(\mathbf{A}\) is idempotent.
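A minimal proof sketch via the spectral decomposition: write \(\mathbf{A}=\boldsymbol{\Gamma} \boldsymbol{\Lambda} \boldsymbol{\Gamma}^{\prime}\) with \(\boldsymbol{\Gamma}\) orthogonal and \(\boldsymbol{\Lambda}\) the diagonal matrix of eigenvalues. If each \(\lambda_{i} \in\{0,1\}\), then \(\lambda_{i}^{2}=\lambda_{i}\), so \(\boldsymbol{\Lambda}^{2}=\boldsymbol{\Lambda}\) and $$ \mathbf{A}^{2}=\boldsymbol{\Gamma} \boldsymbol{\Lambda} \boldsymbol{\Gamma}^{\prime} \boldsymbol{\Gamma} \boldsymbol{\Lambda} \boldsymbol{\Gamma}^{\prime}=\boldsymbol{\Gamma} \boldsymbol{\Lambda}^{2} \boldsymbol{\Gamma}^{\prime}=\boldsymbol{\Gamma} \boldsymbol{\Lambda} \boldsymbol{\Gamma}^{\prime}=\mathbf{A}. $$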
Let \(X_{1}\) and \(X_{2}\) be two independent random variables. Let \(X_{1}\) and \(Y=X_{1}+X_{2}\) be \(\chi^{2}\left(r_{1}, \theta_{1}\right)\) and \(\chi^{2}(r, \theta)\), respectively. Here \(r_{1}<r\) and \(\theta_{1} \leq \theta\). Show that \(X_{2}\) is \(\chi^{2}\left(r-r_{1}, \theta-\theta_{1}\right)\).
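A sketch via moment generating functions, assuming the parameterization in which \(\chi^{2}(r, \theta)\) has mgf \((1-2 t)^{-r / 2} \exp \{\theta t /(1-2 t)\}\) for \(t<\frac{1}{2}\): independence gives \(M_{Y}(t)=M_{X_{1}}(t) M_{X_{2}}(t)\), so $$ M_{X_{2}}(t)=\frac{(1-2 t)^{-r / 2} e^{\theta t /(1-2 t)}}{(1-2 t)^{-r_{1} / 2} e^{\theta_{1} t /(1-2 t)}}=(1-2 t)^{-\left(r-r_{1}\right) / 2} e^{\left(\theta-\theta_{1}\right) t /(1-2 t)}, $$ which is the mgf of a \(\chi^{2}\left(r-r_{1}, \theta-\theta_{1}\right)\) distribution.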
Let the \(4 \times 1\) matrix \(\boldsymbol{Y}\) be multivariate normal \(N\left(\boldsymbol{X} \boldsymbol{\beta}, \sigma^{2} \boldsymbol{I}\right)\), where the \(4 \times 3\) matrix \(\boldsymbol{X}\) equals $$ \boldsymbol{X}=\left[\begin{array}{rrr} 1 & 1 & 2 \\ 1 & -1 & 2 \\ 1 & 0 & -3 \\ 1 & 0 & -1 \end{array}\right] $$ and \(\boldsymbol{\beta}\) is the \(3 \times 1\) regression coefficient matrix. (a) Find the mean matrix and the covariance matrix of \(\hat{\boldsymbol{\beta}}=\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime} \boldsymbol{Y}\). (b) If we observe \(\boldsymbol{Y}^{\prime}\) to be equal to \((6,1,11,3)\), compute \(\hat{\boldsymbol{\beta}}\).
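A worked arithmetic sketch: the columns of \(\boldsymbol{X}\) are mutually orthogonal, so \(\boldsymbol{X}^{\prime} \boldsymbol{X}=\operatorname{diag}(4,2,18)\), and the standard least-squares facts give $$ E(\hat{\boldsymbol{\beta}})=\boldsymbol{\beta}, \quad \operatorname{Cov}(\hat{\boldsymbol{\beta}})=\sigma^{2}\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1}=\sigma^{2} \operatorname{diag}\left(\frac{1}{4}, \frac{1}{2}, \frac{1}{18}\right). $$ With \(\boldsymbol{Y}^{\prime}=(6,1,11,3)\), $$ \boldsymbol{X}^{\prime} \boldsymbol{Y}=\left[\begin{array}{r} 21 \\ 5 \\ -22 \end{array}\right], \quad \hat{\boldsymbol{\beta}}=\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime} \boldsymbol{Y}=\left[\begin{array}{r} 21 / 4 \\ 5 / 2 \\ -11 / 9 \end{array}\right]. $$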
Let \(Q=X_{1} X_{2}-X_{3} X_{4}\), where \(X_{1}, X_{2}, X_{3}, X_{4}\) is a random sample of size 4 from a distribution which is \(N\left(0, \sigma^{2}\right) .\) Show that \(Q / \sigma^{2}\) does not have a chi-square distribution. Find the mgf of \(Q / \sigma^{2}\).
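A sketch of the mgf computation: with \(Z_{i}=X_{i} / \sigma\) i.i.d. \(N(0,1)\), the identity \(Z_{1} Z_{2}=\frac{1}{4}\left[\left(Z_{1}+Z_{2}\right)^{2}-\left(Z_{1}-Z_{2}\right)^{2}\right]\) writes \(Z_{1} Z_{2}\) as \(\frac{1}{2}\left(U^{2}-V^{2}\right)\) with \(U, V\) independent \(N(0,1)\), so \(E\left[e^{t Z_{1} Z_{2}}\right]=(1-t)^{-1 / 2}(1+t)^{-1 / 2}=\left(1-t^{2}\right)^{-1 / 2}\) for \(|t|<1\). Since \(-Z_{3} Z_{4}\) has the same distribution as \(Z_{3} Z_{4}\) and is independent of \(Z_{1} Z_{2}\), $$ M_{Q / \sigma^{2}}(t)=\left(1-t^{2}\right)^{-1}, \quad|t|<1, $$ which is not of the chi-square form \((1-2 t)^{-r / 2}\); indeed this mgf is even in \(t\), so \(Q / \sigma^{2}\) is symmetric about zero and takes negative values.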