Chapter 8: Problem 9
(a) Consider a normal linear model \(y = X\beta + \varepsilon\) where \(\operatorname{var}(\varepsilon) = \sigma^{2} W^{-1}\) and \(W\) is a known positive definite symmetric matrix. Show that a symmetric square root matrix \(W^{1/2}\) exists, and re-express the least squares problem in terms of \(y_{1} = W^{1/2} y\), \(X_{1} = W^{1/2} X\), and \(\varepsilon_{1} = W^{1/2} \varepsilon\). Show that \(\operatorname{var}(\varepsilon_{1}) = \sigma^{2} I_{n}\). Hence find the least squares estimates, hat matrix, and residual sum of squares for the weighted regression in terms of \(y\), \(X\), and \(W\), and give the distributions of the least squares estimate of \(\beta\) and of the residual sum of squares.

(b) Suppose that \(W\) depends on an unknown scalar parameter \(\rho\). Find the profile log likelihood for \(\rho\), \(\ell_{\mathrm{p}}(\rho) = \max_{\beta, \sigma^{2}} \ell(\beta, \sigma^{2}, \rho)\), and outline how a least squares package can be used to give a confidence interval for \(\rho\).
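For orientation, here is a brief sketch of the standard weighted least squares results the problem is after, stated in the notation above and without detailed proof. Since \(W\) is symmetric and positive definite it has a spectral decomposition \(W = Q \Lambda Q^{\mathrm{T}}\) with \(Q\) orthogonal and \(\Lambda\) diagonal with positive entries, so \(W^{1/2} = Q \Lambda^{1/2} Q^{\mathrm{T}}\) is a symmetric square root satisfying \(W^{1/2} W^{1/2} = W\). Premultiplying the model by \(W^{1/2}\) gives \(y_{1} = X_{1}\beta + \varepsilon_{1}\) with
\[
\operatorname{var}(\varepsilon_{1}) = W^{1/2}\operatorname{var}(\varepsilon)\,W^{1/2} = \sigma^{2} W^{1/2} W^{-1} W^{1/2} = \sigma^{2} I_{n},
\]
so ordinary least squares applied to the transformed data yields
\[
\hat{\beta} = (X^{\mathrm{T}} W X)^{-1} X^{\mathrm{T}} W y, \qquad
H_{1} = W^{1/2} X (X^{\mathrm{T}} W X)^{-1} X^{\mathrm{T}} W^{1/2}, \qquad
\mathrm{RSS} = (y - X\hat{\beta})^{\mathrm{T}} W (y - X\hat{\beta}),
\]
with \(\hat{\beta} \sim N_{p}\{\beta, \sigma^{2}(X^{\mathrm{T}} W X)^{-1}\}\) and \(\mathrm{RSS}/\sigma^{2} \sim \chi^{2}_{n-p}\), independently. For part (b), maximizing the log likelihood over \(\beta\) and \(\sigma^{2}\) at fixed \(\rho\) gives \(\hat{\sigma}^{2}(\rho) = \mathrm{RSS}(\rho)/n\) and
\[
\ell_{\mathrm{p}}(\rho) = -\tfrac{n}{2}\log \hat{\sigma}^{2}(\rho) + \tfrac{1}{2}\log |W(\rho)| - \tfrac{n}{2}\{1 + \log 2\pi\},
\]
so an approximate \(95\%\) confidence interval is \(\{\rho : 2[\ell_{\mathrm{p}}(\hat{\rho}) - \ell_{\mathrm{p}}(\rho)] \le \chi^{2}_{1,\,0.95}\}\), where \(\hat{\rho}\) maximizes \(\ell_{\mathrm{p}}\).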
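Below is a minimal computational sketch of the recipe for part (b). The AR(1)-type weight matrix W_ar1, the simulated data, and the helper name profile_loglik are illustrative assumptions, not part of the problem; only standard numpy/scipy calls are used. The idea is that, at each trial value of \(\rho\), an ordinary least squares fit to the transformed data \((W(\rho)^{1/2} X,\, W(\rho)^{1/2} y)\) gives \(\mathrm{RSS}(\rho)\) and hence \(\ell_{\mathrm{p}}(\rho)\); the confidence interval is read off by inverting the likelihood ratio statistic.

```python
import numpy as np
from scipy.stats import chi2

def profile_loglik(rho, y, X, W_of_rho):
    """Profile log likelihood l_p(rho) for y = X beta + eps, var(eps) = sigma^2 W(rho)^{-1}."""
    W = W_of_rho(rho)
    n = len(y)
    # Transform to an ordinary least squares problem: with W = L L^T (Cholesky),
    # L^T serves as a square-root factor, so any least squares routine applied
    # to (L^T X, L^T y) does the weighted fit.
    L = np.linalg.cholesky(W)
    X1, y1 = L.T @ X, L.T @ y
    beta_hat = np.linalg.lstsq(X1, y1, rcond=None)[0]
    rss = float(y1 @ y1 - y1 @ (X1 @ beta_hat))   # weighted residual sum of squares
    sigma2_hat = rss / n                          # profile MLE of sigma^2 at this rho
    _, logdetW = np.linalg.slogdet(W)
    return -0.5 * n * np.log(sigma2_hat) + 0.5 * logdetW - 0.5 * n * (1 + np.log(2 * np.pi))

def W_ar1(rho, n=50):
    """Hypothetical W(rho): inverse of an AR(1) correlation matrix, |rho| < 1."""
    lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return np.linalg.inv(rho ** lags)

# Simulated data, purely for illustration.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=50)

# Profile rho on a grid and invert 2{l_p(rho_hat) - l_p(rho)} <= chi^2_{1, 0.95}.
grid = np.linspace(-0.9, 0.9, 181)
lp = np.array([profile_loglik(r, y, X, W_ar1) for r in grid])
keep = lp >= lp.max() - 0.5 * chi2.ppf(0.95, df=1)
print(f"rho_hat = {grid[lp.argmax()]:.2f}, approx. 95% CI = [{grid[keep].min():.2f}, {grid[keep].max():.2f}]")
```

A finer grid or a one-dimensional optimizer can replace the grid search; the key point is that each evaluation of \(\ell_{\mathrm{p}}\) only needs one ordinary least squares fit plus \(\log|W(\rho)|\).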