Chapter 8: Problem 9
Consider the linear least squares problem of minimizing \(\|\mathbf{b}-A \mathbf{x}\|_{2}\), where \(A\) is an \(m \times n\) \((m>n)\) matrix of rank \(n\).

(a) Use the SVD to show that \(A^{T} A\) is nonsingular.

(b) Given an \(m \times n\) matrix \(A\) that has full column rank, show that \(A\left(A^{T} A\right)^{-1} A^{T}\) is a projector that is also symmetric. Such operators are known as orthogonal projectors.

(c) Show that the solution of the linear least squares problem satisfies $$ \mathbf{r}=\mathbf{b}-A \mathbf{x}=P \mathbf{b}, $$ where \(P\) is an orthogonal projector. Express the projector \(P\) in terms of \(A\).

(d) Let \(Q\) and \(R\) be the matrices associated with the QR decomposition of \(A\). Express the matrix \(P\) in terms of \(Q\) and \(R\), simplifying your result as much as possible.

(e) With \(\mathbf{r}\) defined as usual as the residual, consider replacing \(\mathbf{b}\) by \(\hat{\mathbf{b}}=\mathbf{b}+\alpha \mathbf{r}\) for some scalar \(\alpha\). Show that the least squares solution of \(\min _{\mathbf{x}}\|A \mathbf{x}-\hat{\mathbf{b}}\|_{2}\) is the same regardless of the value of \(\alpha\).
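The claims in parts (b)–(e) can be checked numerically. The sketch below (an illustration, not part of the original problem; the matrix sizes and random data are assumptions) builds a random full-column-rank \(A\), forms \(H = A(A^{T}A)^{-1}A^{T}\), and verifies that \(H\) is a symmetric idempotent, that the residual equals \((I-H)\mathbf{b}\), that the thin QR factor gives \(H = QQ^{T}\), and that replacing \(\mathbf{b}\) by \(\mathbf{b}+\alpha\mathbf{r}\) leaves the solution unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 3                                # assumed sizes, m > n
A = rng.standard_normal((m, n))            # full column rank with probability 1
b = rng.standard_normal(m)

# (b): H = A (A^T A)^{-1} A^T is a symmetric projector
H = A @ np.linalg.inv(A.T @ A) @ A.T
assert np.allclose(H @ H, H)               # idempotent: H^2 = H
assert np.allclose(H, H.T)                 # symmetric

# (c): the residual satisfies r = b - A x = P b with P = I - H
x = np.linalg.lstsq(A, b, rcond=None)[0]
r = b - A @ x
P = np.eye(m) - H
assert np.allclose(r, P @ b)

# (d): with the thin QR decomposition A = QR, H = Q Q^T, so P = I - Q Q^T
Q, R = np.linalg.qr(A)                     # Q is m-by-n with orthonormal columns
assert np.allclose(H, Q @ Q.T)

# (e): solving against b_hat = b + alpha r gives the same x for any alpha
alpha = 2.7                                # arbitrary scalar
x_hat = np.linalg.lstsq(A, b + alpha * r, rcond=None)[0]
assert np.allclose(x, x_hat)
```

The checks pass because \(\mathbf{r}\) lies in the orthogonal complement of \(\operatorname{range}(A)\), so adding any multiple of it to \(\mathbf{b}\) does not change the projection of \(\mathbf{b}\) onto \(\operatorname{range}(A)\).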