
In Exercises 17 and 18, \(A\) is an \(m \times n\) matrix and \(b\) is in \({\mathbb{R}^m}\). Mark each statement True or False. Justify each answer.

18.

  a. If \(b\) is in the column space of \(A\), then every solution of \(Ax = b\) is a least-squares solution.
  b. The least-squares solution of \(Ax = b\) is the point in the column space of \(A\) closest to \(b\).
  c. A least-squares solution of \(Ax = b\) is a list of weights that, when applied to the columns of \(A\), produces the orthogonal projection of \(b\) onto \({\rm{Col}}\,A\).
  d. If \(\hat x\) is a least-squares solution of \(Ax = b\), then \(\hat x = {\left( {{A^T}A} \right)^{ - 1}}{A^T}b\).
  e. The normal equations always provide a reliable method for computing least-squares solutions.
  f. If \(A\) has a \(QR\) factorization, say \(A = QR\), then the best way to find the least-squares solution of \(Ax = b\) is to compute \(\hat x = {R^{ - 1}}{Q^T}b\).

Short Answer

  a. The given statement is True.
  b. The given statement is False.
  c. The given statement is True.
  d. The given statement is False.
  e. The given statement is False.
  f. The given statement is False.

Step by step solution

01

Statement (a)

When \(b\) is in the column space of \(A\), the equation \(Ax = b\) is consistent, and any solution \(x\) gives the residual \(b - Ax = 0\), which has the smallest possible norm. Hence every solution of \(Ax = b\) is a least-squares solution, and the statement is true.
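As a quick numerical check (not part of the textbook solution), here is a minimal NumPy sketch with a hypothetical matrix \(A\); because \(b\) is constructed to lie in \({\rm{Col}}\,A\), the exact solution has zero residual and is therefore also a least-squares solution.

```python
import numpy as np

# Hypothetical example: b is constructed to lie in Col A, so Ax = b is consistent.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_exact = np.array([2.0, -3.0])
b = A @ x_exact                                  # b is in Col A by construction

# The residual of the exact solution is zero, the smallest possible norm,
# so x_exact is also a least-squares solution.
print(np.linalg.norm(A @ x_exact - b))           # 0.0
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)     # generic least-squares solver
print(np.allclose(x_ls, x_exact))                # True
```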

02

Statement (b)

A least-squares solution \(\hat x\) of \(Ax = b\) is a vector in \({\mathbb{R}^n}\); it is \(A\hat x\), not \(\hat x\), that is the point in the column space of \(A\) closest to \(b\).

Thus, the statement is false.

03

Statement (c)

A least-squares solution \(\hat x\) of \(Ax = b\) satisfies \(A\hat x = \hat b\), where \(\hat b\) is the orthogonal projection of \(b\) onto \({\rm{Col}}\,A\). Since \(\hat b\) is in \({\rm{Col}}\,A\), the system \(Ax = \hat b\) is consistent, and the entries of \(\hat x\) are weights that, applied to the columns of \(A\), produce \(\hat b\).

By the Orthogonal Decomposition Theorem, the projection \(\hat b\) has the property that \(b - \hat b\) is orthogonal to \({\rm{Col}}\,A\).

Hence, the statement is true.
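A short NumPy sketch (with hypothetical data, where \(b\) is deliberately not in \({\rm{Col}}\,A\)) illustrating that the weights in a least-squares solution reproduce the projection: the residual \(b - A\hat x\) is orthogonal to every column of \(A\).

```python
import numpy as np

# Hypothetical data; b is deliberately NOT in Col A.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 7.0])

x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)    # a least-squares solution
b_hat = A @ x_hat                                # weights applied to the columns of A

# b_hat is the orthogonal projection of b onto Col A:
# the residual b - b_hat is orthogonal to every column of A.
print(np.allclose(A.T @ (b - b_hat), 0.0))       # True
```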

04

Statement (d)

The matrix \({A^T}A\) is invertible if and only if the columns of \(A\) are linearly independent.

Only in that case does the equation \(Ax = b\) have the unique least-squares solution \(\hat x = {\left( {{A^T}A} \right)^{ - 1}}{A^T}b\); in general \({A^T}A\) need not be invertible, so the formula does not always apply.

Hence, the statement is false.
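A brief NumPy illustration (hypothetical matrices): when the columns of \(A\) are linearly independent the formula \({\left( {{A^T}A} \right)^{ - 1}}{A^T}b\) applies, but with linearly dependent columns \({A^T}A\) is singular and the formula fails.

```python
import numpy as np

# Columns of A are linearly independent, so A^T A is invertible and the
# unique least-squares solution is (A^T A)^{-1} A^T b.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 7.0])
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)                                     # approximately [2.33, 3.33]

# With linearly dependent columns the formula breaks down: A^T A is singular.
A_dep = np.array([[1.0, 2.0],
                  [1.0, 2.0],
                  [1.0, 2.0]])
print(np.linalg.matrix_rank(A_dep.T @ A_dep))    # 1, so A^T A is not invertible
```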

05

Statement (e)

In the normal-equations method for a least-squares solution, small errors in the calculation of \({A^T}A\) can lead to relatively large errors in the solution \(\hat x\).

If the columns of \(A\) are linearly independent, the least-squares solution can often be computed more reliably through a \(QR\) factorization of \(A\).

Hence, the statement is false.
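A NumPy sketch (using a hypothetical Läuchli-type matrix) of why the normal equations can be unreliable: forming \({A^T}A\) squares the condition number, while a \(QR\)-based solve works with \(A\) directly.

```python
import numpy as np

# Hypothetical Läuchli-type matrix: its two columns are nearly dependent.
eps = 1e-7
A = np.array([[1.0, 1.0],
              [eps, 0.0],
              [0.0, eps]])
b = np.array([2.0, eps, eps])                    # exact least-squares solution is [1, 1]

# Forming A^T A squares the condition number, which is why small errors in
# A^T A can produce relatively large errors in the computed solution.
print(np.linalg.cond(A))                         # about 1.4e7
print(np.linalg.cond(A.T @ A))                   # about 2e14 (the square of the above)

# A QR-based solve avoids forming A^T A and is typically more reliable.
Q, R = np.linalg.qr(A)                           # reduced QR: A = QR, R is 2x2
print(np.linalg.solve(R, Q.T @ b))               # close to [1, 1]
```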

06

Statement (f)

When an \(m \times n\) matrix \(A\) has linearly independent columns and \(A = QR\) is a \(QR\) factorization, the equation \(Ax = b\) has a unique least-squares solution \(\hat x\) for each \(b\) in \({\mathbb{R}^m}\).

Since \(R\) is an upper triangular matrix, \(\hat x\) should be computed by back-substitution from \(R\hat x = {Q^T}b\); computing \({R^{ - 1}}\) explicitly is slower and less numerically stable.

Hence, the statement is false.
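A NumPy sketch (hypothetical data) of the recommended computation: after the \(QR\) factorization, solve \(R\hat x = {Q^T}b\) by back-substitution instead of forming \({R^{ - 1}}\).

```python
import numpy as np

def back_substitution(R, y):
    """Solve Rx = y for an upper-triangular, invertible R."""
    n = R.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

# Hypothetical matrix with linearly independent columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 7.0])

Q, R = np.linalg.qr(A)                 # reduced QR factorization: A = QR
x_hat = back_substitution(R, Q.T @ b)  # back-substitution, no explicit R^{-1}
print(x_hat)                           # matches np.linalg.lstsq(A, b, rcond=None)[0]
```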


Most popular questions from this chapter

In Exercises 17 and 18, all vectors and subspaces are in \({\mathbb{R}^n}\). Mark each statement True or False. Justify each answer.

17. a. If \(\left\{ {{{\bf{v}}_1},{{\bf{v}}_2},{{\bf{v}}_3}} \right\}\) is an orthogonal basis for \(W\), then multiplying \({{\bf{v}}_3}\) by a scalar \(c\) gives a new orthogonal basis \(\left\{ {{{\bf{v}}_1},{{\bf{v}}_2},c{{\bf{v}}_3}} \right\}\).

b. The Gram–Schmidt process produces from a linearly independent set \(\left\{ {{{\bf{x}}_1}, \ldots ,{{\bf{x}}_p}} \right\}\) an orthogonal set \(\left\{ {{{\bf{v}}_1}, \ldots ,{{\bf{v}}_p}} \right\}\) with the property that for each \(k\), the vectors \({{\bf{v}}_1}, \ldots ,{{\bf{v}}_k}\) span the same subspace as that spanned by \({{\bf{x}}_1}, \ldots ,{{\bf{x}}_k}\).

c. If \(A = QR\), where \(Q\) has orthonormal columns, then \(R = {Q^T}A\).

In Exercises 11 and 12, find the closest point to \({\bf{y}}\) in the subspace \(W\) spanned by \({{\bf{v}}_1}\) and \({{\bf{v}}_2}\).

12. \({\bf{y}} = \left[ {\begin{matrix} 3 \\ { - 1} \\ 1 \\ {13} \end{matrix}} \right]\), \({{\bf{v}}_1} = \left[ {\begin{matrix} 1 \\ { - 2} \\ { - 1} \\ 2 \end{matrix}} \right]\), \({{\bf{v}}_2} = \left[ {\begin{matrix} { - 4} \\ 1 \\ 0 \\ 3 \end{matrix}} \right]\)

Find an orthogonal basis for the column space of each matrix in Exercises 9-12.

11. \(\left( {\begin{matrix} 1&2&5 \\ { - 1}&1&{ - 4} \\ { - 1}&4&{ - 3} \\ 1&{ - 4}&7 \\ 1&2&1 \end{matrix}} \right)\)

In Exercises 9-12, find (a) the orthogonal projection of \({\bf{b}}\) onto \({\rm{Col}}\,A\) and (b) a least-squares solution of \(A{\bf{x}} = {\bf{b}}\).

12. \(A = \left[ {\begin{matrix} 1&1&0 \\ 1&0&{ - 1} \\ 0&1&1 \\ { - 1}&1&{ - 1} \end{matrix}} \right]\), \({\bf{b}} = \left( {\begin{matrix} 2 \\ 5 \\ 6 \\ 6 \end{matrix}} \right)\)

Compute the quantities in Exercises 1-8 using the vectors

\({\bf{u}} = \left( {\begin{matrix} { - 1} \\ 2 \end{matrix}} \right),\ {\bf{v}} = \left( {\begin{matrix} 4 \\ 6 \end{matrix}} \right),\ {\bf{w}} = \left( {\begin{matrix} 3 \\ { - 1} \\ { - 5} \end{matrix}} \right),\ {\bf{x}} = \left( {\begin{matrix} 6 \\ { - 2} \\ 3 \end{matrix}} \right)\)

7. \(\left\| {\bf{w}} \right\|\)
