
In Exercises 17–24, \(A\) is an \(m \times n\) matrix with a singular value decomposition \(A = U\Sigma {V^T}\) , where \(U\) is an \(m \times m\) orthogonal matrix, \({\bf{\Sigma }}\) is an \(m \times n\) “diagonal” matrix with \(r\) positive entries and no negative entries, and \(V\) is an \(n \times n\) orthogonal matrix. Justify each answer.

19. Show that the columns of \(V\) are eigenvectors of \({A^T}A\), that the columns of \(U\) are eigenvectors of \(A{A^T}\), and that the diagonal entries of \({\bf{\Sigma }}\) are the singular values of \(A\). (Hint: Use the SVD to compute \({A^T}A\) and \(A{A^T}\).)

Short Answer


It is verified that the columns of \(V\) are eigenvectors of \({A^T}A\), the columns of \(U\) are eigenvectors of \(A{A^T}\), and the diagonal entries of \(\Sigma \) are the singular values of \(A\).

Step by step solution

Step 1: Find the product of \({A^T}\) and \(A\)

Since the singular value decomposition of \(A\) is \(A = U\Sigma {V^T}\), compute \({A^T}A\):

\(\begin{array}{c}{A^T}A = {\left( {U\Sigma {V^T}} \right)^T}U\Sigma {V^T}\\ = V{\Sigma ^T}{U^T}U\Sigma {V^T}\\ = V\left( {{\Sigma ^T}\Sigma } \right){V^T}\\ = V\left( {{\Sigma ^T}\Sigma } \right){V^{ - 1}}\end{array}\)

Here \({U^T}U = I\) because \(U\) is orthogonal, and \({V^T} = {V^{ - 1}}\) because \(V\) is orthogonal. Since \({\Sigma ^T}\Sigma \) is diagonal, \(V\) diagonalizes \({A^T}A\), so by the Diagonalization Theorem the columns of \(V\) are eigenvectors of \({A^T}A\). Moreover, the diagonal entries of \({\Sigma ^T}\Sigma \) are the squares of the diagonal entries of \(\Sigma \), so the diagonal entries of \(\Sigma \) are the square roots of the eigenvalues of \({A^T}A\), that is, the singular values of \(A\).
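A quick numerical sanity check of this step (not part of the textbook solution): the NumPy sketch below uses an arbitrary \(2 \times 3\) example matrix, computes its SVD with numpy.linalg.svd, and verifies that each column \({{\rm{v}}_i}\) of \(V\) satisfies \({A^T}A{{\rm{v}}_i} = \sigma _i^2{{\rm{v}}_i}\).

```python
# Illustrative check only: columns of V are eigenvectors of A^T A
# with eigenvalues sigma_i^2 (example matrix chosen arbitrarily).
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])

U, s, Vt = np.linalg.svd(A)   # A = U @ diag(s) @ Vt, s holds the singular values
V = Vt.T

AtA = A.T @ A
for i, sigma in enumerate(s):
    v = V[:, i]
    # (A^T A) v_i should equal sigma_i^2 * v_i
    assert np.allclose(AtA @ v, sigma**2 * v)
print("Columns of V are eigenvectors of A^T A with eigenvalues sigma_i^2.")
```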

Step 2: Find the product of \(A\) and \({A^T}\)

Now, find the product of \(A = U\Sigma {V^T}\) and its transpose.

\(\begin{array}{c}A{A^T} = U\Sigma {V^T}{\left( {U\Sigma {V^T}} \right)^T}\\ = U\Sigma {V^T}V{\Sigma ^T}{U^T}\\ = U\left( {\Sigma {\Sigma ^T}} \right){U^T}\\ = U\left( {\Sigma {\Sigma ^T}} \right){U^{ - 1}}\end{array}\)

Here \({V^T}V = I\) and \({U^T} = {U^{ - 1}}\) because \(V\) and \(U\) are orthogonal, and \(\Sigma {\Sigma ^T}\) is diagonal. Thus \(U\) diagonalizes \(A{A^T}\), and the columns of \(U\) must be eigenvectors of \(A{A^T}\).
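The same kind of NumPy check can be run for this step (again only an illustration, with the same arbitrary example matrix): each column \({{\rm{u}}_i}\) of \(U\) should satisfy \(A{A^T}{{\rm{u}}_i} = \sigma _i^2{{\rm{u}}_i}\).

```python
# Illustrative check only: columns of U are eigenvectors of A A^T
# with the same eigenvalues sigma_i^2.
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])

U, s, Vt = np.linalg.svd(A)

AAt = A @ A.T
for i, sigma in enumerate(s):
    u = U[:, i]
    # (A A^T) u_i should equal sigma_i^2 * u_i
    assert np.allclose(AAt @ u, sigma**2 * u)
print("Columns of U are eigenvectors of A A^T with eigenvalues sigma_i^2.")
```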


