
Show that if \(\lambda_{1}\) and \(\lambda_{2}\) are eigenvalues of a Hermitian matrix \(\mathbf{A},\) and if \(\lambda_{1} \neq \lambda_{2},\) then the corresponding eigenvectors \(\mathbf{x}^{(1)}\) and \(\mathbf{x}^{(2)}\) are orthogonal. Hint: Use the results of Problems 31 and 32 to show that \(\left(\lambda_{1}-\lambda_{2}\right)\left(\mathbf{x}^{(1)}, \mathbf{x}^{(2)}\right)=0\)

Short Answer

Question: Show that the eigenvectors corresponding to distinct eigenvalues of a Hermitian matrix are orthogonal. Answer: To prove that the eigenvectors corresponding to distinct eigenvalues of a Hermitian matrix are orthogonal, we showed that the inner product \((\mathbf{x}^{(1)}, \mathbf{x}^{(2)}) = 0\) for eigenvectors \(\mathbf{x}^{(1)}\) and \(\mathbf{x}^{(2)}\) with distinct eigenvalues \(\lambda_{1}\) and \(\lambda_{2}\). We used properties of Hermitian matrices, such as \(\mathbf{A} = \mathbf{A^{\dagger}}\) and the fact that the eigenvalues are real, to establish this result.

Step by step solution

01

Apply A to the eigenvectors

For each eigenvector, we can use the property that for a matrix \(\mathbf{A}\), its eigenvalue \(\lambda\) and corresponding eigenvector \(\mathbf{x}\) satisfy \(\mathbf{A}\mathbf{x} = \lambda\mathbf{x}\). Therefore: $$\mathbf{A}\mathbf{x}^{(1)} = \lambda_{1}\mathbf{x}^{(1)}$$ $$\mathbf{A}\mathbf{x}^{(2)} = \lambda_{2}\mathbf{x}^{(2)}$$
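A minimal numerical sketch of this step in plain Python: the \(2 \times 2\) Hermitian matrix and its hand-computed eigenpairs below are illustrative assumptions, not part of the original problem.

```python
def mat_vec(A, x):
    """Multiply a matrix (given as a list of rows) by a column vector."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# A sample Hermitian matrix: it equals its own conjugate transpose.
A = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]

# Eigenpairs found by hand from det(A - lambda*I) = 0.
pairs = [(1.0, [-1 + 1j, 1 + 0j]),
         (4.0, [1 + 0j, 1 + 1j])]

# Verify the eigenvalue equation A x = lambda x for each pair.
for lam, x in pairs:
    Ax = mat_vec(A, x)
    assert all(abs(a - lam * xi) < 1e-12 for a, xi in zip(Ax, x))
```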
02

Create equation using inner product

Compute the inner product \((\mathbf{x}^{(1)}, \mathbf{A}\mathbf{x}^{(2)})\) and the inner product \((\mathbf{A}\mathbf{x}^{(1)}, \mathbf{x}^{(2)})\), so that we eventually get an equation involving \((\mathbf{x}^{(1)}, \mathbf{x}^{(2)})\). To do so, we'll use the results from Step 1: $$(\mathbf{x}^{(1)}, \mathbf{A}\mathbf{x}^{(2)}) = (\mathbf{x}^{(1)}, \lambda_{2}\mathbf{x}^{(2)})$$ $$(\mathbf{A}\mathbf{x}^{(1)}, \mathbf{x}^{(2)}) = (\lambda_{1}\mathbf{x}^{(1)}, \mathbf{x}^{(2)})$$
03

Use Hermitian property

Use the property that \(\mathbf{A} = \mathbf{A^{\dagger}}\) to rewrite the second inner product. Here the inner product is \((\mathbf{u}, \mathbf{v}) = \mathbf{u}^{\dagger}\mathbf{v}\), and the conjugate transpose of a product of two matrices is the product of the conjugate transposes in reversed order, so \(\left(\mathbf{A}\mathbf{x}^{(1)}\right)^{\dagger} = \left(\mathbf{x}^{(1)}\right)^{\dagger}\mathbf{A^{\dagger}}\): $$\left(\mathbf{A}\mathbf{x}^{(1)}, \mathbf{x}^{(2)}\right) = \left(\mathbf{x}^{(1)}\right)^{\dagger}\mathbf{A^{\dagger}}\mathbf{x}^{(2)} = \left(\mathbf{x}^{(1)}\right)^{\dagger}\mathbf{A}\mathbf{x}^{(2)}$$
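This step can be spot-checked numerically under the same \((\mathbf{u}, \mathbf{v}) = \mathbf{u}^{\dagger}\mathbf{v}\) convention: for a Hermitian matrix, moving \(\mathbf{A}\) from one slot of the inner product to the other leaves the value unchanged. The matrix and test vectors below are illustrative assumptions.

```python
def inner(u, v):
    """(u, v) = u^dagger v: conjugate the first argument."""
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# Sample Hermitian matrix and arbitrary complex test vectors.
A = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]
u = [1 + 2j, 3 - 1j]
v = [-2 + 1j, 0 + 4j]

# (A u, v) = (u, A v) because (A u)^dagger = u^dagger A^dagger = u^dagger A.
assert abs(inner(mat_vec(A, u), v) - inner(u, mat_vec(A, v))) < 1e-12
```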
04

Equate the expressions from Step 2

From Step 2, we have: $$\left(\mathbf{x}^{(1)}, \lambda_{2}\mathbf{x}^{(2)}\right) = \left(\mathbf{x}^{(1)}\right)^{\dagger}\mathbf{A}\mathbf{x}^{(2)}$$ $$\left(\lambda_{1}\mathbf{x}^{(1)}, \mathbf{x}^{(2)}\right) = \left(\mathbf{x}^{(1)}\right)^{\dagger}\mathbf{A}\mathbf{x}^{(2)}$$ Now equate these expressions: $$\left(\mathbf{x}^{(1)}, \lambda_{2}\mathbf{x}^{(2)}\right) = \left(\lambda_{1}\mathbf{x}^{(1)}, \mathbf{x}^{(2)}\right)$$
05

Rearrange and simplify

Pull the eigenvalues out of the inner products in the equation from Step 4: \(\lambda_{2}\) comes out of the second slot directly, while \(\lambda_{1}\) comes out of the first slot as \(\overline{\lambda_{1}} = \lambda_{1}\), since the eigenvalues of a Hermitian matrix are real (the result of Problems 31 and 32 cited in the hint). This gives \(\lambda_{2}\left(\mathbf{x}^{(1)}, \mathbf{x}^{(2)}\right) = \lambda_{1}\left(\mathbf{x}^{(1)}, \mathbf{x}^{(2)}\right)\), and rearranging with the common factor \((\mathbf{x}^{(1)}, \mathbf{x}^{(2)})\): $$(\lambda_{1} - \lambda_{2})\left(\mathbf{x}^{(1)}, \mathbf{x}^{(2)}\right) = 0$$
06

Conclusion

Since \(\lambda_{1} \neq \lambda_{2}\) by hypothesis, the factor \((\lambda_{1} - \lambda_{2})\) is nonzero. The only way for the equation from Step 5 to hold is if the inner product \((\mathbf{x}^{(1)}, \mathbf{x}^{(2)}) = 0\). Therefore the eigenvectors \(\mathbf{x}^{(1)}\) and \(\mathbf{x}^{(2)}\) corresponding to distinct eigenvalues \(\lambda_{1}\) and \(\lambda_{2}\) of a Hermitian matrix \(\mathbf{A}\) are orthogonal.
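As a closing sanity check, the conclusion can be verified numerically for an assumed sample Hermitian matrix (the same illustrative \(2 \times 2\) matrix and hand-computed eigenvectors used in the sketches above, not data from the original problem):

```python
def inner(u, v):
    """(u, v) = u^dagger v: conjugate the first argument."""
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

# Hand-computed eigenvectors of the sample Hermitian matrix
# [[2, 1-1j], [1+1j, 3]] for its distinct eigenvalues 1 and 4.
x1 = [-1 + 1j, 1 + 0j]   # eigenvector for eigenvalue 1
x2 = [1 + 0j, 1 + 1j]    # eigenvector for eigenvalue 4

# Distinct eigenvalues of a Hermitian matrix give orthogonal eigenvectors.
assert abs(inner(x1, x2)) < 1e-12
```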


Key Concepts

These are the key concepts you need to understand to accurately answer the question.

Eigenvectors Orthogonality
Understanding the orthogonality of eigenvectors associated with a Hermitian matrix is a fundamental concept in linear algebra. When we speak of eigenvectors being orthogonal, it means that their inner product is zero. This particular feature holds true for eigenvectors corresponding to distinct eigenvalues of a Hermitian matrix.

Let's delve into why this happens. Given a Hermitian matrix, denoted by \( \mathbf{A} \), we have two eigenvalues, \( \lambda_{1} \) and \( \lambda_{2} \) with their associated eigenvectors, \( \mathbf{x}^{(1)} \) and \( \mathbf{x}^{(2)} \). If \( \lambda_{1} \) is not equal to \( \lambda_{2} \) and supposing that \( \mathbf{A} \) is indeed Hermitian—meaning \( \mathbf{A} = \mathbf{A}^\dagger \), where the dagger denotes the conjugate transpose—it can be shown through a series of inner product manipulations that the eigenvectors must be orthogonal.

The proof essentially reduces to demonstrating that \( (\mathbf{x}^{(1)}, \mathbf{x}^{(2)}) = 0 \), which signifies orthogonality. Since the inner product conforms to conjugate symmetry and linearity, those properties facilitate the proof. The orthogonality of eigenvectors is particularly valuable because it ensures that they can form a basis for the space in which they reside, which simplifies the matrix representation and its analysis.
Eigenvalue Equation
The eigenvalue equation is central to the study of linear transformations and matrices. It is written as \( \mathbf{A}\mathbf{x} = \lambda\mathbf{x} \), where \( \mathbf{A} \) is a square matrix, \( \mathbf{x} \) is an eigenvector, and \( \lambda \) is the corresponding eigenvalue. This equation states that when a matrix acts on an eigenvector, the result is simply a scalar multiple of that eigenvector.

For Hermitian matrices, the eigenvalues are always real numbers, a pivotal characteristic since it implies stable physical systems in fields like quantum mechanics and vibration analysis. In our example, the eigenvalue equation is used to show that two eigenvectors are orthogonal: manipulating the equation with the eigenvalues and the inner product's properties yields a final equation that, given distinct eigenvalues, can hold only if the inner product of the eigenvectors is zero.
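The reality of Hermitian eigenvalues can itself be spot-checked via the Rayleigh quotient \(\lambda = (\mathbf{x}, \mathbf{A}\mathbf{x})/(\mathbf{x}, \mathbf{x})\), which is real whenever \(\mathbf{A} = \mathbf{A}^{\dagger}\) and equals the eigenvalue at an eigenvector. A minimal sketch, assuming the same illustrative sample matrix as above:

```python
def inner(u, v):
    """(u, v) = u^dagger v: conjugate the first argument."""
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[2 + 0j, 1 - 1j],   # sample Hermitian matrix (illustrative choice)
     [1 + 1j, 3 + 0j]]

# At an eigenvector, the Rayleigh quotient (x, Ax)/(x, x) is the eigenvalue,
# and for a Hermitian matrix it is real.
x = [-1 + 1j, 1 + 0j]    # hand-computed eigenvector for eigenvalue 1
rayleigh = inner(x, mat_vec(A, x)) / inner(x, x)
assert abs(rayleigh.imag) < 1e-12
assert abs(rayleigh.real - 1.0) < 1e-12
```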

This result is profound; it simplifies many problems in applied mathematics, particularly those involving diagonalization of a matrix, which is the process of finding a matrix's eigenvalues and eigenvectors to convert it into a much simpler form.
Inner Product Properties
The inner product, often denoted as \( (\mathbf{u}, \mathbf{v}) \), is a crucial mathematical tool in vector spaces which provides a way to define geometric concepts like length and angle. In complex spaces, which include our Hermitian matrix scenario, the inner product is conjugate symmetric, meaning that \( (\mathbf{u}, \mathbf{v}) = \overline{(\mathbf{v}, \mathbf{u})} \), where the bar denotes complex conjugation. In the convention used in the solution above, \( (\mathbf{u}, \mathbf{v}) = \mathbf{u}^{\dagger}\mathbf{v} \), the inner product is conjugate-linear (antilinear) in its first argument and linear in the second, allowing scalar factors to be moved out of the product, picking up a complex conjugate when they come from the first slot.

This comes into play when we manipulate the eigenvalue equation to show the orthogonality of eigenvectors. By applying the properties of conjugate symmetry and linearity, we can take inner products involving eigenvectors and their associated eigenvalues to arrive at an equation that specifies the orthogonality condition. These properties ensure that our mathematics is consistent with the physical and geometric interpretations of vectors and their transformations under matrices like a Hermitian matrix.
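These properties are easy to verify directly. The sketch below uses the \(\mathbf{u}^{\dagger}\mathbf{v}\) convention of the step-by-step solution (conjugate-linear in the first slot, linear in the second); the vectors and the scalar are arbitrary illustrative choices.

```python
def inner(u, v):
    """(u, v) = u^dagger v: conjugate the first argument."""
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

u = [1 + 2j, 3 - 1j]   # arbitrary complex test vectors
v = [-2 + 1j, 0 + 4j]
c = 2 - 3j             # arbitrary complex scalar

# Conjugate symmetry: (u, v) is the complex conjugate of (v, u).
assert abs(inner(u, v) - inner(v, u).conjugate()) < 1e-12

# Linear in the second argument, conjugate-linear in the first.
assert abs(inner(u, [c * vi for vi in v]) - c * inner(u, v)) < 1e-12
assert abs(inner([c * ui for ui in u], v) - c.conjugate() * inner(u, v)) < 1e-12
```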


Most popular questions from this chapter

Let $$ \mathbf{J}=\left(\begin{array}{ccc}{\lambda} & {1} & {0} \\ {0} & {\lambda} & {1} \\ {0} & {0} & {\lambda}\end{array}\right) $$ where \(\lambda\) is an arbitrary real number. (a) Find \(\mathbf{J}^{2}, \mathbf{J}^{3},\) and \(\mathbf{J}^{4}\). (b) Use an inductive argument to show that $$ \mathbf{J}^{n}=\left(\begin{array}{ccc}{\lambda^{n}} & {n \lambda^{n-1}} & {[n(n-1) / 2] \lambda^{n-2}} \\ {0} & {\lambda^{n}} & {n \lambda^{n-1}} \\ {0} & {0} & {\lambda^{n}}\end{array}\right) $$ (c) Determine \(\exp(\mathbf{J}t)\). (d) Observe that if you choose \(\lambda=2\), then the matrix \(\mathbf{J}\) in this problem is the same as the matrix \(\mathbf{J}\) in Problem 17(f). Using the matrix \(\mathbf{T}\) from Problem 17(f), form the product \(\mathbf{T}\exp(\mathbf{J}t)\) with \(\lambda=2\). Observe that the resulting matrix is the same as the fundamental matrix \(\boldsymbol{\Psi}(t)\) in Problem 17(e).

Find all eigenvalues and eigenvectors of the given matrix. $$ \left(\begin{array}{ccc}{11 / 9} & {-2 / 9} & {8 / 9} \\ {-2 / 9} & {2 / 9} & {10 / 9} \\ {8 / 9} & {10 / 9} & {5 / 9}\end{array}\right) $$

Consider the equation $$ a y^{\prime \prime}+b y^{\prime}+c y=0 $$ where \(a, b,\) and \(c\) are constants. In Chapter 3 it was shown that the general solution depended on the roots of the characteristic equation $$ a r^{2}+b r+c=0 $$ (a) Transform Eq. (i) into a system of first order equations by letting \(x_{1}=y,\ x_{2}=y^{\prime}\). Find the system of equations \(\mathbf{x}^{\prime}=\mathbf{A}\mathbf{x}\) satisfied by \(\mathbf{x}=\left(\begin{array}{l}{x_{1}} \\ {x_{2}}\end{array}\right)\). (b) Find the equation that determines the eigenvalues of the coefficient matrix \(\mathbf{A}\) in part (a). Note that this equation is just the characteristic equation (ii) of Eq. (i).

In this problem we indicate how to show that \(\mathbf{u}(t)\) and \(\mathbf{v}(t)\), as given by Eqs. (9), are linearly independent. Let \(r_{1}=\lambda+i \mu\) and \(\bar{r}_{1}=\lambda-i \mu\) be a pair of conjugate eigenvalues of the coefficient matrix \(\mathbf{A}\) of Eq. (1); let \(\boldsymbol{\xi}^{(1)}=\mathbf{a}+i \mathbf{b}\) and \(\bar{\boldsymbol{\xi}}^{(1)}=\mathbf{a}-i \mathbf{b}\) be the corresponding eigenvectors. Recall that it was stated in Section 7.3 that if \(r_{1} \neq \bar{r}_{1},\) then \(\boldsymbol{\xi}^{(1)}\) and \(\bar{\boldsymbol{\xi}}^{(1)}\) are linearly independent. (a) First we show that \(\mathbf{a}\) and \(\mathbf{b}\) are linearly independent. Consider the equation \(c_{1} \mathbf{a}+c_{2} \mathbf{b}=\mathbf{0}\). Express \(\mathbf{a}\) and \(\mathbf{b}\) in terms of \(\boldsymbol{\xi}^{(1)}\) and \(\bar{\boldsymbol{\xi}}^{(1)},\) and then show that \(\left(c_{1}-i c_{2}\right) \boldsymbol{\xi}^{(1)}+\left(c_{1}+i c_{2}\right) \bar{\boldsymbol{\xi}}^{(1)}=\mathbf{0}\). (b) Show that \(c_{1}-i c_{2}=0\) and \(c_{1}+i c_{2}=0,\) and then that \(c_{1}=0\) and \(c_{2}=0.\) Consequently, \(\mathbf{a}\) and \(\mathbf{b}\) are linearly independent. (c) To show that \(\mathbf{u}(t)\) and \(\mathbf{v}(t)\) are linearly independent, consider the equation \(c_{1} \mathbf{u}\left(t_{0}\right)+c_{2} \mathbf{v}\left(t_{0}\right)=\mathbf{0}\), where \(t_{0}\) is an arbitrary point. Rewrite this equation in terms of \(\mathbf{a}\) and \(\mathbf{b},\) and then proceed as in part (b) to show that \(c_{1}=0\) and \(c_{2}=0.\) Hence \(\mathbf{u}(t)\) and \(\mathbf{v}(t)\) are linearly independent at the arbitrary point \(t_{0}\). Therefore they are linearly independent at every point and on every interval.

Consider a \(2 \times 2\) system \(\mathbf{x}^{\prime}=\mathbf{A} \mathbf{x}\). If we assume that \(r_{1} \neq r_{2}\), the general solution is \(\mathbf{x}=c_{1} \boldsymbol{\xi}^{(1)} e^{r_{1} t}+c_{2} \boldsymbol{\xi}^{(2)} e^{r_{2} t},\) provided that \(\boldsymbol{\xi}^{(1)}\) and \(\boldsymbol{\xi}^{(2)}\) are linearly independent. In this problem we establish the linear independence of \(\boldsymbol{\xi}^{(1)}\) and \(\boldsymbol{\xi}^{(2)}\) by assuming that they are linearly dependent, and then showing that this leads to a contradiction. (a) Note that \(\boldsymbol{\xi}^{(1)}\) satisfies the matrix equation \(\left(\mathbf{A}-r_{1} \mathbf{I}\right) \boldsymbol{\xi}^{(1)}=\mathbf{0}\); similarly, note that \(\left(\mathbf{A}-r_{2} \mathbf{I}\right) \boldsymbol{\xi}^{(2)}=\mathbf{0}\). (b) Show that \(\left(\mathbf{A}-r_{2} \mathbf{I}\right) \boldsymbol{\xi}^{(1)}=\left(r_{1}-r_{2}\right) \boldsymbol{\xi}^{(1)}\). (c) Suppose that \(\boldsymbol{\xi}^{(1)}\) and \(\boldsymbol{\xi}^{(2)}\) are linearly dependent. Then \(c_{1} \boldsymbol{\xi}^{(1)}+c_{2} \boldsymbol{\xi}^{(2)}=\mathbf{0}\) and at least one of \(c_{1}\) and \(c_{2}\) is not zero; suppose that \(c_{1} \neq 0\). Show that \(\left(\mathbf{A}-r_{2} \mathbf{I}\right)\left(c_{1} \boldsymbol{\xi}^{(1)}+c_{2} \boldsymbol{\xi}^{(2)}\right)=\mathbf{0},\) and also show that \(\left(\mathbf{A}-r_{2} \mathbf{I}\right)\left(c_{1} \boldsymbol{\xi}^{(1)}+c_{2} \boldsymbol{\xi}^{(2)}\right)=c_{1}\left(r_{1}-r_{2}\right) \boldsymbol{\xi}^{(1)}\). Hence \(c_{1}=0,\) which is a contradiction. Therefore \(\boldsymbol{\xi}^{(1)}\) and \(\boldsymbol{\xi}^{(2)}\) are linearly independent. (d) Modify the argument of part (c) in case \(c_{1}\) is zero but \(c_{2}\) is not. (e) Carry out a similar argument for the case in which the order \(n\) is equal to 3; note that the procedure can be extended to cover an arbitrary value of \(n\).
