
Prove Theorem 2(b) and 2(c). Use the row-column rule. The \(\left( {i,j} \right)\)-entry in \(A\left( {B + C} \right)\) can be written as \({a_{i1}}\left( {{b_{1j}} + {c_{1j}}} \right) + ... + {a_{in}}\left( {{b_{nj}} + {c_{nj}}} \right)\) or \(\sum\limits_{k = 1}^n {{a_{ik}}\left( {{b_{kj}} + {c_{kj}}} \right)} \).

Short Answer


Theorems 2(b) and 2(c) are proved.

Step by step solution

01

The row-column rule

If the product AB is defined, then the entry in row \(i\) and column \(j\) of AB is the sum of the products of corresponding entries from row \(i\) of A and column \(j\) of B. If \({\left( {AB} \right)_{ij}}\) denotes the \(\left( {i,j} \right)\)-entry in AB, and if A is an \(m \times n\) matrix, then

\({\left( {AB} \right)_{ij}} = {a_{i1}}{b_{1j}} + {a_{i2}}{b_{2j}} + ... + {a_{in}}{b_{nj}}\).
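This rule can be checked numerically. Here is a minimal sketch (not part of the textbook solution; the matrices are made up for illustration) that compares one entry computed term by term against NumPy's matrix product:

```python
import numpy as np

# Illustrative matrices only: A is 2 x 3, B is 3 x 2.
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])

i, j = 1, 0  # check the entry in row 2, column 1 (0-based indices)
entry_by_rule = sum(A[i, k] * B[k, j] for k in range(A.shape[1]))
assert entry_by_rule == (A @ B)[i, j]   # 4*7 + 5*9 + 6*11 = 139
```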

02

Prove Theorem 2(b)

Theorem 2 states: Let A be an \(m \times n\) matrix, and let B and C have sizes for which the indicated sums and products are defined.

  1. \(A\left( {BC} \right) = \left( {AB} \right)C\) (associative law of multiplication)
  2. \(A\left( {B + C} \right) = AB + AC\) (left distributive law)
  3. \(\left( {B + C} \right)A = BA + CA\) (right distributive law)

By the row-column rule, the \(\left( {i,j} \right)\)-entry in \(A\left( {B + C} \right)\) can be written as \({a_{i1}}\left( {{b_{1j}} + {c_{1j}}} \right) + ... + {a_{in}}\left( {{b_{nj}} + {c_{nj}}} \right)\), that is, \(\sum\limits_{k = 1}^n {{a_{ik}}\left( {{b_{kj}} + {c_{kj}}} \right)} \), because the \(\left( {k,j} \right)\)-entry of \(B + C\) is \({b_{kj}} + {c_{kj}}\).

Distributing each term and splitting the sum gives \(\sum\limits_{k = 1}^n {{a_{ik}}\left( {{b_{kj}} + {c_{kj}}} \right)} = \sum\limits_{k = 1}^n {{a_{ik}}{b_{kj}}} + \sum\limits_{k = 1}^n {{a_{ik}}{c_{kj}}} \), which is the \(\left( {i,j} \right)\)-entry of \(AB\) plus the \(\left( {i,j} \right)\)-entry of \(AC\). Thus, the \(\left( {i,j} \right)\)-entry of \(A\left( {B + C} \right)\) equals the \(\left( {i,j} \right)\)-entry of \(AB + AC\), so \(A\left( {B + C} \right) = AB + AC\).
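A quick numeric spot check of the identity (a sketch with arbitrary illustrative sizes, not part of the proof):

```python
import numpy as np

# A is m x n; B and C are n x p so that A(B + C) is defined.
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(2, 3))
B = rng.integers(-5, 5, size=(3, 4))
C = rng.integers(-5, 5, size=(3, 4))

assert np.array_equal(A @ (B + C), A @ B + A @ C)  # left distributive law
```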

03

Prove Theorem 2(c)

For \(\left( {B + C} \right)A\) to be defined, B and C must be \(p \times m\), so here the sum runs over the \(m\) rows of A. The \(\left( {i,j} \right)\)-entry of \(\left( {B + C} \right)A\) equals the \(\left( {i,j} \right)\)-entry of \(BA + CA\), since \(\sum\limits_{k = 1}^m {\left( {{b_{ik}} + {c_{ik}}} \right){a_{kj}}} = \sum\limits_{k = 1}^m {{b_{ik}}{a_{kj}}} + \sum\limits_{k = 1}^m {{c_{ik}}{a_{kj}}} \). Therefore, \(\left( {B + C} \right)A = BA + CA\).
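As with 2(b), a short numeric spot check (illustrative sizes only; note that B and C now sit on the left of A):

```python
import numpy as np

# A is m x n with m = 3, n = 4; B and C are p x m with p = 2.
rng = np.random.default_rng(1)
A = rng.integers(-5, 5, size=(3, 4))
B = rng.integers(-5, 5, size=(2, 3))
C = rng.integers(-5, 5, size=(2, 3))

assert np.array_equal((B + C) @ A, B @ A + C @ A)  # right distributive law
```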

Hence, Theorems 2(b) and 2(c) are proved.


Most popular questions from this chapter

In Exercises 27 and 28, view vectors in \({\mathbb{R}^n}\) as \(n \times 1\) matrices. For \({\mathop{\rm u}\nolimits} \) and \({\mathop{\rm v}\nolimits} \) in \({\mathbb{R}^n}\), the matrix product \({{\mathop{\rm u}\nolimits} ^T}v\) is a \(1 \times 1\) matrix, called the scalar product, or inner product, of u and v. It is usually written as a single real number without brackets. The matrix product \({{\mathop{\rm uv}\nolimits} ^T}\) is an \(n \times n\) matrix, called the outer product of u and v. The products \({{\mathop{\rm u}\nolimits} ^T}{\mathop{\rm v}\nolimits} \) and \({{\mathop{\rm uv}\nolimits} ^T}\) will appear later in the text.
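A small numerical illustration of the two products (the vectors below are made up for this sketch and are not the u and v of Exercise 27):

```python
import numpy as np

# Column vectors viewed as 3 x 1 matrices.
u = np.array([[1], [2], [3]])
v = np.array([[4], [5], [6]])

inner = u.T @ v   # 1 x 1 matrix; its single entry is the scalar (inner) product
outer = u @ v.T   # 3 x 3 matrix, the outer product

print(inner)        # [[32]]  since 1*4 + 2*5 + 3*6 = 32
print(outer.shape)  # (3, 3)
```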

27. Let \({\mathop{\rm u}\nolimits} = \left( {\begin{aligned}{*{20}{c}}{ - 2}\\3\\{ - 4}\end{aligned}} \right)\) and \({\mathop{\rm v}\nolimits} = \left( {\begin{aligned}{*{20}{c}}a\\b\\c\end{aligned}} \right)\). Compute \({{\mathop{\rm u}\nolimits} ^T}{\mathop{\rm v}\nolimits} \), \({{\mathop{\rm v}\nolimits} ^T}{\mathop{\rm u}\nolimits} \), \({{\mathop{\rm uv}\nolimits} ^T}\), and \({\mathop{\rm v}\nolimits} {{\mathop{\rm u}\nolimits} ^T}\).

Let \(A = \left( {\begin{aligned}{*{20}{c}}3&{ - 6}\\{ - 1}&2\end{aligned}} \right)\). Construct a \({\bf{2}} \times {\bf{2}}\) matrix B such that AB is the zero matrix. Use two different nonzero columns for B.

When a deep space probe is launched, corrections may be necessary to place the probe on a precisely calculated trajectory. Radio telemetry provides a stream of vectors, \({{\bf{x}}_{\bf{1}}}, \ldots ,{{\bf{x}}_k}\), giving information at different times about how the probe's position compares with its planned trajectory. Let \({X_k}\) be the matrix \(\left[ {{{\bf{x}}_{\bf{1}}} \cdots {{\bf{x}}_k}} \right]\). The matrix \({G_k} = {X_k}X_k^T\) is computed as the radar data are analyzed. When \({{\bf{x}}_{k + {\bf{1}}}}\) arrives, a new \({G_{k + {\bf{1}}}}\) must be computed. Since the data vectors arrive at high speed, the computational burden could be severe. But partitioned matrix multiplication helps tremendously. Compute the column-row expansions of \({G_k}\) and \({G_{k + {\bf{1}}}}\), and describe what must be computed in order to update \({G_k}\) to \({G_{k + {\bf{1}}}}\).
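The column-row (outer-product) expansion mentioned here can be illustrated numerically. The sketch below uses made-up data and only verifies the general identity \(X{X^T} = \sum\nolimits_k {{{\bf{x}}_k}{\bf{x}}_k^T} \); it is not the worked answer to the exercise:

```python
import numpy as np

# Made-up telemetry-style data: 5 column vectors x_1, ..., x_5 in R^3.
rng = np.random.default_rng(2)
X = rng.standard_normal((3, 5))

G = X @ X.T                                   # G_k = X_k X_k^T
G_expanded = sum(np.outer(X[:, k], X[:, k])   # column-row expansion: sum of outer products
                 for k in range(X.shape[1]))
assert np.allclose(G, G_expanded)
```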

Use partitioned matrices to prove by induction that for \(n = 2,3,...\), the \(n \times n\) matrix \(A\) shown below is invertible and \(B\) is its inverse.

\[A = \left[ {\begin{array}{*{20}{c}}1&0&0& \cdots &0\\1&1&0&{}&0\\1&1&1&{}&0\\ \vdots &{}&{}& \ddots &{}\\1&1&1& \ldots &1\end{array}} \right]\]

\[B = \left[ {\begin{array}{*{20}{c}}1&0&0& \cdots &0\\{ - 1}&1&0&{}&0\\0&{ - 1}&1&{}&0\\ \vdots &{}& \ddots & \ddots &{}\\0&{}& \ldots &{ - 1}&1\end{array}} \right]\]

For the induction step, assume A and B are \(\left( {k + 1} \right) \times \left( {k + 1} \right)\) matrices, and partition A and B in a form similar to that displayed in Exercise 23.
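A quick numeric sanity check for one fixed size (this is only a spot check at \(n = 4\), chosen arbitrarily, not the induction argument itself):

```python
import numpy as np

n = 4
A = np.tril(np.ones((n, n), dtype=int))                 # lower-triangular matrix of ones
B = np.eye(n, dtype=int) - np.eye(n, k=-1, dtype=int)   # 1 on the diagonal, -1 just below it

assert np.array_equal(A @ B, np.eye(n, dtype=int))      # AB = I, so B is the inverse of A
```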

Exercises 15 and 16 concern arbitrary matrices A, B, and C for which the indicated sums and products are defined. Mark each statement True or False. Justify each answer.

15. a. If A and B are \({\bf{2}} \times {\bf{2}}\) with columns \({{\bf{a}}_1},{{\bf{a}}_2}\) and \({{\bf{b}}_1},{{\bf{b}}_2}\) respectively, then \(AB = \left( {\begin{aligned}{*{20}{c}}{{{\bf{a}}_1}{{\bf{b}}_1}}&{{{\bf{a}}_2}{{\bf{b}}_2}}\end{aligned}} \right)\).

b. Each column of AB is a linear combination of the columns of B using weights from the corresponding column of A.

c. \(AB + AC = A\left( {B + C} \right)\)

d. \({A^T} + {B^T} = {\left( {A + B} \right)^T}\)

e. The transpose of a product of matrices equals the product of their transposes in the same order.
