
Exercises 3–8 refer to \({{\bf{P}}_{\bf{2}}}\) with the inner product given by evaluation at \(-1\), 0, and 1. (See Example 2.)

7. Compute the orthogonal projection of q onto the subspace spanned by p, for p and q in Exercise 3.

Short Answer


The orthogonal projection is \(\frac{{56}}{{25}} + \frac{{14}}{{25}}t\).

Step by step solution

Step 1: Write the results from Exercise 3

From Exercise 3, \(p\left( t \right) = 4 + t\) and \(q\left( t \right) = 5 - 4{t^2}\). Evaluating each polynomial at \(-1\), 0, and 1 gives:

\(\begin{align*}p\left( { - 1} \right) &= 4 - 1\\ &= 3\end{align*}\)

\(\begin{align*}p\left( 0 \right) &= 4 + 0\\ &= 4\end{align*}\)

\(\begin{align*}p\left( 1 \right) &= 4 + 1\\ &= 5\end{align*}\)

And,

\(\begin{align*}q\left( { - 1} \right) &= 5 - 4{\left( { - 1} \right)^2}\\ &= 5 - 4\\ &= 1\end{align*}\)

\(\begin{align*}q\left( 0 \right) &= 5 - 4{\left( 0 \right)^2}\\ &= 5\end{align*}\)

\(\begin{align*}q\left( 1 \right) &= 5 - 4{\left( 1 \right)^2}\\ &= 5 - 4\\ &= 1\end{align*}\)
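As a quick check, these evaluations can be reproduced in a few lines of Python (assuming, per Exercise 3, \(p\left( t \right) = 4 + t\) and \(q\left( t \right) = 5 - 4{t^2}\)):

```python
# Evaluate p(t) = 4 + t and q(t) = 5 - 4t^2 (from Exercise 3)
# at the points -1, 0, 1 that define the inner product on P_2.
def p(t):
    return 4 + t

def q(t):
    return 5 - 4 * t**2

points = [-1, 0, 1]
print([p(t) for t in points])  # [3, 4, 5]
print([q(t) for t in points])  # [1, 5, 1]
```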

Step 2: Find the inner product of q and p

The inner product \(\left\langle {q,p} \right\rangle \) can be calculated as follows:

\(\begin{align*}\left\langle {q,p} \right\rangle & = \left\langle {p,q} \right\rangle \\ &= p\left( { - 1} \right)q\left( { - 1} \right) + p\left( 0 \right)q\left( 0 \right) + p\left( 1 \right)q\left( 1 \right)\\ &= \left( 3 \right)\left( 1 \right) + \left( 4 \right)\left( 5 \right) + \left( 5 \right)\left( 1 \right)\\ &= 28\end{align*}\)

Step 3: Find the inner product of p with itself

The inner product \(\left\langle {p,p} \right\rangle \) can be calculated as follows:

\(\begin{align*}\left\langle {p,p} \right\rangle &= p\left( { - 1} \right)p\left( { - 1} \right) + p\left( 0 \right)p\left( 0 \right) + p\left( 1 \right)p\left( 1 \right)\\ &= \left( 3 \right)\left( 3 \right) + \left( 4 \right)\left( 4 \right) + \left( 5 \right)\left( 5 \right)\\ &= 9 + 16 + 25\\ &= 50\end{align*}\)
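Because this inner product is just the ordinary dot product of the value vectors \(\left( {p( - 1),p(0),p(1)} \right)\) and \(\left( {q( - 1),q(0),q(1)} \right)\), both numbers are easy to verify:

```python
# <q, p> and <p, p> as dot products of the value vectors.
p_vals = [3, 4, 5]   # (p(-1), p(0), p(1))
q_vals = [1, 5, 1]   # (q(-1), q(0), q(1))

qp = sum(a * b for a, b in zip(q_vals, p_vals))  # 3 + 20 + 5
pp = sum(a * a for a in p_vals)                  # 9 + 16 + 25
print(qp, pp)  # 28 50
```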

Step 4: Find the orthogonal projection of q onto the subspace spanned by p

The orthogonal projection can be calculated as follows:

\[\begin{align*}\hat q &= \frac{{\left\langle {q,p} \right\rangle }}{{\left\langle {p,p} \right\rangle }}p\\ &= \frac{{28}}{{50}}\left( {4 + t} \right)\\&= \frac{{56}}{{25}} + \frac{{14}}{{25}}t\end{align*}\]

Thus, the orthogonal projection is \(\frac{{56}}{{25}} + \frac{{14}}{{25}}t\).
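A short sketch with exact rational arithmetic confirms the coefficients of \(\hat q\):

```python
from fractions import Fraction

# proj_p(q) = (<q,p> / <p,p>) * p, with p(t) = 4 + t.
c = Fraction(28, 50)       # reduces to 14/25
coeffs = [c * 4, c * 1]    # constant term, then t-coefficient
print(coeffs)              # [Fraction(56, 25), Fraction(14, 25)], i.e. 56/25 + (14/25)t
```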


Most popular questions from this chapter

In Exercises 13 and 14, find the best approximation to \({\bf{z}}\) by vectors of the form \({c_1}{{\bf{v}}_1} + {c_2}{{\bf{v}}_2}\).

13. \({\bf{z}} = \left[ {\begin{matrix}3\\{ - 7}\\2\\3\end{matrix}} \right]\), \({{\bf{v}}_1} = \left[ {\begin{matrix}2\\{ - 1}\\{ - 3}\\1\end{matrix}} \right]\), \({{\bf{v}}_2} = \left[ {\begin{matrix}1\\1\\0\\{ - 1}\end{matrix}} \right]\)
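For orientation, a minimal NumPy sketch (not part of the textbook solution) finds the best approximation by solving the corresponding least-squares problem; here \({{\bf{v}}_1}\) and \({{\bf{v}}_2}\) happen to be orthogonal, so the coefficients agree with the projection formula:

```python
import numpy as np

# Best approximation to z by c1*v1 + c2*v2: solve the least-squares
# problem min ||z - A c|| with A = [v1 v2], then form z_hat = A c.
z  = np.array([3.0, -7.0, 2.0, 3.0])
v1 = np.array([2.0, -1.0, -3.0, 1.0])
v2 = np.array([1.0, 1.0, 0.0, -1.0])

A = np.column_stack([v1, v2])
c, *_ = np.linalg.lstsq(A, z, rcond=None)
print(c)       # c1 = 2/3, c2 = -7/3 (v1 and v2 are orthogonal here)
print(A @ c)   # the best approximation z_hat
```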

Compute the least-squares error associated with the least-squares solution found in Exercise 4.

Let \(U\) be an \(n \times n\) orthogonal matrix. Show that if \(\left\{ {{{\bf{v}}_1}, \ldots ,{{\bf{v}}_n}} \right\}\) is an orthonormal basis for \({\mathbb{R}^n}\), then so is \(\left\{ {U{{\bf{v}}_1}, \ldots ,U{{\bf{v}}_n}} \right\}\).
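A numeric illustration of the claim (a sketch, not the requested proof): generate a random orthogonal \(U\) and a random orthonormal basis, and check that the images remain orthonormal.

```python
import numpy as np

# If U is orthogonal and the columns of V are orthonormal, then
# (U v_i).(U v_j) = v_i.(U^T U) v_j = v_i.v_j, so UV also has
# orthonormal columns. Check numerically for n = 4.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal U
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # orthonormal basis as columns
W = U @ V
print(np.allclose(W.T @ W, np.eye(4)))  # True
```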

Given \(A = QR\) as in Theorem 12, describe how to find an orthogonal \(m \times m\) (square) matrix \({Q_1}\) and an invertible \(n \times n\) upper triangular matrix \(R\) such that

\(A = {Q_1}\left[ {\begin{matrix}R\\0\end{matrix}} \right]\)

The MATLAB qr command supplies this “full” QR factorization when rank \(A = n\).

Given data for a least-squares problem, \(\left( {{x_1},{y_1}} \right), \ldots ,\left( {{x_n},{y_n}} \right)\), the following abbreviations are helpful:

\(\sum x = \sum\nolimits_{i = 1}^n {{x_i}} ,\quad \sum {{x^2}} = \sum\nolimits_{i = 1}^n {x_i^2} ,\quad \sum y = \sum\nolimits_{i = 1}^n {{y_i}} ,\quad \sum {xy} = \sum\nolimits_{i = 1}^n {{x_i}{y_i}} \)

The normal equations for a least-squares line \(y = {\hat \beta _0} + {\hat \beta _1}x\) may be written in the form

\(\begin{aligned}n{{\hat \beta }_0} + {{\hat \beta }_1}\sum x &= \sum y \\{{\hat \beta }_0}\sum x + {{\hat \beta }_1}\sum {{x^2}} &= \sum {xy} \end{aligned}\quad (7)\)

16. Use a matrix inverse to solve the system of equations in (7) and thereby obtain formulas for \({\hat \beta _0}\) and \({\hat \beta _1}\) that appear in many statistics texts.
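As a sketch of what Exercise 16 asks for, the \(2 \times 2\) system (7) can be solved with a matrix inverse in NumPy (the sample data below is made up for illustration):

```python
import numpy as np

# Normal equations (7):  [ n      sum_x  ] [b0]   [ sum_y  ]
#                        [ sum_x  sum_x2 ] [b1] = [ sum_xy ]
def least_squares_line(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    A = np.array([[n, x.sum()], [x.sum(), (x**2).sum()]])
    b = np.array([y.sum(), (x * y).sum()])
    return np.linalg.inv(A) @ b  # (beta0_hat, beta1_hat)

# Made-up sample data for illustration:
print(least_squares_line([0, 1, 2, 3], [1, 3, 4, 6]))  # [1.1, 1.6]
```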
