Let \(Q_{1}\) and \(Q_{2}\) be two nonnegative quadratic forms in the observations of a random sample from a distribution which is \(N\left(0, \sigma^{2}\right)\). Show that another quadratic form \(Q\) is independent of \(Q_{1}+Q_{2}\) if and only if \(Q\) is independent of each of \(Q_{1}\) and \(Q_{2}\). Hint: Consider the orthogonal transformation that diagonalizes the matrix of \(Q_{1}+Q_{2}\). After this transformation, what are the forms of the matrices of \(Q\), \(Q_{1}\), and \(Q_{2}\) if \(Q\) and \(Q_{1}+Q_{2}\) are independent?

Short Answer

The quadratic form \(Q\) is independent of \(Q_1+Q_2\) if and only if \(Q\) is independent of each of \(Q_1\) and \(Q_2\). Writing \(\boldsymbol{A}\), \(\boldsymbol{A}_1\), \(\boldsymbol{A}_2\) for the matrices of \(Q\), \(Q_1\), \(Q_2\), independence of \(Q\) and \(Q_1+Q_2\) means \(\boldsymbol{A}(\boldsymbol{A}_1+\boldsymbol{A}_2)=\boldsymbol{0}\); because \(\boldsymbol{A}_1\) and \(\boldsymbol{A}_2\) are nonnegative definite, diagonalizing \(\boldsymbol{A}_1+\boldsymbol{A}_2\) by an orthogonal transformation shows that this is equivalent to \(\boldsymbol{A}\boldsymbol{A}_1 = \boldsymbol{A}\boldsymbol{A}_2 = \boldsymbol{0}\), i.e., to \(Q\) being independent of each of \(Q_1\) and \(Q_2\).

Step by step solution

01

Understand independence of quadratic forms

For a random sample \(\boldsymbol{X}\) from \(N(0, \sigma^2)\), two quadratic forms \(\boldsymbol{X}'\boldsymbol{A}\boldsymbol{X}\) and \(\boldsymbol{X}'\boldsymbol{B}\boldsymbol{X}\) (with \(\boldsymbol{A}\), \(\boldsymbol{B}\) symmetric) are independent if and only if \(\boldsymbol{A}\boldsymbol{B} = \boldsymbol{0}\) (Craig's theorem). Write \(Q = \boldsymbol{X}'\boldsymbol{A}\boldsymbol{X}\), \(Q_1 = \boldsymbol{X}'\boldsymbol{A}_1\boldsymbol{X}\), and \(Q_2 = \boldsymbol{X}'\boldsymbol{A}_2\boldsymbol{X}\), where \(\boldsymbol{A}_1\) and \(\boldsymbol{A}_2\) are nonnegative definite because \(Q_1\) and \(Q_2\) are nonnegative quadratic forms. One direction is immediate: if \(\boldsymbol{A}\boldsymbol{A}_1 = \boldsymbol{A}\boldsymbol{A}_2 = \boldsymbol{0}\), then \(\boldsymbol{A}(\boldsymbol{A}_1+\boldsymbol{A}_2) = \boldsymbol{0}\), so \(Q\) is independent of \(Q_1+Q_2\). The substance of the exercise is the converse: \(\boldsymbol{A}(\boldsymbol{A}_1+\boldsymbol{A}_2) = \boldsymbol{0}\) must force \(\boldsymbol{A}\boldsymbol{A}_1 = \boldsymbol{A}\boldsymbol{A}_2 = \boldsymbol{0}\).
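The criterion can be stated compactly (a sketch; the matrix notation \(\boldsymbol{A}\), \(\boldsymbol{A}_1\), \(\boldsymbol{A}_2\) for the matrices of \(Q\), \(Q_1\), \(Q_2\) follows the convention of the related exercises):

```latex
% Craig's theorem: for X ~ N(0, sigma^2 I) and symmetric A, B,
%   X'AX and X'BX are independent  <=>  AB = 0.
% With Q = X'AX, Q_1 = X'A_1X, Q_2 = X'A_2X, the exercise becomes
\boldsymbol{A}(\boldsymbol{A}_1 + \boldsymbol{A}_2) = \boldsymbol{0}
\iff
\boldsymbol{A}\boldsymbol{A}_1 = \boldsymbol{0}
\ \text{ and }\
\boldsymbol{A}\boldsymbol{A}_2 = \boldsymbol{0},
% which holds because A_1 and A_2 are nonnegative definite.
```

The forward implication of this equivalence is what the orthogonal transformation of the hint establishes; the reverse implication is trivial matrix addition.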
02

Diagonalization using an orthogonal transformation

Let \(P\) be an orthogonal matrix that diagonalizes \(\boldsymbol{A}_1+\boldsymbol{A}_2\), so that \(P'(\boldsymbol{A}_1+\boldsymbol{A}_2)P = \operatorname{diag}(d_1,\ldots,d_r,0,\ldots,0)\) with each \(d_i > 0\), where \(r\) is the rank of \(\boldsymbol{A}_1+\boldsymbol{A}_2\). The transformed variables \(\boldsymbol{W} = P'\boldsymbol{X}\) are again a random sample from \(N(0,\sigma^2)\). Because \(\boldsymbol{A}_1\) and \(\boldsymbol{A}_2\) are nonnegative definite and their transformed sum vanishes outside the upper-left \(r \times r\) block, each of \(P'\boldsymbol{A}_1P\) and \(P'\boldsymbol{A}_2P\) must itself vanish outside that block: a nonnegative definite matrix with a zero diagonal entry has a zero row and column at that position.
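In block form, with \(D = \operatorname{diag}(d_1,\ldots,d_r)\) (a sketch of the block structure the hint asks for):

```latex
P'(\boldsymbol{A}_1+\boldsymbol{A}_2)P =
\begin{pmatrix} D & \boldsymbol{0} \\ \boldsymbol{0} & \boldsymbol{0} \end{pmatrix},
\qquad
P'\boldsymbol{A}_iP =
\begin{pmatrix} B_i & \boldsymbol{0} \\ \boldsymbol{0} & \boldsymbol{0} \end{pmatrix}
\quad (i = 1, 2),
% the off-block entries of P'A_iP vanish because A_i is nonnegative
% definite and its diagonal entries outside the first r positions are 0.
```

Thus, after the transformation, \(Q_1\) and \(Q_2\) involve only the first \(r\) of the transformed variables.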
03

Conclude independence from the transformed forms

If \(Q\) is independent of \(Q_1+Q_2\), then \(\boldsymbol{A}(\boldsymbol{A}_1+\boldsymbol{A}_2) = \boldsymbol{0}\), so the transformed matrix \(\boldsymbol{B} = P'\boldsymbol{A}P\) satisfies \(\boldsymbol{B}\operatorname{diag}(d_1,\ldots,d_r,0,\ldots,0) = \boldsymbol{0}\). Since each \(d_i > 0\), the first \(r\) columns of \(\boldsymbol{B}\) are zero, and by symmetry so are its first \(r\) rows. But \(P'\boldsymbol{A}_1P\) and \(P'\boldsymbol{A}_2P\) have nonzero entries only in those first \(r\) rows and columns, so \(\boldsymbol{B}\left(P'\boldsymbol{A}_1P\right) = \boldsymbol{B}\left(P'\boldsymbol{A}_2P\right) = \boldsymbol{0}\), i.e., \(\boldsymbol{A}\boldsymbol{A}_1 = \boldsymbol{A}\boldsymbol{A}_2 = \boldsymbol{0}\). Hence \(Q\) is independent of each of \(Q_1\) and \(Q_2\). Conversely, \(\boldsymbol{A}\boldsymbol{A}_1 = \boldsymbol{A}\boldsymbol{A}_2 = \boldsymbol{0}\) gives \(\boldsymbol{A}(\boldsymbol{A}_1+\boldsymbol{A}_2) = \boldsymbol{0}\) at once, so \(Q\) is independent of \(Q_1+Q_2\).
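As a numeric sanity check (a sketch assuming NumPy is available; the particular matrices and the random rotation are illustrative, not from the text), one can build nonnegative definite \(A_1\), \(A_2\) and a symmetric \(A\) with \(A(A_1+A_2)=0\), hide the block structure with an orthogonal change of basis, and verify that \(AA_1\) and \(AA_2\) vanish separately:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Nonnegative definite A1, A2 supported on the first two coordinates,
# and a symmetric A supported on the last two (so A(A1+A2) = 0).
v1 = np.zeros(n); v1[0] = 1.0
v2 = np.zeros(n); v2[1] = 1.0
A1 = np.outer(v1, v1)
A2 = 2.0 * np.outer(v2, v2)
A = np.diag([0.0, 0.0, 1.0, 3.0])

# Rotate everything by a random orthogonal matrix P so the block
# structure is no longer visible in the original coordinates.
P, _ = np.linalg.qr(rng.standard_normal((n, n)))
A1, A2, A = (P @ M @ P.T for M in (A1, A2, A))

# Craig's criterion: Q independent of Q1+Q2  <=>  A(A1+A2) = 0.
assert np.allclose(A @ (A1 + A2), 0)
# Because A1 and A2 are nonnegative definite, this forces AA1 = 0
# and AA2 = 0 separately, i.e. Q is independent of each of Q1, Q2.
assert np.allclose(A @ A1, 0)
assert np.allclose(A @ A2, 0)
print("A(A1+A2)=0 implies AA1=0 and AA2=0: verified")
```

The check exercises only the matrix identity, not the distributional statement, but it is exactly the identity on which the independence argument rests.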


