
Let \(Q_{1}\) and \(Q_{2}\) be two nonnegative quadratic forms in the observations of a random sample from a distribution which is \(N\left(0, \sigma^{2}\right)\). Show that another quadratic form \(Q\) is independent of \(Q_{1}+Q_{2}\) if and only if \(Q\) is independent of each of \(Q_{1}\) and \(Q_{2}\). Hint: Consider the orthogonal transformation that diagonalizes the matrix of \(Q_{1}+Q_{2}\). After this transformation, what are the forms of the matrices of \(Q\), \(Q_{1}\), and \(Q_{2}\) if \(Q\) and \(Q_{1}+Q_{2}\) are independent?

Short Answer

The quadratic form \(Q\) is independent of \(Q_{1}+Q_{2}\) if and only if \(Q\) is independent of each of \(Q_{1}\) and \(Q_{2}\). The "if" direction follows at once from the product criterion for the matrices of independent quadratic forms. For the "only if" direction, the orthogonal transformation that diagonalizes the matrix of \(Q_{1}+Q_{2}\) splits the new variables into two independent groups: independence of \(Q\) from \(Q_{1}+Q_{2}\) forces \(Q\) to involve only one group, while nonnegativity forces \(Q_{1}\) and \(Q_{2}\) to involve only the other, so \(Q\) is independent of each.

Step by step solution

01

Set up the independence criterion for quadratic forms

Write \(Q=\boldsymbol{X}' \boldsymbol{A} \boldsymbol{X}\), \(Q_{1}=\boldsymbol{X}' \boldsymbol{A}_{1} \boldsymbol{X}\), and \(Q_{2}=\boldsymbol{X}' \boldsymbol{A}_{2} \boldsymbol{X}\), where \(\boldsymbol{X}\) is the vector of observations, so that \(\boldsymbol{X}\) is \(N\left(\boldsymbol{0}, \sigma^{2} \boldsymbol{I}\right)\) and \(\boldsymbol{A}_{1}\), \(\boldsymbol{A}_{2}\) are nonnegative definite. For such \(\boldsymbol{X}\), two quadratic forms \(\boldsymbol{X}'\boldsymbol{A}\boldsymbol{X}\) and \(\boldsymbol{X}'\boldsymbol{B}\boldsymbol{X}\) are independent if and only if \(\boldsymbol{A}\boldsymbol{B}=\boldsymbol{0}\) (Craig's theorem). One direction of the exercise is then immediate: if \(Q\) is independent of each of \(Q_{1}\) and \(Q_{2}\), then \(\boldsymbol{A}\boldsymbol{A}_{1}=\boldsymbol{A}\boldsymbol{A}_{2}=\boldsymbol{0}\), hence \(\boldsymbol{A}\left(\boldsymbol{A}_{1}+\boldsymbol{A}_{2}\right)=\boldsymbol{0}\) and \(Q\) is independent of \(Q_{1}+Q_{2}\). The remaining steps prove the converse.
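As a numerical sanity check of the product criterion (not part of the textbook proof; the matrices below are made up purely for illustration), one can verify in a hand-built example that a matrix annihilating \(\boldsymbol{A}_{1}+\boldsymbol{A}_{2}\) here also annihilates \(\boldsymbol{A}_{1}\) and \(\boldsymbol{A}_{2}\) separately:

```python
import numpy as np

# Hand-built 4x4 example (illustrative only): A1, A2 are nonnegative
# definite and act on the first two coordinates; A acts on the last two.
A1 = np.array([[1., 1., 0., 0.],
               [1., 1., 0., 0.],
               [0., 0., 0., 0.],
               [0., 0., 0., 0.]])
A2 = np.array([[ 1., -1., 0., 0.],
               [-1.,  1., 0., 0.],
               [ 0.,  0., 0., 0.],
               [ 0.,  0., 0., 0.]])
A = np.diag([0., 0., 1., 2.])

# A1 and A2 are nonnegative definite: all eigenvalues >= 0.
assert np.all(np.linalg.eigvalsh(A1) >= -1e-12)
assert np.all(np.linalg.eigvalsh(A2) >= -1e-12)

# Q independent of Q1 + Q2 corresponds to A(A1 + A2) = 0, and the
# product with each summand vanishes too, matching the exercise's claim.
print(np.allclose(A @ (A1 + A2), 0))                   # True
print(np.allclose(A @ A1, 0), np.allclose(A @ A2, 0))  # True True
```

This only illustrates the statement in one example; the proof that nonnegativity forces this in general is the content of the next two steps.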
02

Diagonalization using an orthogonal transformation

Since \(\boldsymbol{A}_{1}+\boldsymbol{A}_{2}\) is symmetric and nonnegative definite, there is an orthogonal matrix \(\boldsymbol{\Gamma}\) with \(\boldsymbol{\Gamma}'\left(\boldsymbol{A}_{1}+\boldsymbol{A}_{2}\right)\boldsymbol{\Gamma}=\operatorname{diag}\left(d_{1}, \ldots, d_{r}, 0, \ldots, 0\right)\), where \(d_{1}, \ldots, d_{r}>0\). Set \(\boldsymbol{W}=\boldsymbol{\Gamma}'\boldsymbol{X}\); then \(\boldsymbol{W}\) is again \(N\left(\boldsymbol{0}, \sigma^{2} \boldsymbol{I}\right)\), so \(W_{1}, \ldots, W_{n}\) are mutually independent. Now suppose \(Q\) is independent of \(Q_{1}+Q_{2}\). By Craig's theorem, \(\boldsymbol{\Gamma}'\boldsymbol{A}\boldsymbol{\Gamma} \cdot \boldsymbol{\Gamma}'\left(\boldsymbol{A}_{1}+\boldsymbol{A}_{2}\right)\boldsymbol{\Gamma}=\boldsymbol{0}\), which forces the first \(r\) rows, and by symmetry the first \(r\) columns, of \(\boldsymbol{\Gamma}'\boldsymbol{A}\boldsymbol{\Gamma}\) to vanish. In the new variables, therefore, \(Q\) involves only \(W_{r+1}, \ldots, W_{n}\).
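In block form these matrices can be sketched as follows (with \(\Gamma\), \(r\), and the \(d_{i}\) as in this step, and \(B_{22}\) denoting whatever symmetric block remains):

```latex
\begin{align*}
\Gamma'(A_1 + A_2)\Gamma
  &= \begin{bmatrix} D & 0 \\ 0 & 0 \end{bmatrix},
  \qquad D = \operatorname{diag}(d_1, \ldots, d_r),\ d_i > 0, \\[4pt]
\Gamma' A \Gamma \begin{bmatrix} D & 0 \\ 0 & 0 \end{bmatrix} = 0
  &\;\Longrightarrow\;
  \Gamma' A \Gamma = \begin{bmatrix} 0 & 0 \\ 0 & B_{22} \end{bmatrix}.
\end{align*}
```

The implication holds because \(D\) is invertible: multiplying the upper-left and lower-left blocks of \(\Gamma' A \Gamma\) by \(D\) must give zero, and symmetry then clears the upper-right block as well.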
03

Conclude independence from the transformed forms

In the new variables, \(Q_{1}+Q_{2}=\sum_{i=1}^{r} d_{i} W_{i}^{2}\). The diagonal entries of \(\boldsymbol{\Gamma}'\boldsymbol{A}_{1}\boldsymbol{\Gamma}\) and \(\boldsymbol{\Gamma}'\boldsymbol{A}_{2}\boldsymbol{\Gamma}\) are nonnegative (diagonal entries of nonnegative definite matrices), and in positions \(r+1, \ldots, n\) they sum to zero; hence each is zero there. A nonnegative definite matrix with a zero diagonal entry has the entire corresponding row and column equal to zero, so \(Q_{1}\) and \(Q_{2}\) involve only \(W_{1}, \ldots, W_{r}\), while by Step 2 \(Q\) involves only \(W_{r+1}, \ldots, W_{n}\). Since the \(W_{i}\) are mutually independent, \(Q\) is independent of each of \(Q_{1}\) and \(Q_{2}\). Together with the easy direction from Step 1, this proves the equivalence.
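The diagonalization argument can also be checked numerically. The example below is made up (random orthonormal vectors \(u_1, \ldots, u_4\); it is not the textbook's construction): \(Q_1\) and \(Q_2\) are supported on \(\operatorname{span}\{u_1, u_2\}\) and \(Q\) on the orthogonal complement, so \(Q\) is independent of \(Q_1+Q_2\), and the transformed matrices show exactly the block pattern derived above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random orthonormal basis of R^4 (hypothetical example data).
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
u1, u2, u3, u4 = U.T

# Nonnegative definite Q1, Q2 supported on span{u1, u2} ...
A1 = np.outer(u1, u1) + np.outer(u2, u2)
A2 = 3 * np.outer(u2, u2)
# ... and Q supported on the orthogonal complement span{u3, u4}.
A = np.outer(u3, u3) + 2 * np.outer(u4, u4)

# Q is independent of Q1 + Q2: A(A1 + A2) = 0.
S = A1 + A2
assert np.allclose(A @ S, 0)

# Diagonalize S; eigenvectors with positive eigenvalue play the role
# of the "first r coordinates" in the text (eigh sorts ascending).
w, G = np.linalg.eigh(S)
pos = w > 1e-10                      # here r = 2 positive eigenvalues
B, B1, B2 = (G.T @ M @ G for M in (A, A1, A2))

# Transformed A1, A2 live entirely in the positive-eigenvalue block,
# while transformed A vanishes there -> Q indep. of Q1 and of Q2.
assert np.allclose(B[pos], 0) and np.allclose(B[:, pos], 0)
assert np.allclose(B1[~pos], 0) and np.allclose(B2[~pos], 0)
print(np.allclose(A @ A1, 0), np.allclose(A @ A2, 0))  # True True
```

Because the basis is random, \(A_1+A_2\) is not diagonal to begin with; the block structure only appears after conjugating by the eigenvector matrix, which is precisely what the hint's orthogonal transformation does.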


Most popular questions from this chapter

Show that the square of a noncentral \(T\) random variable is a noncentral \(F\) random variable.

Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample of size \(n\) from a distribution which is \(N\left(0, \sigma^{2}\right)\). Prove that \(\sum_{1}^{n} X_{i}^{2}\) and every quadratic form that is not identically zero in \(X_{1}, X_{2}, \ldots, X_{n}\) are dependent.

Let \(Q=X_{1} X_{2}-X_{3} X_{4}\), where \(X_{1}, X_{2}, X_{3}, X_{4}\) is a random sample of size 4 from a distribution which is \(N\left(0, \sigma^{2}\right) .\) Show that \(Q / \sigma^{2}\) does not have a chi-square distribution. Find the mgf of \(Q / \sigma^{2}\).

Let the \(4 \times 1\) matrix \(\boldsymbol{Y}\) be multivariate normal \(N\left(\boldsymbol{X} \boldsymbol{\beta}, \sigma^{2} \boldsymbol{I}\right)\), where the \(4 \times 3\) matrix \(\boldsymbol{X}\) equals $$ \boldsymbol{X}=\left[\begin{array}{rrr} 1 & 1 & 2 \\ 1 & -1 & 2 \\ 1 & 0 & -3 \\ 1 & 0 & -1 \end{array}\right] $$ and \(\boldsymbol{\beta}\) is the \(3 \times 1\) regression coefficient matrix. (a) Find the mean matrix and the covariance matrix of \(\hat{\boldsymbol{\beta}}=\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime} \boldsymbol{Y}\). (b) If we observe \(\boldsymbol{Y}^{\prime}\) to be equal to \((6,1,11,3)\), compute \(\hat{\boldsymbol{\beta}}\).

Let the independent random variables \(Y_{1}, \ldots, Y_{n}\) have the joint pdf $$ L\left(\alpha, \beta, \sigma^{2}\right)=\left(\frac{1}{2 \pi \sigma^{2}}\right)^{n / 2} \exp \left\{-\frac{1}{2 \sigma^{2}} \sum_{1}^{n}\left[y_{i}-\alpha-\beta\left(x_{i}-\bar{x}\right)\right]^{2}\right\} $$ where the given numbers \(x_{1}, x_{2}, \ldots, x_{n}\) are not all equal. Let \(H_{0}: \beta=0\) (\(\alpha\) and \(\sigma^{2}\) unspecified). It is desired to use a likelihood ratio test to test \(H_{0}\) against all possible alternatives. Find \(\Lambda\) and see whether the test can be based on a familiar statistic. Hint: In the notation of this section show that $$ \sum_{1}^{n}\left(Y_{i}-\hat{\alpha}\right)^{2}=Q_{3}+\widehat{\beta}^{2} \sum_{1}^{n}\left(x_{i}-\bar{x}\right)^{2} $$
