
Readers may have encountered the multiple regression model in a previous course in statistics. We can write it briefly as follows. Suppose we have a vector of \(n\) observations \(\mathbf{Y}\) with distribution \(N_{n}\left(\mathbf{X} \boldsymbol{\beta}, \sigma^{2} \mathbf{I}\right)\), where \(\mathbf{X}\) is an \(n \times p\) matrix of known values with full column rank \(p\), and \(\boldsymbol{\beta}\) is a \(p \times 1\) vector of unknown parameters. The least squares estimator of \(\boldsymbol{\beta}\) is $$ \widehat{\boldsymbol{\beta}}=\left(\mathbf{X}^{\prime} \mathbf{X}\right)^{-1} \mathbf{X}^{\prime} \mathbf{Y} $$

(a) Determine the distribution of \(\widehat{\boldsymbol{\beta}}\).

(b) Let \(\widehat{\mathbf{Y}}=\mathbf{X} \widehat{\boldsymbol{\beta}}\). Determine the distribution of \(\widehat{\mathbf{Y}}\).

(c) Let \(\widehat{\mathbf{e}}=\mathbf{Y}-\widehat{\mathbf{Y}}\). Determine the distribution of \(\widehat{\mathbf{e}}\).

(d) By writing the random vector \(\left(\widehat{\mathbf{Y}}^{\prime}, \widehat{\mathbf{e}}^{\prime}\right)^{\prime}\) as a linear function of \(\mathbf{Y}\), show that the random vectors \(\widehat{\mathbf{Y}}\) and \(\widehat{\mathbf{e}}\) are independent.

(e) Show that \(\widehat{\boldsymbol{\beta}}\) solves the least squares problem; that is, $$ \|\mathbf{Y}-\mathbf{X} \widehat{\boldsymbol{\beta}}\|^{2}=\min _{\mathbf{b} \in R^{p}}\|\mathbf{Y}-\mathbf{X} \mathbf{b}\|^{2} $$

Short Answer

The distribution of \(\widehat{\boldsymbol{\beta}}\) is \(N (\boldsymbol{\beta}, \sigma^2(\mathbf{X}^\prime \mathbf{X})^{-1})\), that of \(\widehat{\mathbf{Y}}\) is \(N (\mathbf{X}\boldsymbol{\beta}, \sigma^2\mathbf{X}(\mathbf{X}^\prime \mathbf{X})^{-1}\mathbf{X}^\prime)\), and that of \(\widehat{\mathbf{e}}\) is \(N(\mathbf{0}, \sigma^2(\mathbf{I}-\mathbf{X}(\mathbf{X}^\prime \mathbf{X})^{-1}\mathbf{X}^\prime))\). The vectors \(\widehat{\mathbf{Y}}\) and \(\widehat{\mathbf{e}}\) are independent. Furthermore, \(\widehat{\boldsymbol{\beta}}\) solves the least squares problem.

Step by step solution

01

Determine the Distribution of \(\widehat{\boldsymbol{\beta}}\)

Since \(\mathbf{Y}\) has the distribution \(N_{n}\left(\mathbf{X} \boldsymbol{\beta}, \sigma^{2} \mathbf{I}\right)\), we can write \(\widehat{\boldsymbol{\beta}}=\left(\mathbf{X}^{\prime} \mathbf{X}\right)^{-1}\mathbf{X}^{\prime} \mathbf{Y}\), a deterministic linear transformation of \(\mathbf{Y}\). A linear transformation of a normal vector is again normal, so computing the mean and covariance of \(\widehat{\boldsymbol{\beta}}\) gives its distribution as \(N (\boldsymbol{\beta}, \sigma^2(\mathbf{X}^\prime \mathbf{X})^{-1})\).
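The mean and covariance follow from the standard rules \(E[\mathbf{A}\mathbf{Y}] = \mathbf{A}E[\mathbf{Y}]\) and \(\operatorname{Var}(\mathbf{A}\mathbf{Y}) = \mathbf{A}\operatorname{Var}(\mathbf{Y})\mathbf{A}^{\prime}\); a brief derivation:

```latex
% Write \widehat{\boldsymbol{\beta}} = \mathbf{A}\mathbf{Y}
% with \mathbf{A} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'.
\begin{align*}
E[\widehat{\boldsymbol{\beta}}]
  &= (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\,E[\mathbf{Y}]
   = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{X}\boldsymbol{\beta}
   = \boldsymbol{\beta},\\
\operatorname{Var}(\widehat{\boldsymbol{\beta}})
  &= \mathbf{A}\,(\sigma^{2}\mathbf{I})\,\mathbf{A}'
   = \sigma^{2}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}
   = \sigma^{2}(\mathbf{X}'\mathbf{X})^{-1}.
\end{align*}
```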
02

Determine the Distribution of \(\widehat{\mathbf{Y}}\)

Given \(\widehat{\mathbf{Y}}=\mathbf{X} \widehat{\boldsymbol{\beta}}\), we insert \(\widehat{\boldsymbol{\beta}}\) into the equation and apply the rules for the mean and covariance of a linear transformation. Because \(\mathbf{X}\) is non-random, \(\widehat{\mathbf{Y}}\) is a linear transformation of a normally distributed vector, and its distribution is \(N (\mathbf{X}\boldsymbol{\beta}, \sigma^2\mathbf{X}(\mathbf{X}^\prime \mathbf{X})^{-1}\mathbf{X}^\prime)\).
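The computation is cleanest in terms of the hat matrix \(\mathbf{H} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\), which is symmetric (\(\mathbf{H}' = \mathbf{H}\)) and idempotent (\(\mathbf{H}^{2} = \mathbf{H}\)):

```latex
% \widehat{\mathbf{Y}} = \mathbf{X}\widehat{\boldsymbol{\beta}} = \mathbf{H}\mathbf{Y}.
\begin{align*}
E[\widehat{\mathbf{Y}}]
  &= \mathbf{H}\,E[\mathbf{Y}]
   = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{X}\boldsymbol{\beta}
   = \mathbf{X}\boldsymbol{\beta},\\
\operatorname{Var}(\widehat{\mathbf{Y}})
  &= \mathbf{H}(\sigma^{2}\mathbf{I})\mathbf{H}'
   = \sigma^{2}\mathbf{H}\mathbf{H}
   = \sigma^{2}\mathbf{H}
   = \sigma^{2}\mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'.
\end{align*}
```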
03

Determine the Distribution of \(\widehat{\mathbf{e}}\)

We have \(\widehat{\mathbf{e}}=\mathbf{Y}-\widehat{\mathbf{Y}}=(\mathbf{I}-\mathbf{X}(\mathbf{X}^\prime \mathbf{X})^{-1}\mathbf{X}^\prime)\mathbf{Y}\), again a linear transformation of \(\mathbf{Y}\). Following the same steps as before, \(\widehat{\mathbf{e}}\) is distributed as \(N(\mathbf{0}, \sigma^2(\mathbf{I}-\mathbf{X}(\mathbf{X}^\prime \mathbf{X})^{-1}\mathbf{X}^\prime))\).
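With \(\mathbf{H} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\) as above, the complement \(\mathbf{I}-\mathbf{H}\) is also symmetric and idempotent, and the mean and covariance follow directly:

```latex
% \widehat{\mathbf{e}} = \mathbf{Y} - \widehat{\mathbf{Y}} = (\mathbf{I}-\mathbf{H})\mathbf{Y}.
\begin{align*}
E[\widehat{\mathbf{e}}]
  &= (\mathbf{I}-\mathbf{H})\mathbf{X}\boldsymbol{\beta}
   = \mathbf{X}\boldsymbol{\beta} - \mathbf{X}\boldsymbol{\beta}
   = \mathbf{0},\\
\operatorname{Var}(\widehat{\mathbf{e}})
  &= (\mathbf{I}-\mathbf{H})(\sigma^{2}\mathbf{I})(\mathbf{I}-\mathbf{H})'
   = \sigma^{2}(\mathbf{I}-\mathbf{H})^{2}
   = \sigma^{2}(\mathbf{I}-\mathbf{H}).
\end{align*}
```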
04

Show Independence of \(\hat{\mathbf{Y}}\) and \(\widehat{\mathbf{e}}\)

First, we express \(\widehat{\mathbf{Y}}\) and \(\widehat{\mathbf{e}}\) as linear functions of \(\mathbf{Y}\) and stack them into a single vector; since that stacked vector is a linear function of the normal vector \(\mathbf{Y}\), the pair is jointly normal. We then compute the covariance between the two. After simplification the covariance is zero, which for jointly normal vectors implies independence.
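Writing both in terms of the hat matrix \(\mathbf{H} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\) and using \(\mathbf{H}^{2} = \mathbf{H}\):

```latex
% The stacked vector is one linear function of \mathbf{Y}, hence jointly normal:
\begin{pmatrix}\widehat{\mathbf{Y}}\\ \widehat{\mathbf{e}}\end{pmatrix}
  = \begin{pmatrix}\mathbf{H}\\ \mathbf{I}-\mathbf{H}\end{pmatrix}\mathbf{Y},
\qquad
\operatorname{Cov}(\widehat{\mathbf{Y}},\widehat{\mathbf{e}})
  = \mathbf{H}\,(\sigma^{2}\mathbf{I})\,(\mathbf{I}-\mathbf{H})'
  = \sigma^{2}(\mathbf{H}-\mathbf{H}^{2})
  = \mathbf{0}.
```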
05

Show \(\widehat{\beta}\) Solves the Least Squares Problem

The least squares problem aims to minimize the residual sum of squares (RSS), \(\|\mathbf{Y}-\mathbf{X} \mathbf{b}\|^{2}\). Differentiating with respect to \(\mathbf{b}\) gives \(\partial \mathrm{RSS}/\partial \mathbf{b} = -2\mathbf{X}^{\prime}(\mathbf{Y}-\mathbf{X}\mathbf{b})\). Setting this to zero yields the normal equations \(\mathbf{X}^{\prime}\mathbf{X}\mathbf{b} = \mathbf{X}^{\prime}\mathbf{Y}\); because \(\mathbf{X}\) has full column rank, \(\mathbf{X}^{\prime}\mathbf{X}\) is invertible, so the unique solution is \(\mathbf{b} = \widehat{\boldsymbol{\beta}}\). Since the RSS is a convex quadratic in \(\mathbf{b}\), this stationary point is the global minimum, proving \(\widehat{\boldsymbol{\beta}}\) solves the least squares problem.
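As a numerical sanity check (not part of the textbook solution), a small NumPy simulation with hypothetical data confirms that the closed-form estimator \((\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y}\) matches a library least-squares solver and is not beaten by perturbed coefficient vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Small simulated design (n = 50 observations, p = 3 predictors) --
# illustrative values chosen for this sketch, not from the exercise.
n, p = 50, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
Y = X @ beta + rng.normal(scale=0.3, size=n)

# Closed-form least squares estimator: solve X'X b = X'Y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

def rss(b):
    """Residual sum of squares ||Y - Xb||^2."""
    r = Y - X @ b
    return r @ r

# Agrees with NumPy's least squares solver ...
assert np.allclose(beta_hat, np.linalg.lstsq(X, Y, rcond=None)[0])

# ... and no randomly perturbed coefficient vector achieves a smaller RSS.
assert all(rss(beta_hat) <= rss(beta_hat + rng.normal(size=p))
           for _ in range(1000))
print("beta_hat minimizes the RSS on this sample")
```

The perturbation check is only a spot test, of course; the derivative argument above is what establishes the minimum in general.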
