Chapter 9: Problem 12
Fit \(y=a+x\) to the data $$ \begin{array}{l|lll} \mathrm{x} & 0 & 1 & 2 \\ \hline \mathrm{y} & 1 & 3 & 4 \end{array} $$ by the method of least squares.
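The worked solution is not reproduced on this page, but the computation is short. A sketch: only the intercept \(a\) is free (the slope is fixed at 1), so minimize $$ S(a)=\sum_{i=1}^{3}\left(y_{i}-a-x_{i}\right)^{2}, \qquad S^{\prime}(a)=-2 \sum_{i=1}^{3}\left(y_{i}-a-x_{i}\right)=0, $$ which gives \(\hat{a}=\bar{y}-\bar{x}=\frac{8}{3}-1=\frac{5}{3}\), so the least-squares fit is \(y=\frac{5}{3}+x\).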
Other exercises from this chapter:

The driver of a diesel-powered automobile decided to test the quality of three types of diesel fuel sold in the area based on mpg. Test the null hypothesis that the three means are equal using the following data. Make the usual assumptions and take \(\alpha=0.05\). $$ \begin{array}{llllll} \text { Brand A: } & 38.7 & 39.2 & 40.1 & 38.9 & \\ \text { Brand B: } & 41.9 & 42.3 & 41.3 & & \\ \text { Brand C: } & 40.8 & 41.2 & 39.5 & 38.9 & 40.3 \end{array} $$
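The following Python sketch (not part of the original exercise) shows one way to carry out the one-way ANOVA \(F\)-test for this problem; the array and variable names are illustrative.

```python
# Sketch of the one-way ANOVA F-test for H0: mu_A = mu_B = mu_C at alpha = 0.05.
import numpy as np
from scipy import stats

brands = {
    "A": np.array([38.7, 39.2, 40.1, 38.9]),
    "B": np.array([41.9, 42.3, 41.3]),
    "C": np.array([40.8, 41.2, 39.5, 38.9, 40.3]),
}

samples = list(brands.values())
n_total = sum(len(s) for s in samples)    # 12 observations in all
k = len(samples)                          # 3 brands
grand_mean = np.concatenate(samples).mean()

# Between-brands and within-brands sums of squares
ss_between = sum(len(s) * (s.mean() - grand_mean) ** 2 for s in samples)
ss_within = sum(((s - s.mean()) ** 2).sum() for s in samples)

f_stat = (ss_between / (k - 1)) / (ss_within / (n_total - k))
f_crit = stats.f.ppf(0.95, k - 1, n_total - k)   # F_{0.05}(2, 9)

print(f"F = {f_stat:.2f}, critical value = {f_crit:.2f}")
print("Reject H0" if f_stat > f_crit else "Fail to reject H0")

# scipy's built-in one-way ANOVA gives the same F statistic:
print(stats.f_oneway(*samples))
```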
Let \(A\) be the real symmetric matrix of a quadratic form \(Q\) in the observations of a random sample of size \(n\) from a distribution which is \(N\left(0, \sigma^{2}\right)\). Given that \(Q\) and the mean \(\bar{X}\) of the sample are independent, what can be said of the elements of each row (column) of \(\boldsymbol{A}\)? Hint: Are \(Q\) and \(\bar{X}^{2}\) independent?
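A sketch of the reasoning, using the theorem that two quadratic forms in the observations of a \(N\left(0, \sigma^{2}\right)\) sample are independent if and only if the product of their matrices is the zero matrix: since $$ n \bar{X}^{2}=\boldsymbol{X}^{\prime} \boldsymbol{B} \boldsymbol{X}, \qquad \boldsymbol{B}=\frac{1}{n} \mathbf{1} \mathbf{1}^{\prime}, $$ independence of \(Q\) and \(\bar{X}\) (hence of \(Q\) and \(\bar{X}^{2}\)) forces \(\boldsymbol{A} \boldsymbol{B}=\mathbf{0}\), that is, \((\boldsymbol{A} \mathbf{1}) \mathbf{1}^{\prime}=\mathbf{0}\), so \(\boldsymbol{A} \mathbf{1}=\mathbf{0}\): every row of \(\boldsymbol{A}\) sums to zero, and by symmetry so does every column.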
Let \(X_{1}\) and \(X_{2}\) be two independent random variables. Let \(X_{1}\) and \(Y=X_{1}+X_{2}\) be \(\chi^{2}\left(r_{1}, \theta_{1}\right)\) and \(\chi^{2}(r, \theta)\), respectively. Here \(r_{1}<r\) and \(\theta_{1} \leq \theta\). Show that \(X_{2}\) is \(\chi^{2}\left(r-r_{1}, \theta-\theta_{1}\right)\).
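A sketch, assuming the usual form of the noncentral chi-square mgf, \(M(t)=(1-2 t)^{-r / 2} \exp [\theta t /(1-2 t)]\) for \(t<\frac{1}{2}\): by independence \(M_{Y}(t)=M_{X_{1}}(t) M_{X_{2}}(t)\), so $$ M_{X_{2}}(t)=\frac{M_{Y}(t)}{M_{X_{1}}(t)}=(1-2 t)^{-\left(r-r_{1}\right) / 2} \exp \left[\frac{\left(\theta-\theta_{1}\right) t}{1-2 t}\right], $$ which is the mgf of a \(\chi^{2}\left(r-r_{1}, \theta-\theta_{1}\right)\) distribution.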
Let \(\boldsymbol{A}_{1}, \boldsymbol{A}_{2}, \ldots, \boldsymbol{A}_{k}\) be the matrices of \(k>2\) quadratic forms \(Q_{1}, Q_{2}, \ldots, Q_{k}\) in the observations of a random sample of size \(n\) from a distribution which is \(N\left(0, \sigma^{2}\right)\). Prove that the pairwise independence of these forms implies that they are mutually independent. Hint: Show that \(\boldsymbol{A}_{i} \boldsymbol{A}_{j}=\mathbf{0}, i \neq j\), permits \(E\left[\exp \left(t_{1} Q_{1}+t_{2} Q_{2}+\cdots+t_{k} Q_{k}\right)\right]\) to be written as a product of the mgfs of \(Q_{1}, Q_{2}, \ldots, Q_{k}\).
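A sketch of the key step, for \(\boldsymbol{X} \sim N\left(\mathbf{0}, \sigma^{2} \boldsymbol{I}\right)\): pairwise independence gives \(\boldsymbol{A}_{i} \boldsymbol{A}_{j}=\mathbf{0}\) for \(i \neq j\), so every cross product in the expansion of the following matrix product vanishes, $$ \prod_{i=1}^{k}\left(\boldsymbol{I}-2 \sigma^{2} t_{i} \boldsymbol{A}_{i}\right)=\boldsymbol{I}-2 \sigma^{2} \sum_{i=1}^{k} t_{i} \boldsymbol{A}_{i} . $$ Taking determinants and using the normal-theory mgf \(E\left[\exp \left(\sum t_{i} Q_{i}\right)\right]=\left|\boldsymbol{I}-2 \sigma^{2} \sum t_{i} \boldsymbol{A}_{i}\right|^{-1 / 2}\) (for \(t_{i}\) near 0) then yields $$ E\left[\exp \left(\sum_{i=1}^{k} t_{i} Q_{i}\right)\right]=\prod_{i=1}^{k}\left|\boldsymbol{I}-2 \sigma^{2} t_{i} \boldsymbol{A}_{i}\right|^{-1 / 2}=\prod_{i=1}^{k} E\left[\exp \left(t_{i} Q_{i}\right)\right], $$ which is the required factorization.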
Show that \(\sum_{j=1}^{b} \sum_{i=1}^{a}\left(X_{i j}-\bar{X}_{i .}\right)^{2}=\sum_{j=1}^{b} \sum_{i=1}^{a}\left(X_{i j}-\bar{X}_{i .}-\bar{X}_{. j}+\bar{X}_{. .}\right)^{2}+a \sum_{j=1}^{b}\left(\bar{X}_{. j}-\bar{X}_{. .}\right)^{2} .\)
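A sketch of the standard argument: write \(X_{i j}-\bar{X}_{i .}=\left(X_{i j}-\bar{X}_{i .}-\bar{X}_{. j}+\bar{X}_{. .}\right)+\left(\bar{X}_{. j}-\bar{X}_{. .}\right)\), square, and sum over \(i\) and \(j\). The cross term is $$ 2 \sum_{j=1}^{b}\left(\bar{X}_{. j}-\bar{X}_{. .}\right) \sum_{i=1}^{a}\left(X_{i j}-\bar{X}_{i .}-\bar{X}_{. j}+\bar{X}_{. .}\right)=0, $$ because the inner sum equals \(a \bar{X}_{. j}-a \bar{X}_{. .}-a \bar{X}_{. j}+a \bar{X}_{. .}=0\); the remaining squared term \(\left(\bar{X}_{. j}-\bar{X}_{. .}\right)^{2}\) does not depend on \(i\), so summing over \(i\) contributes the factor \(a\).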