Chapter 9: Problem 12
Fit \(y=a+x\) to the data $$ \begin{array}{l|lll} \mathrm{x} & 0 & 1 & 2 \\ \hline \mathrm{y} & 1 & 3 & 4 \end{array} $$ by the method of least squares.
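Since the slope is fixed at 1, only the intercept \(a\) is free, and minimizing \(\sum_i (y_i - a - x_i)^2\) gives \(\hat{a} = \overline{y - x}\). A minimal numerical sketch (variable names are illustrative):

```python
import numpy as np

# Least-squares fit of y = a + x: only the intercept a is free,
# so minimize S(a) = sum_i (y_i - a - x_i)^2.
# Setting dS/da = 0 gives a_hat = mean(y - x).
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 4.0])

a_hat = np.mean(y - x)  # (1 + 2 + 2)/3 = 5/3
print(a_hat)
```

The fitted line is \(y = \frac{5}{3} + x\).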
Let \(X_{1}, X_{2}, X_{3}\) be a random sample from the normal distribution \(N\left(0, \sigma^{2}\right)\). Are the quadratic forms \(X_{1}^{2}+3 X_{1} X_{2}+X_{2}^{2}+X_{1} X_{3}+X_{3}^{2}\) and \(X_{1}^{2}-2 X_{1} X_{2}+\frac{2}{3} X_{2}^{2}-2 X_{1} X_{3}-X_{3}^{2}\) independent or dependent?
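For quadratic forms \(\mathbf{X}'A\mathbf{X}\) and \(\mathbf{X}'B\mathbf{X}\) in \(N(0, \sigma^2 I)\) variables, Craig's theorem says the forms are independent if and only if \(AB = 0\). A sketch of the check, assuming the second form reads \(X_{1}^{2}-2 X_{1} X_{2}+\frac{2}{3} X_{2}^{2}-2 X_{1} X_{3}-X_{3}^{2}\) (the printed text is garbled at this point, so this reading is an assumption):

```python
import numpy as np

# Symmetric matrix of Q1 = x1^2 + 3 x1 x2 + x2^2 + x1 x3 + x3^2:
# off-diagonal entries are half the cross-term coefficients.
A = np.array([[1.0, 1.5, 0.5],
              [1.5, 1.0, 0.0],
              [0.5, 0.0, 1.0]])

# Symmetric matrix of Q2 = x1^2 - 2 x1 x2 + (2/3) x2^2 - 2 x1 x3 - x3^2
# (assumed reading; see lead-in).
B = np.array([[1.0, -1.0, -1.0],
              [-1.0, 2.0 / 3.0, 0.0],
              [-1.0, 0.0, -1.0]])

# Craig's theorem: the forms are independent iff A @ B is the zero matrix.
print(np.allclose(A @ B, 0))  # False -> the forms are dependent
```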
Let \(X_{ijk}\), \(i=1, \ldots, a\); \(j=1, \ldots, b\); \(k=1, \ldots, c\), be a random sample of size \(n=abc\) from a normal distribution \(N\left(\mu, \sigma^{2}\right)\). Let \(\bar{X}_{\cdots}=\sum_{k=1}^{c} \sum_{j=1}^{b} \sum_{i=1}^{a} X_{ijk} / n\) and \(\bar{X}_{i \cdot \cdot}=\sum_{k=1}^{c} \sum_{j=1}^{b} X_{ijk} / bc\). Prove that $$ \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{c}\left(X_{ijk}-\bar{X}_{\cdots}\right)^{2}=\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{c}\left(X_{ijk}-\bar{X}_{i \cdot \cdot}\right)^{2}+bc \sum_{i=1}^{a}\left(\bar{X}_{i \cdot \cdot}-\bar{X}_{\cdots}\right)^{2} $$ Show that \(\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{c}\left(X_{ijk}-\bar{X}_{i \cdot \cdot}\right)^{2} / \sigma^{2}\) has a chi-square distribution with \(a(bc-1)\) degrees of freedom. Prove that the two terms in the right-hand member are independent. What, then, is the distribution of \(bc \sum_{i=1}^{a}\left(\bar{X}_{i \cdot \cdot}-\bar{X}_{\cdots}\right)^{2} / \sigma^{2}\)? Furthermore, let \(\bar{X}_{\cdot j \cdot}=\sum_{k=1}^{c} \sum_{i=1}^{a} X_{ijk} / ac\) and \(\bar{X}_{ij \cdot}=\sum_{k=1}^{c} X_{ijk} / c\). Show that $$ \begin{aligned} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{c}\left(X_{ijk}-\bar{X}_{\cdots}\right)^{2}=& \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{c}\left(X_{ijk}-\bar{X}_{ij \cdot}\right)^{2} \\ &+bc \sum_{i=1}^{a}\left(\bar{X}_{i \cdot \cdot}-\bar{X}_{\cdots}\right)^{2}+ac \sum_{j=1}^{b}\left(\bar{X}_{\cdot j \cdot}-\bar{X}_{\cdots}\right)^{2} \\ &+c \sum_{i=1}^{a} \sum_{j=1}^{b}\left(\bar{X}_{ij \cdot}-\bar{X}_{i \cdot \cdot}-\bar{X}_{\cdot j \cdot}+\bar{X}_{\cdots}\right)^{2} \end{aligned} $$ Prove that the four terms in the right-hand member, when divided by \(\sigma^{2}\), are independent chi-square variables with \(ab(c-1)\), \(a-1\), \(b-1\), and \((a-1)(b-1)\) degrees of freedom, respectively.
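Both sum-of-squares decompositions are algebraic identities, so they can be checked numerically on arbitrary data before attempting the distributional argument. A sketch (array shape and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c = 3, 4, 5
X = rng.normal(size=(a, b, c))  # X[i, j, k]

xbar = X.mean()               # grand mean, X-bar_{...}
xbar_i = X.mean(axis=(1, 2))  # X-bar_{i..}, shape (a,)
xbar_j = X.mean(axis=(0, 2))  # X-bar_{.j.}, shape (b,)
xbar_ij = X.mean(axis=2)      # X-bar_{ij.}, shape (a, b)

total = ((X - xbar) ** 2).sum()

# First identity: within-i sum plus between-i sum.
within_i = ((X - xbar_i[:, None, None]) ** 2).sum()
between_i = b * c * ((xbar_i - xbar) ** 2).sum()

# Second identity: within-cell, row, column, and interaction terms.
within_cell = ((X - xbar_ij[:, :, None]) ** 2).sum()
rows = b * c * ((xbar_i - xbar) ** 2).sum()
cols = a * c * ((xbar_j - xbar) ** 2).sum()
inter = c * ((xbar_ij - xbar_i[:, None] - xbar_j[None, :] + xbar) ** 2).sum()

print(np.isclose(total, within_i + between_i))
print(np.isclose(total, within_cell + rows + cols + inter))
```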
Let the independent random variables \(Y_{1}, Y_{2}, \ldots, Y_{n}\) have, respectively, the probability density functions \(N\left(\beta x_{i}, \gamma^{2} x_{i}^{2}\right), i=1,2, \ldots, n\), where the given numbers \(x_{1}, x_{2}, \ldots, x_{n}\) are not all equal and no one is zero. Find the maximum likelihood estimators of \(\beta\) and \(\gamma^{2}\).
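One way to organize the computation (a sketch, not the only route) is to note that \(Y_i / x_i\) is \(N(\beta, \gamma^2)\). The log-likelihood is $$ \ln L\left(\beta, \gamma^{2}\right)=-\frac{n}{2} \ln \left(2 \pi \gamma^{2}\right)-\sum_{i=1}^{n} \ln \left|x_{i}\right|-\frac{1}{2 \gamma^{2}} \sum_{i=1}^{n} \frac{\left(y_{i}-\beta x_{i}\right)^{2}}{x_{i}^{2}}, $$ and setting the partial derivatives with respect to \(\beta\) and \(\gamma^{2}\) equal to zero yields $$ \hat{\beta}=\frac{1}{n} \sum_{i=1}^{n} \frac{y_{i}}{x_{i}}, \qquad \hat{\gamma}^{2}=\frac{1}{n} \sum_{i=1}^{n} \frac{\left(y_{i}-\hat{\beta} x_{i}\right)^{2}}{x_{i}^{2}}. $$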
Let \(X_{1}, X_{2}, X_{3}, X_{4}\) be a random sample of size \(n=4\) from the normal distribution \(N(0,1) .\) Show that \(\sum_{i=1}^{4}\left(X_{i}-\bar{X}\right)^{2}\) equals $$ \frac{\left(X_{1}-X_{2}\right)^{2}}{2}+\frac{\left[X_{3}-\left(X_{1}+X_{2}\right) / 2\right]^{2}}{3 / 2}+\frac{\left[X_{4}-\left(X_{1}+X_{2}+X_{3}\right) / 3\right]^{2}}{4 / 3} $$ and argue that these three terms are independent, each with a chi-square distribution with 1 degree of freedom.
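The stated equality is a purely algebraic (Helmert-type) decomposition, so it holds for any four numbers and can be verified numerically before making the independence argument. A sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
x1, x2, x3, x4 = rng.normal(size=4)
xbar = (x1 + x2 + x3 + x4) / 4

# Left side: sum of squared deviations from the sample mean.
lhs = sum((xi - xbar) ** 2 for xi in (x1, x2, x3, x4))

# Right side: each term compares the next observation with the
# mean of the preceding ones, scaled so the pieces add up exactly.
rhs = ((x1 - x2) ** 2 / 2
       + (x3 - (x1 + x2) / 2) ** 2 / (3 / 2)
       + (x4 - (x1 + x2 + x3) / 3) ** 2 / (4 / 3))

print(np.isclose(lhs, rhs))  # True: the identity is algebraic
```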
The driver of a diesel-powered automobile decided to test the quality of three types of diesel fuel sold in the area based on mpg. Test the null hypothesis that the three means are equal using the following data. Make the usual assumptions and take \(\alpha=0.05\). $$ \begin{array}{llllll} \text { Brand A: } & 38.7 & 39.2 & 40.1 & 38.9 & \\ \text { Brand B: } & 41.9 & 42.3 & 41.3 & & \\ \text { Brand C: } & 40.8 & 41.2 & 39.5 & 38.9 & 40.3 \end{array} $$
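Under the usual normal-theory assumptions this is a one-way ANOVA \(F\)-test with \(k-1 = 2\) and \(n-k = 9\) degrees of freedom. A minimal sketch of the computation:

```python
import numpy as np

# One-way ANOVA for the three fuel brands (H0: equal mean mpg).
groups = [
    np.array([38.7, 39.2, 40.1, 38.9]),        # Brand A
    np.array([41.9, 42.3, 41.3]),              # Brand B
    np.array([40.8, 41.2, 39.5, 38.9, 40.3]),  # Brand C
]
n = sum(len(g) for g in groups)  # 12 observations
k = len(groups)                  # 3 groups
grand = np.concatenate(groups).mean()

# Between-group and within-group sums of squares.
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)

F = (ss_between / (k - 1)) / (ss_within / (n - k))
print(round(F, 3))  # about 10.224
```

Since \(F \approx 10.22\) exceeds the 0.05 critical value \(F_{0.05}(2, 9) \approx 4.26\), the null hypothesis of equal means is rejected.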