Chapter 9: Problem 12
Fit \(y=a+x\) to the data $$ \begin{array}{l|lll} \mathrm{x} & 0 & 1 & 2 \\ \hline \mathrm{y} & 1 & 3 & 4 \end{array} $$ by the method of least squares.
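Since the slope is fixed at 1, the least squares criterion \(\sum_{i}\left(y_{i}-a-x_{i}\right)^{2}\) is minimized over the single parameter \(a\), giving \(\hat{a}=\overline{y-x}\). The following Python sketch (not part of the text) checks this numerically for the data above.

```python
import numpy as np

# Data from the problem.
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 4.0])

# With the slope fixed at 1, minimizing sum((y - a - x)**2) over a alone
# gives a_hat = mean(y - x).
a_hat = np.mean(y - x)

# Cross-check by evaluating the sum of squares on a grid around a_hat.
grid = np.linspace(a_hat - 1.0, a_hat + 1.0, 2001)
sse = ((y[None, :] - grid[:, None] - x[None, :]) ** 2).sum(axis=1)
print(a_hat, grid[np.argmin(sse)])  # both approximately 5/3
```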
If \(A_{1}, A_{2}, \ldots, A_{k}\) are events, prove, by induction, Boole's inequality $$ P\left(A_{1} \cup A_{2} \cup \cdots \cup A_{k}\right) \leq \sum_{1}^{k} P\left(A_{i}\right) $$ Then show that $$ P\left(A_{1}^{c} \cap A_{2}^{c} \cap \cdots \cap A_{k}^{c}\right) \geq 1-\sum_{1}^{k} P\left(A_{i}\right) $$
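A brief sketch of the inductive step (a standard argument, included only as a pointer): assuming the inequality holds for \(k\) events, $$ P\left(\bigcup_{i=1}^{k+1} A_{i}\right)=P\left(\left(\bigcup_{i=1}^{k} A_{i}\right) \cup A_{k+1}\right) \leq P\left(\bigcup_{i=1}^{k} A_{i}\right)+P\left(A_{k+1}\right) \leq \sum_{i=1}^{k+1} P\left(A_{i}\right), $$ where the first inequality uses \(P(B \cup C)=P(B)+P(C)-P(B \cap C) \leq P(B)+P(C)\). The second statement then follows from De Morgan's laws, since \(A_{1}^{c} \cap \cdots \cap A_{k}^{c}=\left(A_{1} \cup \cdots \cup A_{k}\right)^{c}\).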
Show that the square of a noncentral \(T\) random variable is a noncentral \(F\) random variable.
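A quick Monte Carlo check of this fact (an illustration, not a proof; the degrees of freedom and noncentrality values below are arbitrary): if \(T\) is noncentral \(t\) with \(r\) degrees of freedom and noncentrality \(\delta\), then \(T^{2}\) should behave like a noncentral \(F\) with \((1, r)\) degrees of freedom and noncentrality \(\delta^{2}\).

```python
import numpy as np
from scipy import stats

# Monte Carlo illustration: square samples from a noncentral t and compare
# with the corresponding noncentral F distribution.
rng = np.random.default_rng(0)
df, delta, n = 7, 1.5, 200_000          # arbitrary illustrative values

t_samples = stats.nct.rvs(df, delta, size=n, random_state=rng)
f_samples = t_samples ** 2

# Kolmogorov-Smirnov statistic against the noncentral F(1, df) cdf with
# noncentrality delta**2; it should be close to zero.
ks = stats.kstest(f_samples, stats.ncf(1, df, delta**2).cdf)
print(ks.statistic)
```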
Using the notation of Section 9.2, assume that the means \(\mu_{j}\) satisfy a linear function of \(j\), namely \(\mu_{j}=c+d[j-(b+1)/2]\). Let independent random samples of size \(a\) be taken from the \(b\) normal distributions having means \(\mu_{1}, \mu_{2}, \ldots, \mu_{b}\), respectively, and common unknown variance \(\sigma^{2}\). (a) Show that the maximum likelihood estimators of \(c\) and \(d\) are, respectively, \(\hat{c}=\bar{X}_{..}\) and $$ \hat{d}=\frac{\sum_{j=1}^{b}[j-(b+1)/2]\left(\bar{X}_{. j}-\bar{X}_{..}\right)}{\sum_{j=1}^{b}[j-(b+1)/2]^{2}} $$ (b) Show that $$ \begin{aligned} \sum_{i=1}^{a} \sum_{j=1}^{b}\left(X_{i j}-\bar{X}_{..}\right)^{2} &=\sum_{i=1}^{a} \sum_{j=1}^{b}\left[X_{i j}-\bar{X}_{..}-\hat{d}\left(j-\frac{b+1}{2}\right)\right]^{2} \\ &\quad+\hat{d}^{2} \sum_{j=1}^{b} a\left(j-\frac{b+1}{2}\right)^{2} \end{aligned} $$ (c) Argue that the two terms in the right-hand member of Part (b), once divided by \(\sigma^{2}\), are independent random variables with \(\chi^{2}\) distributions, provided that \(d=0\). (d) What \(F\)-statistic would be used to test the equality of the means, that is, \(H_{0}: d=0\)?
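A simulation sketch (with arbitrary illustrative values of \(a\), \(b\), \(c\), \(d\), and \(\sigma\)) that compares the closed-form estimators in Part (a) with an ordinary least-squares fit of the column means on the centered index \(j-(b+1)/2\), and checks the decomposition in Part (b) numerically:

```python
import numpy as np

# Arbitrary illustrative values of a, b, c, d, and sigma.
rng = np.random.default_rng(1)
a, b = 6, 5
c, d, sigma = 10.0, 0.8, 2.0

t = np.arange(1, b + 1) - (b + 1) / 2        # the centered index j - (b+1)/2
mu = c + d * t
X = rng.normal(loc=mu, scale=sigma, size=(a, b))

col_means = X.mean(axis=0)                   # X-bar_{.j}
grand_mean = X.mean()                        # X-bar_{..}

# Closed-form estimators from Part (a).
c_hat = grand_mean
d_hat = np.sum(t * (col_means - grand_mean)) / np.sum(t ** 2)

# They agree with a least-squares line fitted to the column means.
slope, intercept = np.polyfit(t, col_means, 1)
print(c_hat, intercept)
print(d_hat, slope)

# Numerical check of the sum-of-squares decomposition in Part (b).
lhs = np.sum((X - grand_mean) ** 2)
rhs = np.sum((X - grand_mean - d_hat * t) ** 2) + d_hat ** 2 * a * np.sum(t ** 2)
print(np.isclose(lhs, rhs))
```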
Let \(A\) be the real symmetric matrix of a quadratic form \(Q\) in the observations of a random sample of size \(n\) from a distribution which is \(N\left(0, \sigma^{2}\right)\). Given that \(Q\) and the mean \(\bar{X}\) of the sample are independent, what can be said of the elements of each row (column) of \(A\)? Hint: Are \(Q\) and \(\bar{X}^{2}\) independent?
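An illustration of the hint (not a proof): take \(A\) to be the centering matrix \(I-\frac{1}{n} J\), whose rows each sum to zero; then \(Q=\sum\left(X_{i}-\bar{X}\right)^{2}\), and simulated values of \(Q\) and \(\bar{X}^{2}\) show essentially no correlation.

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps, sigma = 5, 100_000, 1.0             # arbitrary illustrative values

# Centering matrix: each row (and column) sums to zero.
A = np.eye(n) - np.ones((n, n)) / n
X = rng.normal(0.0, sigma, size=(reps, n))

Q = np.einsum("ri,ij,rj->r", X, A, X)        # quadratic form X'AX, one value per sample
xbar = X.mean(axis=1)

print(A.sum(axis=1))                         # row sums: all zero (up to rounding)
print(np.corrcoef(Q, xbar ** 2)[0, 1])       # near zero, consistent with independence
```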
Students' scores on the mathematics portion of the ACT examination, \(x\), and on the final examination in first-semester calculus (200 points possible), \(y\), are given. (a) Calculate the least squares regression line for these data. (b) Plot the points and the least squares regression line on the same graph. (c) Find point estimates for \(\alpha, \beta\), and \(\sigma^{2}\). (d) Find 95 percent confidence intervals for \(\alpha\) and \(\beta\) under the usual assumptions. $$ \begin{array}{cc|cc} \hline \mathrm{x} & \mathrm{y} & \mathrm{x} & \mathrm{y} \\ \hline 25 & 138 & 20 & 100 \\ 20 & 84 & 25 & 143 \\ 26 & 104 & 26 & 141 \\ 26 & 112 & 28 & 161 \\ 28 & 88 & 25 & 124 \\ 28 & 132 & 31 & 118 \\ 29 & 90 & 30 & 168 \\ 32 & 183 & & \\ \hline \end{array} $$
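A computational sketch of Parts (a), (c), and (d) in Python, assuming the centered parametrization \(y=\alpha+\beta(x-\bar{x})\); if the text's model is \(y=\alpha+\beta x\), only the intercept estimate and its interval change.

```python
import numpy as np
from scipy import stats

# Data from the table (x = ACT math score, y = calculus final score).
x = np.array([25, 20, 26, 26, 28, 28, 29, 32, 20, 25, 26, 28, 25, 31, 30], float)
y = np.array([138, 84, 104, 112, 88, 132, 90, 183, 100, 143, 141, 161, 124, 118, 168], float)

n = len(x)
xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)

# Least squares estimates under y = alpha + beta * (x - xbar).
beta_hat = np.sum((x - xbar) * (y - y.mean())) / Sxx
alpha_hat = y.mean()
resid = y - alpha_hat - beta_hat * (x - xbar)
sigma2_hat = np.sum(resid ** 2) / n          # maximum likelihood estimate of sigma^2

# 95 percent confidence intervals based on the t distribution with n - 2 df.
s2 = np.sum(resid ** 2) / (n - 2)
t_crit = stats.t.ppf(0.975, n - 2)
alpha_ci = alpha_hat + np.array([-1.0, 1.0]) * t_crit * np.sqrt(s2 / n)
beta_ci = beta_hat + np.array([-1.0, 1.0]) * t_crit * np.sqrt(s2 / Sxx)
print(alpha_hat, beta_hat, sigma2_hat)
print(alpha_ci, beta_ci)
```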