Chapter 9: Problem 13
Fit by the method of least squares the plane \(z=a+b x+c y\) to the five points \((x, y, z):(-1,-2,5),(0,-2,4),(0,0,4),(1,0,2),(2,1,0)\).
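The normal equations for the plane follow from setting the partial derivatives of \(\sum\left(z_{i}-a-b x_{i}-c y_{i}\right)^{2}\) with respect to \(a, b, c\) to zero. A short sketch in Python (using exact rational arithmetic so no rounding enters) builds and solves those equations for the five given points:

```python
from fractions import Fraction

# Least-squares plane z = a + b*x + c*y through the five given points.
# The normal equations come from minimizing sum (z_i - a - b*x_i - c*y_i)^2.
pts = [(-1, -2, 5), (0, -2, 4), (0, 0, 4), (1, 0, 2), (2, 1, 0)]
n = len(pts)

Sx  = sum(x for x, y, z in pts)
Sy  = sum(y for x, y, z in pts)
Sz  = sum(z for x, y, z in pts)
Sxx = sum(x * x for x, y, z in pts)
Syy = sum(y * y for x, y, z in pts)
Sxy = sum(x * y for x, y, z in pts)
Sxz = sum(x * z for x, y, z in pts)
Syz = sum(y * z for x, y, z in pts)

# Normal equations (rows: d/da, d/db, d/dc set to zero), augmented form:
#   n*a  + Sx*b  + Sy*c  = Sz
#   Sx*a + Sxx*b + Sxy*c = Sxz
#   Sy*a + Sxy*b + Syy*c = Syz
A = [[Fraction(n),  Fraction(Sx),  Fraction(Sy),  Fraction(Sz)],
     [Fraction(Sx), Fraction(Sxx), Fraction(Sxy), Fraction(Sxz)],
     [Fraction(Sy), Fraction(Sxy), Fraction(Syy), Fraction(Syz)]]

# Gauss-Jordan elimination with exact rational arithmetic.
for i in range(3):
    piv = A[i][i]
    A[i] = [v / piv for v in A[i]]
    for r in range(3):
        if r != i:
            A[r] = [vr - A[r][i] * vi for vr, vi in zip(A[r], A[i])]

a, b, c = A[0][3], A[1][3], A[2][3]
print(a, b, c)  # 48/13 -45/26 0
```

The elimination gives \(a = 48/13\), \(b = -45/26\), \(c = 0\), so the fitted plane is \(z = \tfrac{48}{13} - \tfrac{45}{26}x\); for these particular points the \(y\)-coefficient vanishes.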
Let \(A\) be the real symmetric matrix of a quadratic form \(Q\) in the observations of a random sample of size \(n\) from a distribution that is \(N\left(0, \sigma^{2}\right)\). Given that \(Q\) and the mean \(\bar{X}\) of the sample are independent, what can be said of the elements of each row (column) of \(\boldsymbol{A}\)? Hint: Are \(Q\) and \(\bar{X}^{2}\) independent?
Let \(\mu_{1}, \mu_{2}, \mu_{3}\) be, respectively, the means of three normal distributions with a common but unknown variance \(\sigma^{2}\). To test, at the \(\alpha = 0.05\) significance level, the hypothesis \(H_{0}: \mu_{1}=\mu_{2}=\mu_{3}\) against all possible alternative hypotheses, we take an independent random sample of size 4 from each of these distributions. Determine whether we accept or reject \(H_{0}\) if the observed values from these three distributions are, respectively, $$ \begin{array}{lrrrr} X_{1}: & 5 & 9 & 6 & 8 \\ X_{2}: & 11 & 13 & 10 & 12 \\ X_{3}: & 10 & 6 & 9 & 9 \end{array} $$
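The test statistic is the usual one-way analysis-of-variance \(F\) ratio: the between-group mean square divided by the within-group mean square, with \(k-1 = 2\) and \(n-k = 9\) degrees of freedom. A from-first-principles sketch (no statistics library; the critical value \(F_{0.05}(2, 9) \approx 4.26\) is taken from a standard \(F\) table):

```python
# One-way ANOVA F test for H0: mu1 = mu2 = mu3, computed directly
# from the definitions of the between- and within-group sums of squares.
samples = [
    [5, 9, 6, 8],      # X1
    [11, 13, 10, 12],  # X2
    [10, 6, 9, 9],     # X3
]
k = len(samples)                   # number of groups
n = sum(len(s) for s in samples)   # total number of observations
grand = sum(sum(s) for s in samples) / n

ss_between = sum(len(s) * (sum(s) / len(s) - grand) ** 2 for s in samples)
ss_within = sum((x - sum(s) / len(s)) ** 2 for s in samples for x in s)

F = (ss_between / (k - 1)) / (ss_within / (n - k))
print(round(F, 3))   # 7.875
print(F > 4.26)      # True -> reject H0 at the 5% level
```

Here the between sum of squares is 42 and the within sum of squares is 24, giving \(F = 21/(24/9) = 7.875 > 4.26\), so \(H_{0}\) is rejected at the 5 percent level.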
Show that \(\sum_{i=1}^{n}\left[Y_{i}-\alpha-\beta\left(x_{i}-\bar{x}\right)\right]^{2}=n(\hat{\alpha}-\alpha)^{2}+(\hat{\beta}-\beta)^{2} \sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)^{2}+\sum_{i=1}^{n}\left[Y_{i}-\hat{\alpha}-\hat{\beta}\left(x_{i}-\bar{x}\right)\right]^{2} .\)
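The identity follows by writing \(Y_{i}-\alpha-\beta\left(x_{i}-\bar{x}\right) = (\hat{\alpha}-\alpha) + (\hat{\beta}-\beta)\left(x_{i}-\bar{x}\right) + \left[Y_{i}-\hat{\alpha}-\hat{\beta}\left(x_{i}-\bar{x}\right)\right]\) and noting that the cross terms vanish, since the residuals are orthogonal to both the constant and \(x_{i}-\bar{x}\). A numerical sanity check, with made-up data \((x_i, Y_i)\) and arbitrary "true" values of \(\alpha\) and \(\beta\) (the identity holds for any choice):

```python
# Numerical check of the sum-of-squares decomposition for the centered
# regression model Y_i = alpha + beta*(x_i - xbar) + error.
x = [1.0, 2.0, 4.0, 7.0, 11.0]   # made-up predictor values
Y = [2.3, 3.1, 5.0, 8.2, 12.7]   # made-up responses
alpha, beta = 1.0, 0.5           # arbitrary "true" parameter values
n = len(x)
xbar = sum(x) / n

# Least-squares estimates: alpha_hat = Ybar, beta_hat = Sxy / Sxx
alpha_hat = sum(Y) / n
beta_hat = (sum((xi - xbar) * yi for xi, yi in zip(x, Y))
            / sum((xi - xbar) ** 2 for xi in x))

lhs = sum((yi - alpha - beta * (xi - xbar)) ** 2 for xi, yi in zip(x, Y))
rhs = (n * (alpha_hat - alpha) ** 2
       + (beta_hat - beta) ** 2 * sum((xi - xbar) ** 2 for xi in x)
       + sum((yi - alpha_hat - beta_hat * (xi - xbar)) ** 2
             for xi, yi in zip(x, Y)))
print(abs(lhs - rhs) < 1e-9)  # True
```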
Using the notation of this section, assume that the means satisfy the condition that \(\mu=\mu_{1}+(b-1) d=\mu_{2}-d=\mu_{3}-d=\cdots=\mu_{b}-d\). That is, the last \(b-1\) means are equal but differ from the first mean \(\mu_{1}\), provided that \(d \neq 0\). Let independent random samples of size \(a\) be taken from the \(b\) normal distributions with common unknown variance \(\sigma^{2}\). (a) Show that the maximum likelihood estimators of \(\mu\) and \(d\) are \(\hat{\mu}=\bar{X}_{..}\) and $$ \hat{d}=\frac{\sum_{j=2}^{b} \bar{X}_{. j} /(b-1)-\bar{X}_{.1}}{b}. $$ (b) Using Exercise 9.1.3, find \(Q_{6}\) and \(Q_{7}=c \hat{d}^{2}\) so that, when \(d=0\), \(Q_{7} / \sigma^{2}\) is \(\chi^{2}(1)\) and $$ \sum_{i=1}^{a} \sum_{j=1}^{b}\left(X_{i j}-\bar{X}_{. .}\right)^{2}=Q_{3}+Q_{6}+Q_{7}. $$ (c) Argue that the three terms on the right-hand side of part (b), once divided by \(\sigma^{2}\), are independent random variables with chi-square distributions, provided that \(d=0\). (d) The ratio \(Q_{7} /\left(Q_{3}+Q_{6}\right)\) times what constant has an \(F\)-distribution, provided that \(d=0\)? Note that this \(F\) is really the square of the two-sample \(T\) used to test the equality of the mean of the first distribution and the common mean of the other distributions, in which the last \(b-1\) samples are combined into one.
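For part (a), the closed-form estimators can be checked numerically against a direct least-squares solve of the two-parameter model, in which group 1 has mean \(\mu-(b-1) d\) and groups \(2, \ldots, b\) have mean \(\mu+d\). A sketch with made-up data (the variable names `mu_hat` and `d_hat` are mine):

```python
# Check the claimed MLEs mu_hat = Xbar.. and the d_hat formula against
# a direct least-squares fit of the two-parameter model.
data = [
    [4.1, 5.2, 3.8],  # group 1, mean mu - (b-1)d
    [6.0, 7.1, 6.4],  # group 2, mean mu + d
    [6.6, 5.9, 7.0],  # group 3, mean mu + d
    [6.2, 6.8, 6.1],  # group 4, mean mu + d
]
b = len(data)
a = len(data[0])
means = [sum(g) / a for g in data]

# Closed-form estimators from the exercise
mu_hat = sum(sum(g) for g in data) / (a * b)              # grand mean Xbar..
d_hat = (sum(means[1:]) / (b - 1) - means[0]) / b

# Direct solve of the 2x2 normal equations for (mu, d).
# The coefficient of d is -(b-1) for group-1 observations and +1 otherwise.
rows = ([(1.0, -(b - 1.0), x) for x in data[0]]
        + [(1.0, 1.0, x) for g in data[1:] for x in g])
S11 = sum(r[0] * r[0] for r in rows); S12 = sum(r[0] * r[1] for r in rows)
S22 = sum(r[1] * r[1] for r in rows)
T1 = sum(r[0] * r[2] for r in rows); T2 = sum(r[1] * r[2] for r in rows)
det = S11 * S22 - S12 * S12
mu_ls = (S22 * T1 - S12 * T2) / det
d_ls = (S11 * T2 - S12 * T1) / det
print(abs(mu_hat - mu_ls) < 1e-9, abs(d_hat - d_ls) < 1e-9)  # True True
```

The agreement is exact in theory because the two columns of the design are orthogonal (\(S_{12} = -a(b-1) + a(b-1) = 0\)), which is why \(\hat{\mu}\) reduces to the grand mean.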
Let \(X_{i j k}\), \(i=1, \ldots, a\); \(j=1, \ldots, b\); \(k=1, \ldots, c\), be a random sample of size \(n=a b c\) from a normal distribution \(N\left(\mu, \sigma^{2}\right)\). Let \(\bar{X}_{\ldots}=\sum_{k=1}^{c} \sum_{j=1}^{b} \sum_{i=1}^{a} X_{i j k} / n\) and \(\bar{X}_{i . .}=\sum_{k=1}^{c} \sum_{j=1}^{b} X_{i j k} / b c\). Prove that $$ \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{c}\left(X_{i j k}-\bar{X}_{\ldots}\right)^{2}=\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{c}\left(X_{i j k}-\bar{X}_{i . .}\right)^{2}+b c \sum_{i=1}^{a}\left(\bar{X}_{i . .}-\bar{X}_{\ldots}\right)^{2}. $$ Show that \(\sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{c}\left(X_{i j k}-\bar{X}_{i . .}\right)^{2} / \sigma^{2}\) has a chi-square distribution with \(a(b c-1)\) degrees of freedom. Prove that the two terms in the right-hand member are independent. What, then, is the distribution of \(b c \sum_{i=1}^{a}\left(\bar{X}_{i . .}-\bar{X}_{\ldots}\right)^{2} / \sigma^{2}\)? Furthermore, let \(\bar{X}_{. j .}=\sum_{k=1}^{c} \sum_{i=1}^{a} X_{i j k} / a c\) and \(\bar{X}_{i j .}=\sum_{k=1}^{c} X_{i j k} / c\). Show that $$ \begin{aligned} \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{c}\left(X_{i j k}-\bar{X}_{\ldots}\right)^{2}=& \sum_{i=1}^{a} \sum_{j=1}^{b} \sum_{k=1}^{c}\left(X_{i j k}-\bar{X}_{i j .}\right)^{2} \\ &+b c \sum_{i=1}^{a}\left(\bar{X}_{i . .}-\bar{X}_{\ldots}\right)^{2}+a c \sum_{j=1}^{b}\left(\bar{X}_{. j .}-\bar{X}_{\ldots}\right)^{2} \\ &+c \sum_{i=1}^{a} \sum_{j=1}^{b}\left(\bar{X}_{i j .}-\bar{X}_{i . .}-\bar{X}_{. j .}+\bar{X}_{\ldots}\right)^{2} \end{aligned} $$ Prove that the four terms in the right-hand member, when divided by \(\sigma^{2}\), are independent chi-square variables with \(a b(c-1)\), \(a-1\), \(b-1\), and \((a-1)(b-1)\) degrees of freedom, respectively.
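Both decompositions are exact algebraic identities, so they can be verified on any array of numbers before tackling the distributional claims. A sketch with a small made-up \(a \times b \times c\) array:

```python
# Numerical check of both sum-of-squares identities on a made-up
# a x b x c array of observations (any values work; these are arbitrary).
a, b, c = 2, 3, 2
X = [[[float((i + 1) * (j + 2) + k + 0.5 * i * k) for k in range(c)]
      for j in range(b)] for i in range(a)]
n = a * b * c

grand = sum(X[i][j][k] for i in range(a) for j in range(b) for k in range(c)) / n
Xi = [sum(X[i][j][k] for j in range(b) for k in range(c)) / (b * c)   # Xbar_i..
      for i in range(a)]
Xj = [sum(X[i][j][k] for i in range(a) for k in range(c)) / (a * c)   # Xbar_.j.
      for j in range(b)]
Xij = [[sum(X[i][j][k] for k in range(c)) / c for j in range(b)]      # Xbar_ij.
       for i in range(a)]

total = sum((X[i][j][k] - grand) ** 2
            for i in range(a) for j in range(b) for k in range(c))

# First identity: total = within-rows + between-rows
within_rows = sum((X[i][j][k] - Xi[i]) ** 2
                  for i in range(a) for j in range(b) for k in range(c))
between_rows = b * c * sum((Xi[i] - grand) ** 2 for i in range(a))
print(abs(total - (within_rows + between_rows)) < 1e-9)  # True

# Second identity: total = within-cells + rows + columns + interaction
within_cells = sum((X[i][j][k] - Xij[i][j]) ** 2
                   for i in range(a) for j in range(b) for k in range(c))
cols = a * c * sum((Xj[j] - grand) ** 2 for j in range(b))
inter = c * sum((Xij[i][j] - Xi[i] - Xj[j] + grand) ** 2
                for i in range(a) for j in range(b))
print(abs(total - (within_cells + between_rows + cols + inter)) < 1e-9)  # True
```

This confirms the arithmetic of the identities; the independence and chi-square claims, of course, require the normality argument in the text.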