Chapter 9: Problem 13
Fit by the method of least squares the plane \(z=a+b x+c y\) to the five points \((x, y, z):(-1,-2,5),(0,-2,4),(0,0,4),(1,0,2),(2,1,0)\).
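A sketch of the standard least-squares route: minimizing \(\sum_{i=1}^{5}\left[z_{i}-\left(a+b x_{i}+c y_{i}\right)\right]^{2}\) over \(a, b, c\) gives the normal equations
$$
\begin{aligned}
n a+b \sum x_{i}+c \sum y_{i} &=\sum z_{i}, \\
a \sum x_{i}+b \sum x_{i}^{2}+c \sum x_{i} y_{i} &=\sum x_{i} z_{i}, \\
a \sum y_{i}+b \sum x_{i} y_{i}+c \sum y_{i}^{2} &=\sum y_{i} z_{i} .
\end{aligned}
$$
For the five points, \(n=5\), \(\sum x_{i}=2\), \(\sum y_{i}=-3\), \(\sum z_{i}=15\), \(\sum x_{i}^{2}=6\), \(\sum y_{i}^{2}=9\), \(\sum x_{i} y_{i}=4\), \(\sum x_{i} z_{i}=-3\), and \(\sum y_{i} z_{i}=-18\), so the system is
$$
5 a+2 b-3 c=15, \qquad 2 a+6 b+4 c=-3, \qquad -3 a+4 b+9 c=-18,
$$
whose solution is \(a=48 / 13\), \(b=-45 / 26\), \(c=0\). The fitted plane is therefore \(z=\frac{48}{13}-\frac{45}{26} x\).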
Let \(X_{1}\) and \(X_{2}\) be two independent random variables. Let \(X_{1}\) and \(Y=X_{1}+X_{2}\) be \(\chi^{2}\left(r_{1}, \theta_{1}\right)\) and \(\chi^{2}(r, \theta)\), respectively. Here \(r_{1}<r\) and \(\theta_{1} \leq \theta\). Show that \(X_{2}\) is \(\chi^{2}\left(r-r_{1}, \theta-\theta_{1}\right)\).
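Assuming the truncated statement asks, as the setup suggests, to show that \(X_{2}\) is \(\chi^{2}\left(r-r_{1}, \theta-\theta_{1}\right)\), here is a sketch via moment generating functions. A \(\chi^{2}(r, \theta)\) variable has mgf \((1-2 t)^{-r / 2} \exp [\theta t /(1-2 t)]\) for \(t<1 / 2\), and independence of \(X_{1}\) and \(X_{2}\) gives \(M_{Y}=M_{X_{1}} M_{X_{2}}\), so
$$
M_{X_{2}}(t)=\frac{M_{Y}(t)}{M_{X_{1}}(t)}=(1-2 t)^{-\left(r-r_{1}\right) / 2} \exp \left[\frac{\left(\theta-\theta_{1}\right) t}{1-2 t}\right],
$$
which is the mgf of a \(\chi^{2}\left(r-r_{1}, \theta-\theta_{1}\right)\) distribution.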
Using the background of the two-way classification with one observation per cell, show that the maximum likelihood estimators of \(\alpha_{i}\), \(\beta_{j}\), and \(\mu\) are \(\hat{\alpha}_{i}=\bar{X}_{i .}-\bar{X}_{. .}\), \(\hat{\beta}_{j}=\bar{X}_{. j}-\bar{X}_{. .}\), and \(\hat{\mu}=\bar{X}_{. .}\), respectively. Show that these are unbiased estimators of their respective parameters and compute \(\operatorname{var}\left(\hat{\alpha}_{i}\right)\), \(\operatorname{var}\left(\hat{\beta}_{j}\right)\), and \(\operatorname{var}(\hat{\mu})\).
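For the variances, a sketch assuming \(a\) levels of the row factor and \(b\) of the column factor (so \(\bar{X}_{i .}\) averages \(b\) independent observations with variance \(\sigma^{2}\) each, and \(\bar{X}_{. .}\) averages all \(a b\)): \(\operatorname{var}\left(\bar{X}_{i .}\right)=\sigma^{2} / b\), \(\operatorname{var}\left(\bar{X}_{. .}\right)=\sigma^{2} /(a b)\), and \(\operatorname{cov}\left(\bar{X}_{i .}, \bar{X}_{. .}\right)=\sigma^{2} /(a b)\), so
$$
\operatorname{var}\left(\hat{\alpha}_{i}\right)=\frac{\sigma^{2}}{b}+\frac{\sigma^{2}}{a b}-\frac{2 \sigma^{2}}{a b}=\frac{(a-1) \sigma^{2}}{a b}, \qquad \operatorname{var}\left(\hat{\beta}_{j}\right)=\frac{(b-1) \sigma^{2}}{a b}, \qquad \operatorname{var}(\hat{\mu})=\frac{\sigma^{2}}{a b} .
$$
Unbiasedness follows from \(E\left(\bar{X}_{i .}\right)=\mu+\alpha_{i}\) and \(E\left(\bar{X}_{. .}\right)=\mu\), using the side conditions \(\sum \alpha_{i}=\sum \beta_{j}=0\).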
Let the independent random variables \(Y_{1}, \ldots, Y_{n}\) have the joint pdf
$$
L\left(\alpha, \beta, \sigma^{2}\right)=\left(\frac{1}{2 \pi \sigma^{2}}\right)^{n / 2} \exp \left\{-\frac{1}{2 \sigma^{2}} \sum_{1}^{n}\left[y_{i}-\alpha-\beta\left(x_{i}-\bar{x}\right)\right]^{2}\right\},
$$
where the given numbers \(x_{1}, x_{2}, \ldots, x_{n}\) are not all equal. Let \(H_{0}: \beta=0\) (\(\alpha\) and \(\sigma^{2}\) unspecified). It is desired to use a likelihood ratio test to test \(H_{0}\) against all possible alternatives. Find \(\Lambda\) and see whether the test can be based on a familiar statistic. Hint: In the notation of this section, show that
$$
\sum_{1}^{n}\left(Y_{i}-\hat{\alpha}\right)^{2}=Q_{3}+\hat{\beta}^{2} \sum_{1}^{n}\left(x_{i}-\bar{x}\right)^{2} .
$$
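A sketch of where the hint leads, writing \(Q_{3}=\sum_{1}^{n}\left[Y_{i}-\hat{\alpha}-\hat{\beta}\left(x_{i}-\bar{x}\right)\right]^{2}\) (my assumption for this section's notation): the maximized likelihoods give
$$
\Lambda=\left[\frac{Q_{3}}{Q_{3}+\hat{\beta}^{2} \sum_{1}^{n}\left(x_{i}-\bar{x}\right)^{2}}\right]^{n / 2},
$$
and \(\Lambda \leq \lambda_{0}\) is equivalent to \(T^{2} \geq c\), where
$$
T=\frac{\hat{\beta} \sqrt{\sum_{1}^{n}\left(x_{i}-\bar{x}\right)^{2}}}{\sqrt{Q_{3} /(n-2)}}
$$
has a \(t(n-2)\) distribution under \(H_{0}\). The likelihood ratio test is therefore the familiar two-sided \(t\)-test of \(\beta=0\).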
Let \(X_{1}, X_{2}, X_{3}, X_{4}\) denote a random sample of size 4 from a distribution which is \(N\left(0, \sigma^{2}\right)\). Let \(Y=\sum_{1}^{4} a_{i} X_{i}\), where \(a_{1}, a_{2}, a_{3}\), and \(a_{4}\) are real constants. If \(Y^{2}\) and \(Q=X_{1} X_{2}-X_{3} X_{4}\) are independent, determine \(a_{1}, a_{2}, a_{3}\), and \(a_{4}\).
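A sketch via the independence criterion for quadratic forms (for \(\mathbf{X} \sim N\left(\mathbf{0}, \sigma^{2} \mathbf{I}\right)\) and symmetric \(\mathbf{A}\), \(\mathbf{B}\), the forms \(\mathbf{X}^{\prime} \mathbf{A} \mathbf{X}\) and \(\mathbf{X}^{\prime} \mathbf{B} \mathbf{X}\) are independent if and only if \(\mathbf{A B}=\mathbf{0}\)): write \(Y^{2}=\mathbf{X}^{\prime} \mathbf{a} \mathbf{a}^{\prime} \mathbf{X}\) with \(\mathbf{a}=\left(a_{1}, a_{2}, a_{3}, a_{4}\right)^{\prime}\) and \(Q=\mathbf{X}^{\prime} \mathbf{A} \mathbf{X}\), where
$$
\mathbf{A}=\frac{1}{2}\left(\begin{array}{rrrr}
0 & 1 & 0 & 0 \\
1 & 0 & 0 & 0 \\
0 & 0 & 0 & -1 \\
0 & 0 & -1 & 0
\end{array}\right).
$$
Then \(\left(\mathbf{a} \mathbf{a}^{\prime}\right) \mathbf{A}=\mathbf{a}(\mathbf{A} \mathbf{a})^{\prime}=\mathbf{0}\) requires \(\mathbf{A} \mathbf{a}=\frac{1}{2}\left(a_{2}, a_{1},-a_{4},-a_{3}\right)^{\prime}=\mathbf{0}\) unless \(\mathbf{a}=\mathbf{0}\); either way the conclusion is \(a_{1}=a_{2}=a_{3}=a_{4}=0\).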
Assume that the sample \(\left(x_{1}, Y_{1}\right), \ldots,\left(x_{n}, Y_{n}\right)\) follows the linear model (9.6.1). Suppose \(Y_{0}\) is a future observation at \(x=x_{0}-\bar{x}\) and we want to determine a predictive interval for it. Assume that the model (9.6.1) holds for \(Y_{0}\); i.e., \(Y_{0}\) has a \(N\left(\alpha+\beta\left(x_{0}-\bar{x}\right), \sigma^{2}\right)\) distribution. We will use \(\hat{\eta}_{0}\) of Exercise 9.6.4 as our prediction of \(Y_{0}\).
(a) Obtain the distribution of \(Y_{0}-\hat{\eta}_{0}\). Use the fact that the future observation \(Y_{0}\) is independent of the sample \(\left(x_{1}, Y_{1}\right), \ldots,\left(x_{n}, Y_{n}\right)\).
(b) Determine a \(t\)-statistic with numerator \(Y_{0}-\hat{\eta}_{0}\).
(c) Now, beginning with \(1-\alpha=P\left[-t_{\alpha / 2, n-2}<T<t_{\alpha / 2, n-2}\right]\), where \(T\) is the \(t\)-statistic of part (b), determine a \((1-\alpha) 100 \%\) predictive interval for \(Y_{0}\).
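A sketch, assuming (from Exercise 9.6.4) that \(\hat{\eta}_{0}=\hat{\alpha}+\hat{\beta}\left(x_{0}-\bar{x}\right)\) and that \(\hat{\sigma}^{2}=\sum_{1}^{n}\left[Y_{i}-\hat{\alpha}-\hat{\beta}\left(x_{i}-\bar{x}\right)\right]^{2} /(n-2)\):
(a) Since \(Y_{0}\) is independent of the sample, the variances add, and
$$
Y_{0}-\hat{\eta}_{0} \sim N\left(0, \sigma^{2}\left[1+\frac{1}{n}+\frac{\left(x_{0}-\bar{x}\right)^{2}}{\sum_{1}^{n}\left(x_{i}-\bar{x}\right)^{2}}\right]\right).
$$
(b) Replacing \(\sigma\) with \(\hat{\sigma}\) gives
$$
T=\frac{Y_{0}-\hat{\eta}_{0}}{\hat{\sigma} \sqrt{1+\frac{1}{n}+\frac{\left(x_{0}-\bar{x}\right)^{2}}{\sum_{1}^{n}\left(x_{i}-\bar{x}\right)^{2}}}} \sim t(n-2).
$$
(c) Inverting \(1-\alpha=P\left[-t_{\alpha / 2, n-2}<T<t_{\alpha / 2, n-2}\right]\) yields the predictive interval \(\hat{\eta}_{0} \pm t_{\alpha / 2, n-2}\, \hat{\sigma} \sqrt{1+\frac{1}{n}+\frac{\left(x_{0}-\bar{x}\right)^{2}}{\sum_{1}^{n}\left(x_{i}-\bar{x}\right)^{2}}}\).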