Chapter 9: Problem 14
Fit \(y=a+x\) to the data $$ \begin{array}{l|lll} x & 0 & 1 & 2 \\ \hline y & 1 & 3 & 4 \end{array} $$ by the method of least squares.
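A minimal worked sketch: with the slope fixed at 1, the only free parameter is \(a\), so least squares reduces to a one-variable minimization.
$$
S(a)=\sum_{i=1}^{3}\left(y_{i}-a-x_{i}\right)^{2}, \qquad S^{\prime}(a)=-2 \sum_{i=1}^{3}\left(y_{i}-a-x_{i}\right)=0 \;\Longrightarrow\; \hat{a}=\bar{y}-\bar{x}=\frac{8}{3}-1=\frac{5}{3},
$$
so the fitted line is \(y=\frac{5}{3}+x\).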
Show that $$ R=\frac{\sum_{1}^{n}\left(X_{i}-\bar{X}\right)\left(Y_{i}-\bar{Y}\right)}{\sqrt{\sum_{1}^{n}\left(X_{i}-\bar{X}\right)^{2} \sum_{1}^{n}\left(Y_{i}-\bar{Y}\right)^{2}}}=\frac{\sum_{1}^{n} X_{i} Y_{i}-n \bar{X} \bar{Y}}{\sqrt{\left(\sum_{1}^{n} X_{i}^{2}-n \bar{X}^{2}\right)\left(\sum_{1}^{n} Y_{i}^{2}-n \bar{Y}^{2}\right)}} $$
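A one-line expansion of the numerator shows why the two forms agree (the denominator factors follow the same pattern):
$$
\sum_{1}^{n}\left(X_{i}-\bar{X}\right)\left(Y_{i}-\bar{Y}\right)=\sum_{1}^{n} X_{i} Y_{i}-\bar{Y} \sum_{1}^{n} X_{i}-\bar{X} \sum_{1}^{n} Y_{i}+n \bar{X} \bar{Y}=\sum_{1}^{n} X_{i} Y_{i}-n \bar{X} \bar{Y},
$$
using \(\sum_{1}^{n} X_{i}=n \bar{X}\) and \(\sum_{1}^{n} Y_{i}=n \bar{Y}\).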
By doing the following steps, determine a \((1-\alpha) 100 \%\) approximate confidence interval for \(\rho\).
(a) For \(0<\alpha<1\), in the usual way, start with \(1-\alpha=P\left(-z_{\alpha / 2}<\sqrt{n-3}(W-\xi)<z_{\alpha / 2}\right)\), where \(W=\frac{1}{2} \log [(1+R) /(1-R)]\) is Fisher's transformation of \(R\) and \(\xi=\frac{1}{2} \log [(1+\rho) /(1-\rho)]\).
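A hedged sketch of where this leads, assuming \(\sqrt{n-3}(W-\xi)\) is approximately \(N(0,1)\): solving the inequalities for \(\xi\) and noting that \(\xi=\tanh^{-1} \rho\), so that \(\rho=\tanh \xi\) is increasing, gives
$$
\tanh \left(W-\frac{z_{\alpha / 2}}{\sqrt{n-3}}\right)<\rho<\tanh \left(W+\frac{z_{\alpha / 2}}{\sqrt{n-3}}\right)
$$
as the approximate \((1-\alpha) 100 \%\) interval.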
Show that the square of a noncentral \(T\) random variable is a noncentral \(F\) random variable.
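A sketch of the argument: write the noncentral \(T\) in its standard form
$$
T=\frac{Z}{\sqrt{V / r}}, \quad Z \sim N(\delta, 1), \quad V \sim \chi^{2}(r), \quad Z \text { and } V \text { independent, }
$$
so that
$$
T^{2}=\frac{Z^{2} / 1}{V / r},
$$
where \(Z^{2}\) is noncentral chi-square with one degree of freedom and noncentrality \(\delta^{2}\); hence \(T^{2}\) has a noncentral \(F(1, r)\) distribution with noncentrality \(\delta^{2}\).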
Let the independent random variables \(Y_{1}, Y_{2}, \ldots, Y_{n}\) have, respectively, the probability density functions \(N\left(\beta x_{i}, \gamma^{2} x_{i}^{2}\right), i=1,2, \ldots, n\), where the given numbers \(x_{1}, x_{2}, \ldots, x_{n}\) are not all equal and no one is zero. Find the maximum likelihood estimators of \(\beta\) and \(\gamma^{2}\).
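A hedged sketch of the computation: since \(Y_{i} / x_{i} \sim N\left(\beta, \gamma^{2}\right)\), the log-likelihood is
$$
\ell\left(\beta, \gamma^{2}\right)=-\frac{n}{2} \log \left(2 \pi \gamma^{2}\right)-\sum_{i=1}^{n} \log \left|x_{i}\right|-\frac{1}{2 \gamma^{2}} \sum_{i=1}^{n}\left(\frac{Y_{i}}{x_{i}}-\beta\right)^{2},
$$
and setting its partial derivatives to zero gives
$$
\hat{\beta}=\frac{1}{n} \sum_{i=1}^{n} \frac{Y_{i}}{x_{i}}, \qquad \hat{\gamma}^{2}=\frac{1}{n} \sum_{i=1}^{n}\left(\frac{Y_{i}}{x_{i}}-\hat{\beta}\right)^{2}.
$$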
Assume that the sample \(\left(x_{1}, Y_{1}\right), \ldots,\left(x_{n},
Y_{n}\right)\) follows the linear model \((9.6 .1)\). Suppose \(Y_{0}\) is a future
observation at \(x=x_{0}-\bar{x}\) and we want to determine a predictive
interval for it. Assume that the model \((9.6 .1)\) holds for \(Y_{0}\); i.e.,
\(Y_{0}\) has a \(N\left(\alpha+\beta\left(x_{0}-\bar{x}\right),
\sigma^{2}\right)\) distribution. We use \(\hat{\eta}_{0}\) of Exercise \(9.6 .6\)
as our prediction of \(Y_{0}\).
(a) Obtain the distribution of \(Y_{0}-\hat{\eta}_{0}\), showing that its
variance is:
$$
V\left(Y_{0}-\hat{\eta}_{0}\right)=\sigma^{2}\left[1+\frac{1}{n}+\frac{\left(x_{0}-\bar{x}\right)^{2}}{\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)^{2}}\right]
$$
Use the fact that the future observation \(Y_{0}\) is independent of the sample
\(\left(x_{1}, Y_{1}\right), \ldots,\left(x_{n}, Y_{n}\right)\).
(b) Determine a \(t\) -statistic with numerator \(Y_{0}-\hat{\eta}_{0}\).
(c) Now beginning with \(1-\alpha=P\left[-t_{\alpha / 2, n-2}<T<t_{\alpha / 2, n-2}\right]\), where \(T\) is the statistic of part (b), determine a \((1-\alpha) 100 \%\) predictive interval for \(Y_{0}\).
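A hedged sketch of where parts (a)-(c) lead, assuming \(\hat{\eta}_{0}=\hat{\alpha}+\hat{\beta}\left(x_{0}-\bar{x}\right)\) and the usual estimator \(\hat{\sigma}^{2}\) with \(n-2\) degrees of freedom: the statistic
$$
T=\frac{Y_{0}-\hat{\eta}_{0}}{\hat{\sigma} \sqrt{1+\frac{1}{n}+\frac{\left(x_{0}-\bar{x}\right)^{2}}{\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)^{2}}}}
$$
has a \(t(n-2)\) distribution, and inverting \(-t_{\alpha / 2, n-2}<T<t_{\alpha / 2, n-2}\) yields the predictive interval
$$
\hat{\eta}_{0} \pm t_{\alpha / 2, n-2}\, \hat{\sigma} \sqrt{1+\frac{1}{n}+\frac{\left(x_{0}-\bar{x}\right)^{2}}{\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)^{2}}}.
$$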