The MSE of an estimator \(\hat \theta \) can be written as,
\(MSE(\hat \theta ) = Var(\hat \theta ) + Bias{(\hat \theta )^2}\)
Since \(\hat \theta \) has been shown to be an unbiased estimator, its bias is 0.
Therefore,
\(\begin{array}{c}MSE(\hat \theta ) = Var(\hat \theta )\\ = Var\left( {5 - 4{{\hat \beta }_0} + {{\hat \beta }_1}} \right)\\ = {\left( { - 4} \right)^2}Var\left( {{{\hat \beta }_0}} \right) + Var\left( {{{\hat \beta }_1}} \right) + 2\left( { - 4} \right)Cov\left( {{{\hat \beta }_0},{{\hat \beta }_1}} \right)\\ = 16Var\left( {{{\hat \beta }_0}} \right) + Var\left( {{{\hat \beta }_1}} \right) - 8Cov\left( {{{\hat \beta }_0},{{\hat \beta }_1}} \right)\end{array}\)
Here the constant 5 contributes nothing to the variance.
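As a quick sanity check on this variance expansion, one can simulate correlated draws standing in for \(({\hat \beta _0},{\hat \beta _1})\) and compare both sides; the mean vector and covariance matrix below are made up for the check, not taken from the exercise.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative joint distribution for (beta0_hat, beta1_hat); these
# numbers are assumptions for the demo, not the exercise's values.
cov = np.array([[0.607, -0.214],
                [-0.214, 0.095]])
b0, b1 = rng.multivariate_normal([2.0, 1.0], cov, size=100_000).T

theta = 5 - 4 * b0 + b1
lhs = theta.var(ddof=1)
rhs = 16 * b0.var(ddof=1) + b1.var(ddof=1) - 8 * np.cov(b0, b1)[0, 1]
# The identity holds exactly for sample moments as well, so the
# difference is only floating-point error.
print(abs(lhs - rhs))
```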
The variances of the estimated parameters \({\hat \beta _0}\) and \({\hat \beta _1}\) are,
\(\begin{array}{l}Var\left( {{{\hat \beta }_0}} \right) = {\sigma ^2}\left( {\frac{1}{n} + \frac{{{{\bar x}^2}}}{{s_x^2}}} \right)\\Var\left( {{{\hat \beta }_1}} \right) = \frac{{{\sigma ^2}}}{{s_x^2}}.\end{array}\)
Also, the covariance of \({\hat \beta _0}\) and \({\hat \beta _1}\) is given as,
\(Cov\left( {{{\hat \beta }_0},{{\hat \beta }_1}} \right) = - \frac{{\bar x{\sigma ^2}}}{{s_x^2}}\)
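These closed-form expressions can be wrapped in a small helper for reuse; the function name and the example design values are hypothetical, chosen only to illustrate the formulas.

```python
def ols_coef_moments(n, xbar, Sxx, sigma2=1.0):
    """Variances of the OLS intercept and slope, and their covariance.

    n: sample size; xbar: sample mean of x; Sxx: sum of squared
    deviations of x about its mean (written s_x^2 in the text);
    sigma2: error variance. All names here are illustrative.
    """
    var_b0 = sigma2 * (1.0 / n + xbar ** 2 / Sxx)
    var_b1 = sigma2 / Sxx
    cov_b0_b1 = -xbar * sigma2 / Sxx
    return var_b0, var_b1, cov_b0_b1

# Example with made-up design values n=10, xbar=2, Sxx=8:
print(ols_coef_moments(10, 2.0, 8.0))  # approximately (0.6, 0.125, -0.25)
```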
From Exercises 12 and 13,
\(\begin{array}{l}Var\left( {{{\hat \beta }_0}} \right) = 0.607{\sigma ^2}\\Var\left( {{{\hat \beta }_1}} \right) = 0.095{\sigma ^2}\\Cov\left( {{{\hat \beta }_0},{{\hat \beta }_1}} \right) = - 0.214{\sigma ^2}\end{array}\)
Substitute the values as follows,
\(\begin{array}{l}MSE(\hat \theta ) = 16\left( {0.607{\sigma ^2}} \right) + 0.095{\sigma ^2} - 8\left( { - 0.214{\sigma ^2}} \right)\\ = 9.712{\sigma ^2} + 0.095{\sigma ^2} + 1.712{\sigma ^2}\\ = 11.519{\sigma ^2}\end{array}\)
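The arithmetic above can be verified in a couple of lines, working in units of \({\sigma ^2}\):

```python
# Values of Var(beta0_hat), Var(beta1_hat), Cov(beta0_hat, beta1_hat)
# from Exercises 12 and 13, expressed in units of sigma^2.
var_b0, var_b1, cov_b0_b1 = 0.607, 0.095, -0.214
mse = 16 * var_b0 + var_b1 - 8 * cov_b0_b1
print(mse)  # approximately 11.519
```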
Hence, the MSE of \(\hat \theta \) is \(MSE(\hat \theta ) = 11.519{\sigma ^2}\).