Chapter 7: Problem 3
Suppose that the random variables \(Y_{1}, \ldots, Y_{n}\) are such that
$$ \mathrm{E}\left(Y_{j}\right)=\mu, \quad \operatorname{var}\left(Y_{j}\right)=\sigma_{j}^{2}, \quad \operatorname{cov}\left(Y_{j}, Y_{k}\right)=0, \quad j \neq k, $$
where \(\mu\) is unknown and the \(\sigma_{j}^{2}\) are known. Show that the linear combination of the \(Y_{j}\) giving an unbiased estimator of \(\mu\) with minimum variance is
$$ \sum_{j=1}^{n} \sigma_{j}^{-2} Y_{j} \Big/ \sum_{j=1}^{n} \sigma_{j}^{-2}. $$

Suppose now that \(Y_{j}\) is normally distributed with mean \(\beta x_{j}\) and unit variance, and that the \(Y_{j}\) are independent, with \(\beta\) an unknown parameter and the \(x_{j}\) known constants. Which of the estimators
$$ T_{1}=n^{-1} \sum_{j=1}^{n} Y_{j} / x_{j}, \qquad T_{2}=\sum_{j=1}^{n} Y_{j} x_{j} \Big/ \sum_{j=1}^{n} x_{j}^{2} $$
is preferable, and why?
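A sketch of the argument, using a standard Lagrange-multiplier derivation for the first part and then applying its conclusion to the second:

```latex
% Part 1: among linear estimators T = \sum_j a_j Y_j, unbiasedness forces
% \sum_j a_j = 1, and uncorrelatedness gives var(T) = \sum_j a_j^2 \sigma_j^2.
\begin{align*}
\frac{\partial}{\partial a_j}\Bigl[\sum_{k} a_k^2\sigma_k^2
   - \lambda\Bigl(\sum_{k} a_k - 1\Bigr)\Bigr]
   &= 2a_j\sigma_j^2 - \lambda = 0
   \;\Longrightarrow\; a_j \propto \sigma_j^{-2},\\
\text{so, after normalising,}\qquad
a_j &= \frac{\sigma_j^{-2}}{\sum_{k=1}^{n}\sigma_k^{-2}},
\qquad
T = \frac{\sum_{j=1}^{n}\sigma_j^{-2}Y_j}{\sum_{j=1}^{n}\sigma_j^{-2}}.
\end{align*}
% Part 2: Z_j = Y_j/x_j has mean \beta and variance x_j^{-2}, so Part 1 with
% \sigma_j^2 = x_j^{-2} says the optimal weights are proportional to x_j^2,
% which yields exactly T_2. Both estimators are unbiased, and
\begin{align*}
\operatorname{var}(T_1) &= n^{-2}\sum_{j=1}^{n} x_j^{-2}, \qquad
\operatorname{var}(T_2) = \Bigl(\sum_{j=1}^{n} x_j^{2}\Bigr)^{-1},\\
\operatorname{var}(T_2) &\le \operatorname{var}(T_1)
\quad\text{by the Cauchy--Schwarz inequality } 
n^2 \le \Bigl(\sum_j x_j^2\Bigr)\Bigl(\sum_j x_j^{-2}\Bigr),
\end{align*}
% with equality only when all |x_j| are equal; hence T_2 is preferable.
```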
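As a numerical illustration of the comparison between \(T_1\) and \(T_2\), the following Monte Carlo check compares their analytic variances, \(\operatorname{var}(T_1)=n^{-2}\sum_j x_j^{-2}\) and \(\operatorname{var}(T_2)=1/\sum_j x_j^2\), against simulated ones. The particular values of \(\beta\), the \(x_j\), and the replication count are arbitrary choices for illustration, not part of the problem.

```python
import numpy as np

# Illustrative setup (arbitrary values): Y_j ~ N(beta * x_j, 1), independent.
rng = np.random.default_rng(0)
beta, n = 2.0, 5
x = np.array([0.5, 1.0, 1.5, 2.0, 3.0])

# Analytic variances of the two estimators.
var_T1 = np.sum(1.0 / x**2) / n**2   # var(T1) = n^-2 * sum(1/x_j^2)
var_T2 = 1.0 / np.sum(x**2)          # var(T2) = 1 / sum(x_j^2)

# Monte Carlo: simulate many samples and form both estimators each time.
reps = 200_000
Y = rng.normal(beta * x, 1.0, size=(reps, n))
T1 = (Y / x).mean(axis=1)
T2 = (Y @ x) / np.sum(x**2)

print(f"var(T1): analytic {var_T1:.4f}, simulated {T1.var():.4f}")
print(f"var(T2): analytic {var_T2:.4f}, simulated {T2.var():.4f}")
```

Both estimators are unbiased, but the simulated variance of \(T_2\) matches the smaller analytic value, consistent with \(T_2\) being the minimum-variance choice.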