Chapter 8: Problem 17
Consider the straight-line regression model \(y_{j}=\alpha+\beta x_{j}+\sigma \varepsilon_{j}\), \(j=1, \ldots, n\). Suppose that \(\sum x_{j}=0\) and that the \(\varepsilon_{j}\) are independent with means zero, variances \(v\), and common density \(f(\cdot)\).

(a) Write down the variance of the least squares estimate of \(\beta\).

(b) Show that if \(\sigma\) is known, the log likelihood for the data is $$ \ell(\alpha, \beta)=-n \log \sigma+\sum_{j=1}^{n} \log f\left(\frac{y_{j}-\alpha-\beta x_{j}}{\sigma}\right). $$ Derive the expected information matrix for \(\alpha\) and \(\beta\), and show that the asymptotic variance of the maximum likelihood estimate of \(\beta\) can be written as \(\sigma^{2} /\left(i \sum x_{j}^{2}\right)\), where $$ i=\mathrm{E}\left\{-\frac{d^{2} \log f(\varepsilon)}{d \varepsilon^{2}}\right\}. $$ Hence show that the least squares estimate of \(\beta\) has asymptotic relative efficiency \((iv)^{-1} \times 100 \%\).

(c) Show that the cumulant-generating function of the Gumbel distribution, \(f(u)=\exp \{-u-\exp (-u)\}\), \(-\infty<u<\infty\), is \(K(t)=\log \Gamma(1-t)\), \(t<1\), and hence find the asymptotic relative efficiency of the least squares estimate of \(\beta\) when \(f\) is Gumbel.
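The part (b) calculation can be sketched as follows (our outline, not the book's worked solution; \(g=\log f\) is shorthand we introduce). Writing \(\varepsilon_{j}=(y_{j}-\alpha-\beta x_{j})/\sigma\), two differentiations of \(\ell\) give $$ \frac{\partial^{2} \ell}{\partial \alpha^{2}}=\frac{1}{\sigma^{2}} \sum_{j} g^{\prime\prime}(\varepsilon_{j}), \qquad \frac{\partial^{2} \ell}{\partial \alpha\, \partial \beta}=\frac{1}{\sigma^{2}} \sum_{j} x_{j}\, g^{\prime\prime}(\varepsilon_{j}), \qquad \frac{\partial^{2} \ell}{\partial \beta^{2}}=\frac{1}{\sigma^{2}} \sum_{j} x_{j}^{2}\, g^{\prime\prime}(\varepsilon_{j}), $$ so taking expectations with \(i=\mathrm{E}\{-g^{\prime\prime}(\varepsilon)\}\) and using \(\sum x_{j}=0\), $$ I(\alpha, \beta)=\frac{i}{\sigma^{2}}\begin{pmatrix} n & \sum x_{j} \\ \sum x_{j} & \sum x_{j}^{2} \end{pmatrix}=\frac{i}{\sigma^{2}}\begin{pmatrix} n & 0 \\ 0 & \sum x_{j}^{2} \end{pmatrix}. $$ The matrix is diagonal, so the asymptotic variance of \(\hat{\beta}\) is the reciprocal of the \((2,2)\) entry, namely \(\sigma^{2}/(i \sum x_{j}^{2})\).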
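The quantities in parts (a)–(c) can also be checked numerically. Below is a small pure-Python Monte Carlo sketch, not part of the original problem: the sampler `gumbel`, the design points, sample sizes, and tolerances are all our own illustrative choices.

```python
# Monte Carlo checks of parts (a)-(c) under the Gumbel error model.
# A sketch with hand-picked sample sizes and tolerances, not a solution.
import math
import random

random.seed(1)

def gumbel():
    # Standard Gumbel via inversion: if W ~ U(0,1), then -log(-log W)
    # has CDF exp(-exp(-u)), i.e. density exp{-u - exp(-u)}.
    return -math.log(-math.log(random.random()))

# Part (c): the Gumbel CGF is K(t) = log Gamma(1 - t), so the variance
# is K''(0) = pi^2/6; check by central finite differences of lgamma.
h = 1e-4
K = lambda t: math.lgamma(1.0 - t)
v_from_cgf = (K(h) - 2.0 * K(0.0) + K(-h)) / h**2
v = math.pi**2 / 6
assert abs(v_from_cgf - v) < 1e-4

# Part (b): for the Gumbel, -d^2 log f(e)/de^2 = exp(-e), and exp(-e)
# is Exp(1) when e is standard Gumbel, so i = E[exp(-e)] = 1.
i_hat = sum(math.exp(-gumbel()) for _ in range(200_000)) / 200_000
assert abs(i_hat - 1.0) < 0.02

# Part (a): with sum(x_j) = 0, Var(beta_LS) = sigma^2 v / sum(x_j^2).
n, sigma, alpha, beta = 20, 1.0, 1.0, 2.0
xs = [j - (n - 1) / 2 for j in range(n)]   # centred: sum(xs) == 0
Sxx = sum(x * x for x in xs)
reps = 20_000
bhats = []
for _ in range(reps):
    ys = [alpha + beta * x + sigma * gumbel() for x in xs]
    bhats.append(sum(x * y for x, y in zip(xs, ys)) / Sxx)
m = sum(bhats) / reps
var_hat = sum((b - m) ** 2 for b in bhats) / reps
assert abs(var_hat / (sigma**2 * v / Sxx) - 1.0) < 0.1

# Asymptotic relative efficiency of least squares: 1/(i*v) = 6/pi^2.
are = 1.0 / (1.0 * v)
print(round(are, 3))   # -> 0.608
```

Since \(i=1\) and \(v=\pi^{2}/6\) for the Gumbel, least squares retains roughly \(61\%\) asymptotic efficiency relative to maximum likelihood in this model.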