Chapter 5: Problem 6
Consider the equation $$\begin{aligned} y &= \beta_{0}+\beta_{1} x+\beta_{2} x^{2}+u \\ \mathrm{E}(u \mid x) &= 0 \end{aligned}$$ where the explanatory variable \(x\) has a standard normal distribution in the population. In particular, \(\mathrm{E}(x)=0\), \(\mathrm{E}\left(x^{2}\right)=\operatorname{Var}(x)=1\), and \(\mathrm{E}\left(x^{3}\right)=0\). This last condition holds because the standard normal distribution is symmetric about zero. We want to study what we can say about the OLS estimator of \(\beta_{1}\) if we omit \(x^{2}\) and compute the simple regression estimator of the intercept and slope.
i. Show that we can write $$y=a_{0}+\beta_{1} x+v$$ where \(\mathrm{E}(v)=0\). In particular, find \(v\) and the new intercept, \(a_{0}\).
ii. Show that \(\mathrm{E}(v \mid x)\) depends on \(x\) unless \(\beta_{2}=0\).
iii. Show that \(\operatorname{Cov}(x, v)=0\).
iv. If \(\hat{\beta}_{1}\) is the slope coefficient from regressing \(y_{i}\) on \(x_{i}\), is \(\hat{\beta}_{1}\) consistent for \(\beta_{1}\)? Is it unbiased? Explain.
v. Argue that being able to estimate \(\beta_{1}\) has some value in the following sense: \(\beta_{1}\) is the partial effect of \(x\) on \(y\) evaluated at \(x=0\), the average value of \(x\).
vi. Explain why being able to consistently estimate \(\beta_{1}\) and \(\beta_{2}\) is more valuable than just estimating \(\beta_{1}\).
Short Answer
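In brief (a sketch of the standard omitted-variable argument, using only the moments stated in the problem): the model can be rewritten as $$y=\underbrace{(\beta_{0}+\beta_{2})}_{a_{0}}+\beta_{1} x+\underbrace{\beta_{2}(x^{2}-1)+u}_{v},$$ so \(\mathrm{E}(v)=0\) and \(\operatorname{Cov}(x, v)=0\), while \(\mathrm{E}(v \mid x)=\beta_{2}(x^{2}-1)\) depends on \(x\) unless \(\beta_{2}=0\). The simple regression slope is therefore consistent for \(\beta_{1}\) but not, in general, unbiased. Since the partial effect of \(x\) on \(\mathrm{E}(y \mid x)\) is \(\beta_{1}+2\beta_{2} x\), \(\beta_{1}\) is the effect at the mean value \(x=0\); estimating \(\beta_{2}\) as well gives the effect at every other value of \(x\).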
Step by step solution
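A sketch of the algebra for each part, using only the moment conditions given above (\(\mathrm{E}(x)=0\), \(\mathrm{E}(x^{2})=1\), \(\mathrm{E}(x^{3})=0\), and \(\mathrm{E}(u \mid x)=0\)):

i. Because \(\mathrm{E}(x^{2})=1\), add and subtract \(\beta_{2}\) in the model: $$y=(\beta_{0}+\beta_{2})+\beta_{1} x+\big[\beta_{2}(x^{2}-1)+u\big]=a_{0}+\beta_{1} x+v,$$ with \(a_{0}=\beta_{0}+\beta_{2}\) and \(v=\beta_{2}(x^{2}-1)+u\). Then \(\mathrm{E}(v)=\beta_{2}\big[\mathrm{E}(x^{2})-1\big]+\mathrm{E}(u)=0\).

ii. Conditioning on \(x\), $$\mathrm{E}(v \mid x)=\beta_{2}(x^{2}-1)+\mathrm{E}(u \mid x)=\beta_{2}(x^{2}-1),$$ which is a nonconstant function of \(x\) whenever \(\beta_{2}\neq 0\).

iii. Since \(\mathrm{E}(v)=0\), $$\operatorname{Cov}(x, v)=\mathrm{E}(x v)=\beta_{2}\big[\mathrm{E}(x^{3})-\mathrm{E}(x)\big]+\mathrm{E}(x u)=0,$$ where \(\mathrm{E}(x u)=\mathrm{E}\big[x\,\mathrm{E}(u \mid x)\big]=0\) by the law of iterated expectations.

iv. Consistency of the simple regression slope requires only zero covariance between the regressor and the error: $$\operatorname{plim}\hat{\beta}_{1}=\beta_{1}+\frac{\operatorname{Cov}(x, v)}{\operatorname{Var}(x)}=\beta_{1},$$ so \(\hat{\beta}_{1}\) is consistent. Unbiasedness, however, relies on the zero conditional mean assumption \(\mathrm{E}(v \mid x)=0\), which fails by part (ii) unless \(\beta_{2}=0\); hence \(\hat{\beta}_{1}\) is not, in general, unbiased.

v. The partial effect of \(x\) on \(\mathrm{E}(y \mid x)=\beta_{0}+\beta_{1} x+\beta_{2} x^{2}\) is $$\frac{\partial\,\mathrm{E}(y \mid x)}{\partial x}=\beta_{1}+2\beta_{2} x,$$ which equals \(\beta_{1}\) at \(x=0=\mathrm{E}(x)\). A consistent estimate of \(\beta_{1}\) therefore measures the marginal effect of \(x\) at its average value.

vi. Because the partial effect \(\beta_{1}+2\beta_{2} x\) varies with \(x\), knowing \(\beta_{1}\) alone describes the effect only at \(x=0\). Consistent estimates of both \(\beta_{1}\) and \(\beta_{2}\) allow the effect to be evaluated at any value of \(x\), for example \(\beta_{1}+2\beta_{2}\) at one standard deviation above the mean.

To build intuition for part (iv), here is a minimal simulation sketch (hypothetical parameter values \(\beta_{0}=1\), \(\beta_{1}=0.5\), \(\beta_{2}=0.8\); only NumPy is assumed) showing the simple regression slope settling near \(\beta_{1}\) as \(n\) grows even though \(x^{2}\) is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, beta2 = 1.0, 0.5, 0.8  # hypothetical true parameters

for n in (100, 10_000, 1_000_000):
    x = rng.standard_normal(n)                # x ~ N(0, 1): E(x)=0, E(x^2)=1, E(x^3)=0
    u = rng.standard_normal(n)                # satisfies E(u | x) = 0
    y = beta0 + beta1 * x + beta2 * x**2 + u  # true quadratic model
    # Simple regression of y on x (x^2 omitted): slope = sample Cov(x, y) / Var(x)
    slope = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    print(f"n = {n:>9,}   slope estimate = {slope:.4f}   (true beta1 = {beta1})")
```

In any single sample the estimate deviates from \(\beta_{1}\), and by part (iv) it is not unbiased, but the deviation shrinks as the sample size grows, consistent with \(\operatorname{plim}\hat{\beta}_{1}=\beta_{1}\).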