Here are a few useful relationships related to the covariance of two random
variables, \(x_{1}\) and \(x_{2}\)
a. Show that \(\operatorname{Cov}\left(x_{1}, x_{2}\right)=E\left(x_{1} x_{2}\right)-E\left(x_{1}\right) E\left(x_{2}\right)\).
An important implication of this result is that if \(\operatorname{Cov}\left(x_{1}, x_{2}\right)=0\), then \(E\left(x_{1} x_{2}\right)=E\left(x_{1}\right) E\left(x_{2}\right)\). That is, the
expected value of a product of two random variables is the product of these
variables' expected values.
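A sketch of the derivation, starting from the definition of covariance and using the linearity of the expectations operator:
\[
\begin{aligned}
\operatorname{Cov}\left(x_{1}, x_{2}\right) &= E\left[\left(x_{1}-E\left(x_{1}\right)\right)\left(x_{2}-E\left(x_{2}\right)\right)\right] \\
&= E\left(x_{1} x_{2}\right)-E\left(x_{1}\right) E\left(x_{2}\right)-E\left(x_{2}\right) E\left(x_{1}\right)+E\left(x_{1}\right) E\left(x_{2}\right) \\
&= E\left(x_{1} x_{2}\right)-E\left(x_{1}\right) E\left(x_{2}\right).
\end{aligned}
\]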
b. Show that
\[
\operatorname{Var}\left(a x_{1}+b x_{2}\right)=a^{2} \operatorname{Var}\left(x_{1}\right)+b^{2} \operatorname{Var}\left(x_{2}\right)+2 a b \operatorname{Cov}\left(x_{1}, x_{2}\right)
\]
c. In Problem \(2.15 \mathrm{d}\) we looked at the variance of \(X=k x_{1}+(1-k) x_{2}\), \(0 \leq k \leq 1\). Is the conclusion that this variance is minimized for
\(k=0.5\) changed by considering cases where \(\operatorname{Cov}\left(x_{1},
x_{2}\right) \neq 0 ?\)
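A first-order condition makes the answer easy to check. Assuming, as in Problem \(2.15 \mathrm{d}\), that \(\operatorname{Var}\left(x_{1}\right)=\operatorname{Var}\left(x_{2}\right)=\sigma^{2}\), and writing \(\sigma_{12}=\operatorname{Cov}\left(x_{1}, x_{2}\right)\), the result in part b gives
\[
\operatorname{Var}(X)=k^{2} \sigma^{2}+(1-k)^{2} \sigma^{2}+2 k(1-k) \sigma_{12}.
\]
Differentiating with respect to \(k\) and setting the result to zero yields \(2 k \sigma^{2}-2(1-k) \sigma^{2}+2(1-2 k) \sigma_{12}=0\), which is satisfied at \(k=0.5\) for any value of \(\sigma_{12}\). Hence, provided the two variances are equal, the conclusion is unchanged by a nonzero covariance.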
d. The correlation coefficient between two random variables is defined as
\[
\operatorname{Corr}\left(x_{1}, x_{2}\right)=\frac{\operatorname{Cov}\left(x_{1}, x_{2}\right)}{\sqrt{\operatorname{Var}\left(x_{1}\right) \operatorname{Var}\left(x_{2}\right)}}
\]
Explain why \(-1 \leq \operatorname{Corr}\left(x_{1}, x_{2}\right) \leq 1\) and
provide some intuition for this result.
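One standard argument uses the fact that a variance can never be negative. For any real number \(t\), the result in part b implies
\[
0 \leq \operatorname{Var}\left(x_{1}+t x_{2}\right)=\operatorname{Var}\left(x_{1}\right)+t^{2} \operatorname{Var}\left(x_{2}\right)+2 t \operatorname{Cov}\left(x_{1}, x_{2}\right).
\]
Because this quadratic in \(t\) never falls below zero, its discriminant must satisfy \(4 \operatorname{Cov}\left(x_{1}, x_{2}\right)^{2}-4 \operatorname{Var}\left(x_{1}\right) \operatorname{Var}\left(x_{2}\right) \leq 0\), which rearranges to \(\left|\operatorname{Corr}\left(x_{1}, x_{2}\right)\right| \leq 1\).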
e. Suppose that the random variable \(y\) is related to the random variable \(x\)
by the linear equation \(y=\alpha+\beta x\). Show that
\[
\beta=\frac{\operatorname{Cov}(y, x)}{\operatorname{Var}(x)}
\]
Here \(\beta\) is sometimes called the (theoretical) regression coefficient of
\(y\) on \(x\). With actual data, the sample analog of this expression is the
ordinary least squares (OLS) regression coefficient.
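A sketch of the argument, using the facts that adding a constant does not affect covariance and that \(\operatorname{Cov}(x, x)=\operatorname{Var}(x)\):
\[
\operatorname{Cov}(y, x)=\operatorname{Cov}(\alpha+\beta x, x)=\beta \operatorname{Cov}(x, x)=\beta \operatorname{Var}(x),
\]
and dividing both sides by \(\operatorname{Var}(x)\) gives the result.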