In the context of maximum likelihood estimation, the likelihood function plays a pivotal role. It is a function of the parameter \( \theta \) given the observed data. For independent observations, the likelihood function is the product of the probability density functions (pdfs) evaluated at each data point. This product measures how probable the observed data are under a specific value of \( \theta \).
- If \( \theta = 1 \), the likelihood function is built from the standard normal distribution: \( L_1(\theta = 1) = \prod_{i=1}^{n} \frac{1}{\sqrt{2 \pi}} e^{-X_i^{2}/2} \).
- If \( \theta = 2 \), the likelihood function uses the standard Cauchy distribution: \( L_2(\theta = 2) = \prod_{i=1}^{n} \frac{1}{\pi(1 + X_i^2)} \).
The goal is to determine which likelihood function is larger given the data points we have. This involves evaluating each function at the observed sample and selecting the value of \( \theta \) whose model fits our observations best. In practice, the log-likelihood is usually compared instead of the raw product, since multiplying many small densities quickly underflows floating-point arithmetic.
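The comparison above can be sketched in a few lines of Python. This is a minimal illustration, not code from the source: the function names and the sample `data` are hypothetical, and log-likelihoods are summed rather than multiplying raw densities to avoid numerical underflow.

```python
import math

def log_likelihood_normal(xs):
    # Standard normal: log f(x) = -x^2/2 - (1/2) log(2*pi)
    return sum(-x**2 / 2 - 0.5 * math.log(2 * math.pi) for x in xs)

def log_likelihood_cauchy(xs):
    # Standard Cauchy: log f(x) = -log(pi * (1 + x^2))
    return sum(-math.log(math.pi * (1 + x**2)) for x in xs)

# Hypothetical sample, for illustration only
data = [0.2, -1.1, 0.5, 3.0, -0.3]

l1 = log_likelihood_normal(data)   # log L_1 at theta = 1
l2 = log_likelihood_cauchy(data)   # log L_2 at theta = 2

# The maximum likelihood estimate picks whichever model scores higher
theta_hat = 1 if l1 > l2 else 2
```

Because the Cauchy distribution has much heavier tails, samples containing extreme values tend to favor \( \theta = 2 \), while data concentrated near zero tend to favor the normal model.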