The limit of a function is a fundamental concept in calculus. When we talk about the limit of a function as it approaches a particular point, we are interested in what value the function gets closer to as the input approaches that point. In mathematical terms, we write this as \( \lim_{x \to a} f(x) = L \), which means that as \( x \) approaches \( a \), the values of \( f(x) \) get arbitrarily close to \( L \).
In the context of the Squeeze Theorem, evaluating limits becomes a strategic process. The given exercise involves using this theorem to find the limit of \( h(x) = x \cos \frac{1}{x} \) as \( x \) approaches 0. Because \( \cos \frac{1}{x} \) oscillates infinitely often near 0, the limit cannot be evaluated by direct substitution; instead, the Squeeze Theorem allows us to sandwich \( h(x) \) between two simpler functions whose limits we can easily find, thereby determining the limit of \( h(x) \) itself.
- The Squeeze Theorem is especially useful when a function is difficult to evaluate directly because it oscillates or behaves unpredictably near the point of interest.
- Both bounding functions, in this case \( y = |x| \) and \( y = -|x| \), converge to the same limit (0) as \( x \to 0 \).
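To see why these functions bound \( h(x) \): since \( \left| \cos \frac{1}{x} \right| \le 1 \) for every \( x \neq 0 \), multiplying through by \( |x| \) gives

\[
-|x| \;\le\; x \cos \frac{1}{x} \;\le\; |x| \qquad \text{for } x \neq 0,
\]

and \( \lim_{x \to 0} \bigl(-|x|\bigr) = \lim_{x \to 0} |x| = 0 \), so both bounds share the limit 0.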
Therefore, by the Squeeze Theorem, \( \lim_{x \to 0} h(x) = 0 \).
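As an informal numerical sanity check (a sketch, not part of the exercise itself), we can evaluate \( h(x) \) at inputs shrinking toward 0 and confirm that each value stays squeezed between \( -|x| \) and \( |x| \):

```python
import math

def h(x):
    # h(x) = x * cos(1/x), defined for x != 0
    return x * math.cos(1 / x)

for n in range(1, 6):
    x = 10 ** -n
    value = h(x)
    # The squeeze bounds: -|x| <= h(x) <= |x|
    assert -abs(x) <= value <= abs(x)
    print(f"x = {x:.0e}, h(x) = {value: .3e}, bound |x| = {x:.0e}")
```

The printed values shrink toward 0 along with the bound \( |x| \), illustrating (though of course not proving) the limit.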