The precise (\( \varepsilon \)-\( \delta \)) definition of a limit makes the informal idea of "approaching a value" exact. For a function \( f(x) \), saying that the limit of \( f(x) \) as \( x \) approaches \( a \) is \( L \) means that no matter how tight a band we draw around \( L \) (its half-width given by \( \varepsilon \)), we can find a correspondingly small range around \( a \) (its half-width given by \( \delta \)) such that whenever \( x \) lies in that range, with \( x \neq a \), the value \( f(x) \) stays within \( \varepsilon \) of \( L \).
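In symbols, the definition reads:
\[
\lim_{x \to a} f(x) = L \quad \Longleftrightarrow \quad \forall \varepsilon > 0 \ \exists \delta > 0 : \; 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.
\]
Note the condition \( 0 < |x - a| \): the value of \( f \) at \( a \) itself (if it is even defined there) plays no role in the limit.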
For instance, to verify that \( \lim_{x \to 2} (x^2 + 3x) = 10 \), we must show that \( x^2 + 3x \) can be made as close to 10 as desired by taking \( x \) sufficiently close to 2. Concretely: given any \( \varepsilon > 0 \), we must produce a corresponding \( \delta > 0 \) so that whenever \( x \) lies in the interval \( (2 - \delta, 2 + \delta) \) with \( x \neq 2 \), the value \( f(x) = x^2 + 3x \) remains confined within \( (10 - \varepsilon, 10 + \varepsilon) \). This quantitative formulation is what makes limit arguments rigorous throughout calculus.
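To make the example concrete, here is one standard way to choose \( \delta \) (other choices work equally well). Factoring the difference,
\[
|x^2 + 3x - 10| = |x - 2|\,|x + 5|.
\]
If we first insist that \( \delta \le 1 \), then \( |x - 2| < \delta \) forces \( 1 < x < 3 \), so \( |x + 5| < 8 \) and hence
\[
|x^2 + 3x - 10| < 8\,|x - 2| < 8\delta.
\]
Taking \( \delta = \min\!\left(1, \tfrac{\varepsilon}{8}\right) \) therefore guarantees \( |f(x) - 10| < \varepsilon \), completing the proof.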