.: limits :.

     In mathematics, the concept of a "limit" is used to describe the behavior of a function as its argument either gets close to some point or grows arbitrarily large, or the behavior of a sequence's elements as their index increases indefinitely. Limits are used in calculus to define derivatives and continuity.

Suppose f(x) is a function. The statement

   lim f(x) = L
   x --> c

means that f(x) can be made as close to L as desired by making x sufficiently close to c. In that case, we say that "the limit of f of x, as x approaches c, is L."
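
To make this concrete, here is a minimal numerical sketch in Python, using an assumed example function f(x) = x*x with c = 3 (so the limit L is 9); the choice of function and point is illustrative only, not part of the definition above:

   # Numerical sketch of  lim f(x) = L  as  x --> c,
   # using the assumed example f(x) = x*x, c = 3, L = 9.
   def f(x):
       return x * x

   c, L = 3, 9
   for step in (0.1, 0.01, 0.001, 0.0001):
       x = c + step   # pick x closer and closer to c
       print(f"x = {x:<7}  f(x) = {f(x):.6f}  |f(x) - L| = {abs(f(x) - L):.6f}")

The printed gap |f(x) - L| shrinks toward zero as x closes in on c, which is exactly what the limit statement expresses.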

     A concept related to the limit as x approaches some finite number is the limit as x approaches positive or negative infinity. This does not literally mean that the difference between x and infinity becomes small, since infinity is not a real number; rather, it means that x either grows without bound in the positive direction (positive infinity) or in the negative direction (negative infinity).

Suppose f(x) = 2x/(x+1).

  • f(100)   = 1.9802
  • f(1000)  = 1.9980
  • f(10000) = 1.9998

As x becomes arbitrarily large, the value of f(x) approaches 2, and f(x) can be made as close to 2 as desired simply by picking x sufficiently large. In this case, we say that the limit of f(x) as x approaches infinity is 2.
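
The same trend can be checked directly. A minimal Python sketch, using the f(x) = 2x/(x+1) from above plus one extra, even larger input added purely to extend the pattern, prints how close f(x) gets to 2:

   # f(x) = 2x/(x+1): as x grows without bound, f(x) approaches 2.
   def f(x):
       return 2 * x / (x + 1)

   for x in (100, 1000, 10000, 1000000):   # 1000000 added to show the trend continuing
       print(f"f({x}) = {f(x):.4f}   distance from 2 = {2 - f(x):.6f}")

Each additional factor of ten in x cuts the remaining distance from 2 by roughly a factor of ten, matching the values listed above.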