Calculus means "a method of calculation or reasoning." When one computes the sales tax on a purchase, one employs a simple calculus. When one finds the area of a polygonal shape by breaking it up into a set of triangles, one is using another calculus. Proving a theorem in geometry employs yet another calculus. Despite the wonderful advances in mathematics that had taken place by the first half of the 17th century, mathematicians and scientists were keenly aware of what they could not do. (This is true even today.) In particular, two important concepts eluded mastery by the great thinkers of that time: area and rates of change.
- Area seems innocuous enough; areas of circles, rectangles, parallelograms, etc., are standard topics of study for students today just as they were then. However, the areas of arbitrary shapes could not be computed, even if the boundary of the shape could be described exactly.
- Rates of change were also important. When an object moves at a constant rate, "distance = rate \(\times \) time." But what if the rate is not constant? Can distance still be computed? Or, if distance is known, can we discover the rate of change?
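One way to see how a varying rate can still yield a distance is to pretend the rate is constant over many short time intervals and sum "rate \(\times \) time" over each piece. The sketch below (the rate function is my own example, not one from the text) shows this idea numerically:

```python
# Approximating distance traveled when the rate is NOT constant:
# over a short enough time interval the rate is nearly constant,
# so we apply "distance = rate x time" piece by piece and add up.

def distance(rate, t0, t1, n):
    """Approximate distance traveled from t0 to t1 using n short intervals."""
    dt = (t1 - t0) / n
    total = 0.0
    for i in range(n):
        t = t0 + i * dt          # left endpoint of the i-th interval
        total += rate(t) * dt    # "rate x time" on this short piece
    return total

# Example (mine): rate v(t) = t^2 over [0, 3]; the exact distance is 9.
approx = distance(lambda t: t * t, 0.0, 3.0, 100_000)
print(approx)  # very close to 9
```

As the number of intervals grows, the approximation improves; making this limiting process precise is exactly what the calculus of this chapter develops.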
It turns out that these two concepts were related. Two mathematicians, Sir Isaac Newton and Gottfried Leibniz, are credited with independently formulating a system of computing that solved the above problems and showed how they were connected. Their system of reasoning was "a" calculus. However, as the power and importance of their discovery took hold, it became known to many as "the" calculus. Today, we generally shorten this to discuss "calculus."
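The limit, the tool on which the chapter is built, is often first approached numerically: evaluate a function at inputs ever closer to a point of interest and watch where the outputs head. A minimal sketch (the function is my own example, not one from the text):

```python
# Approximating a limit numerically: f is undefined at x = 2 itself,
# yet its values at nearby inputs suggest the limit there is 4.

def f(x):
    return (x * x - 4) / (x - 2)   # undefined at x = 2

for h in (0.1, 0.01, 0.001, 0.0001):
    print(f(2 + h), f(2 - h))      # both columns approach 4
```

The tabulated values only suggest the limit; the sections below define it precisely and give theorems for computing it exactly.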
- 1.1: An Introduction to Limits
- The foundation of "the calculus" is the limit. It is a tool to describe a particular behavior of a function. This chapter begins our study of the limit by approximating its value graphically and numerically. After a formal definition of the limit, properties are established that make "finding limits" tractable. Once the limit is understood, then the problems of area and rates of change can be approached.
- 1.2: Epsilon-Delta Definition of a Limit
- This section introduces the formal definition of a limit. Many refer to this as the "epsilon-delta" definition, referring to the letters ϵ and δ of the Greek alphabet.
- 1.3: Finding Limits Analytically
- Recognizing that ϵ-δ proofs are cumbersome, this section gives a series of theorems which allow us to find limits much more quickly and intuitively. One of the main results of this section states that many functions that we use regularly behave in a very nice, predictable way. In the next section we give a name to this nice behavior; we label such functions as continuous. Defining that term will require us to look again at what a limit is and what causes limits to not exist.
- 1.4: One Sided Limits
- The previous section gave us tools (which we call theorems) that allow us to compute limits with greater ease. Chief among the results were the facts that polynomials and rational, trigonometric, exponential and logarithmic functions (and their sums, products, etc.) all behave "nicely." In this section we rigorously define what we mean by "nicely."
- 1.5: Continuity
- As we have studied limits, we have gained the intuition that limits measure 'where a function is heading.' We have seen, though, that this is not necessarily a good indicator of what the function actually is. This can be problematic; functions can tend to one value, but attain another. This section focuses on functions that do not exhibit such behavior.
- 1.6: Limits Involving Infinity
- In Definition 1 we stated that in the equation \(\lim_{x\to c} f(x) = L\), both \(c\) and \(L\) were numbers. In this section we relax that definition a bit by considering situations when it makes sense to let \(c\) and/or \(L\) be "infinity."
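Letting the input grow without bound can also be explored numerically before it is defined formally: if the outputs settle toward a single number as \(x\) grows, that number is the limit at infinity. A small sketch (the function is my own example, not one from the text):

```python
# Exploring a limit "at infinity" numerically: as x grows without bound,
# the outputs of f settle toward 2, suggesting the limit is L = 2.

def f(x):
    return (2 * x + 1) / (x + 3)

for x in (10, 100, 1000, 10_000):
    print(x, f(x))   # the second column approaches 2
```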