If you took calculus in school, you probably remember it as a blurry mishmash of baffling equations, difficult algebra, many seemingly irrelevant “real-world” examples, and overall confusion. The blame for this is usually laid squarely on the academic establishment, which is certainly at fault, but there is more to it than that.

What is calculus?

The word “calculus” is actually a generic word for a mathematical system. There are lots of other calculi; computer science has the lambda, pi, and join calculi. So why do we call Newton’s calculus “calculus” and not something more specific? Looking at history, you find that it used to be referred to as “infinitesimal calculus.” This is an accurate and appropriate name, because calculus is the foundation of studying rates of change and summations over time. These are things you need in Newtonian physics, because you’re interested in saying: if I have this function of position, what is its rate of change, in other words, the velocity? Alternately, if I have this function of velocity, what is the total distance traveled?

Newton conceived infinitesimal calculus as merely the tool he needed to solve certain general physics problems like the ones above. Velocity is defined as change in position over a length of time (plus a direction), but there could be such a thing as an instantaneous velocity; everybody with a car is familiar with this notion, it’s your speedometer. The total distance, likewise, is the sum of all of the individual distances moved, instant to instant, along this function.
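In modern notation (using s for position and v for velocity, names chosen here just for illustration), the two questions look like this:

    v(t) = \frac{ds}{dt} \qquad \text{(the derivative: the instantaneous rate of change of position)}

    s(t_1) - s(t_0) = \int_{t_0}^{t_1} v(t)\,dt \qquad \text{(the integral: the total distance accumulated)}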

The tools provided, the derivative and the integral, happen to be incredibly general. This leads to the first problem of calculus instruction: lack of clarity about what calculus is. Frankly, this is a lot like the confusion around monads in Haskell. They’re not a difficult concept; they’re actually quite simple. So simple that the applications are unlimited. So then you get unlimited examples, which wind up creating more confusion. Instead of teaching the math, we teach metaphors, and we pretend that distilling the commonality between six metaphors is easier than just understanding the math.

The second problem with calculus instruction is that we no longer use infinitesimals! No wonder the name has fallen out of use; how hard must it be to understand infinitesimal calculus without any infinitesimals in it!

Now, Newton and Leibniz managed to make great use of calculus, but they never improved on their own informal definition of what an infinitesimal was. This kind of thing leads to great discomfort in mathematicians. Other mathematicians (Bolzano, Cauchy) found a way to skirt the issue by defining the limit formally without using infinitesimals. From there, all the rest of calculus can be rebuilt. Thanks to Karl Weierstrass, this is the formulation used by textbooks today: a formulation without actual infinitesimals, but retaining Leibniz’s notation that uses them.
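For reference, the Weierstrass-style formulation defines the limit with the familiar epsilon-delta statement, no infinitesimals required:

    \lim_{x \to a} f(x) = L \iff \forall \varepsilon > 0 \; \exists \delta > 0 : 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon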

It would probably be impossible to understand or use calculus without an intuitive grasp of infinitesimals, yet all the calculus textbooks instead use the epsilon-delta definition of the limit as their basis. This leads to absurdities like learning the formal definition of the limit after two weeks of computing limits with various formulas and heuristics, which my wife is enduring now. Furthermore, the restatements of the derivative and the integral in terms of limits are less intuitive. This should be obvious, since Newton and Leibniz started with derivatives and integrals respectively, not limits, which are completely tangential to all of the things they actually wanted to calculate. The end result is that students are expected to perform derivatives and integrals using a formalization that bears no resemblance to the intuition they’re being taught, the very intuition they need to understand what they’re doing. And the book must also include tons of “real world examples” in order to get into print and into schools.
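Concretely, these are the limit-based restatements students are handed (the Riemann-sum form of the integral is one standard textbook presentation among several):

    f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

    \int_a^b f(x)\,dx = \lim_{n \to \infty} \sum_{i=1}^{n} f(x_i)\,\Delta x, \qquad \Delta x = \frac{b-a}{n}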

Infinitesimal calculus with infinitesimals

The one fundamental problem was that Newton and Leibniz depended on an informal tool to build their formal system. The solution Bolzano found was to eliminate the use of that tool. There is another solution, found in the 20th century by Abraham Robinson, which is to formalize the concept of the infinitesimal.

Robinson defines a new set of numbers, the hyperreals: the real numbers extended with infinitesimals. An infinitesimal is a number greater than zero but less than every positive real. He then provides algebraic laws for manipulating infinitesimals: an infinitesimal plus an infinitesimal is still infinitesimal, an infinitesimal times an infinitesimal is still infinitesimal, and so on. He then defines a relation, “infinite closeness,” which can be used to discard infinitesimals at the end of a calculation.
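A sketch of the rules, in my own paraphrase rather than Robinson’s formal construction: if \varepsilon is a positive infinitesimal and r is any positive real, then

    0 < \varepsilon < r, \qquad \varepsilon + \varepsilon = 2\varepsilon \text{ (still infinitesimal)}, \qquad \varepsilon \cdot \varepsilon = \varepsilon^2 \text{ (still infinitesimal)}

    x \approx y \iff x - y \text{ is infinitesimal}

Textbook treatments make “discarding infinitesimals” precise with a standard part function st(x), the unique real number infinitely close to a finite hyperreal x.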

After learning this fairly simple and somewhat intuitive notion, you can proceed directly to taking derivatives, just like Newton did. You can then proceed to taking integrals, the inverse operation, by summing an infinite number of infinitesimally thin strips to calculate the area under a curve. In both cases, you’ve really just learned a small extension to algebra, and you do calculus by doing algebraic expansions. You’ve built up the two fundamental operations of calculus without introducing the limit or the clumsy reformulation of those operations in terms of it.
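For example, here is the derivative of f(x) = x^2 done the infinitesimal way; it is nothing but algebra, plus one discard step at the end:

    \frac{f(x + \varepsilon) - f(x)}{\varepsilon} = \frac{(x + \varepsilon)^2 - x^2}{\varepsilon} = \frac{2x\varepsilon + \varepsilon^2}{\varepsilon} = 2x + \varepsilon \approx 2x

Discarding the leftover infinitesimal (taking the standard part) gives f'(x) = 2x.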

How does this help me?

If you’re interested in knowing calculus, rather than just passing the class, I suggest you read the book Elementary Calculus: An Infinitesimal Approach (or buy the new edition). This book follows exactly this approach. I’m going through it now myself, and I’m finding the process really enlightening. There is also research suggesting that the infinitesimal approach leads to greater comprehension of the material.

This method is called “non-standard calculus,” and it is considered heretical by some mathematicians, notably constructivists and intuitionists. The details of their interpretation of mathematics are interesting but not particularly relevant here, apart from the one detail that constructivism is related to computability. In any case, I see that Juha Ruokolainen has developed a constructivist version of nonstandard analysis (even without infinity) that addresses their complaints; and besides, constructivists are themselves considered heretical by other mathematicians.

Moreover, learning the infinitesimal approach gives you a general problem-solving tool that can extend beyond calculus. I have a hard time imagining what such an application might look like, but I find the idea very appealing, in part because nobody has given me seventy-five “real world” examples. But you get the gist.