"Even as the finite encloses an infinite series
And in the unlimited limits appear,
So the soul of immensity dwells in minutia
And in the narrowest limits, no limits inhere
What joy to discern the minute in infinity!
The vast to perceive in the small, what Divinity!"
- Jakob Bernoulli
As noted by Varberg, Rigdon, and Purcell in Calculus, Eighth Edition, the calculus is the study, or analysis, of limits. While algebra focuses on operating with the unknown, and geometry defines the relationships of space, calculus defines the imperceptibly small and the infinitely large. So begins our long journey through this new analysis, central to which is the elusive creature known as the differential.
The originators of the calculus are many, but first credit is usually given to Newton, that brilliant scientist and mathematician who wrote the Principia and gave us the concepts of the derivative and the integral, along with the first methods for dealing with infinitely small changes at instants of time. In addition, it is Newton's methods of the calculus that were later rigorously established by the likes of Cauchy, Riemann, and Lagrange.
However, it is from the work of Leibniz that the concept of the differential itself stems. Leibniz worked on the development of the calculus at the same time as Newton, and contributed greatly to mathematical analysis while also writing several notable philosophical works. In particular, Leibniz used this notation for the derivative:
dy/dx   or   d²y/dx²
These are symbols for the first and second derivatives of y with respect to x. The key to the differential, however, lies in the reason for using "dx" or "dy" at all. Recall that an average rate of change can be represented this way:
Δy/Δx
The beauty of Leibniz's notation lies in recognizing that a derivative is simply a ratio of infinitely small changes. Thus, rather than writing "Δx", we simply write "dx", with the d signifying an infinitely small change. This infinitely small change is called the differential of x. So, what does this give us? According to the theorems of the calculus, we cannot simply multiply and divide by differentials. We must instead apply the two new operations of calculus: differentiation and integration. Differentiation, or "taking the derivative", takes an expression and turns one of its variables into a differential, modifying the expression accordingly. For example, when we differentiate with respect to x,
4x⁷
becomes
28x⁶ dx.
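Where does the 28 come from? By the power rule, d(xⁿ) = n xⁿ⁻¹ dx, so d(4x⁷) = 4 · 7x⁶ dx = 28x⁶ dx.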
In reverse, the integral is the infinite sum of differentials. Integrating a function multiplied by the differential of its variable reverses the differentiation and gives the antiderivative of that function. This is the heart of the Fundamental Theorem of Calculus. To emphasize: an infinite sum of differentials yields the expression from which the differential was obtained.
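To see this round trip concretely, here is a minimal sketch using the SymPy library (an illustration of my own, not from the text above): diff computes the derivative of 4x⁷, and integrate recovers the original expression, just as the Fundamental Theorem promises.

    from sympy import symbols, diff, integrate

    x = symbols('x')
    expr = 4 * x**7

    # Differentiate: the coefficient of dx in the differential d(4x^7)
    derivative = diff(expr, x)            # 28*x**6

    # Integrate: the infinite sum of differentials recovers the antiderivative
    recovered = integrate(derivative, x)  # 4*x**7 (SymPy omits the constant C)

    print(derivative, recovered)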
However, it is possible to manipulate differentials that already exist in an equation: you cannot add them in at random, but if there is a dx on one side of an equation, you can multiply or divide by it to move it to the other side. This technique is known as separation of variables, and it is used to solve many types of differential equations. Unfortunately, it is not entirely legal. Although it works, and whole fields of study rest on the ability to manipulate the differential, the rigorous formulation of the calculus does not allow differentials to be manipulated in this way. It is, after a fashion, cheating.
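For instance (a standard textbook example, not drawn from the text above), consider the equation dy/dx = ky. Separating variables means treating dx as a quantity we may multiply by: dy/y = k dx. Integrating both sides gives ln|y| = kx + C, and hence y = Ae^(kx). Every step depends on exactly the manipulation described here, and it works, whether or not it is strictly legal.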
The puzzle of the differential will intrigue many for years to come.
Sources
Varberg, Rigdon, and Purcell. Calculus, 8th Edition. Prentice Hall, 1999.
Personal knowledge and understanding.