Differential Memories

Not everyone who attempts (or is admonished) to experience the joy of differential equations is granted an exceptional math teacher from whom to learn. Indeed, most of my forays into post-calculus mathematics in college and graduate school (thus far) have burgeoned with insignificant variables. That is to say, all those variables seem without significance!

Engulfed in a maelstrom of tensors, Wronskians, simple poles, matrix inverses, curls of curls, and logarithmic jetsam too confounding to name, I have longed for the time or clarity with which to stand back, take a deep breath, and honestly ascertain what is so damn interesting about differential equations and their solutions.

So, after a long hiatus full of independent floundering, I asked my dad, Val Veirs, for some perspective. Without further ado, here are his differential memories...


Differential Equations that I have known and loved:

A. First Order Rate Equations

-- The joy and the applicability of exponential functions --

dX/dt = r X

has an awful lot going for it -- starting from population growth and radioactive decay and extending to linearized models of countless forms.
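As a quick illustration, here is a minimal Python sketch of the closed-form solution X(t) = X0 exp(r t), using radioactive decay as the example (the function name and setup are my own, not part of the original notes):

```python
import math

# Closed-form solution of dX/dt = r*X: X(t) = X0 * exp(r*t).
# r > 0 gives population growth; r < 0 gives radioactive decay.
def exponential(x0, r, t):
    return x0 * math.exp(r * t)

# Decay example: the rate constant relates to the half-life by
# r = -ln(2) / t_half.  Carbon-14's half-life is about 5730 years.
t_half = 5730.0
r = -math.log(2) / t_half
print(exponential(1.0, r, t_half))  # half of the original amount remains
```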

Put a couple of these together and you have all of Harte's Spherical Cow examples of chemical and ecological systems.

Toss in a population limiting term and you have growth in a finite renewable environment via the Logistic equation:

dX/dt = r X (1 - X)

which grows (or decays) exponentially when X << 1 and limits its growth when X approaches 1. And here you have a nonlinear differential equation to have fun with!
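The logistic equation happens to have a closed-form solution too, and it's instructive to check it against a brute-force numerical integration. A small Python sketch (the function names, initial condition, and step count are my own choices):

```python
import math

def logistic_exact(x0, r, t):
    # Closed-form solution of dX/dt = r*X*(1 - X)
    return x0 * math.exp(r * t) / (1.0 - x0 + x0 * math.exp(r * t))

def logistic_euler(x0, r, t, steps=100000):
    # Simple forward-Euler integration, for comparison
    dt = t / steps
    x = x0
    for _ in range(steps):
        x += dt * r * x * (1.0 - x)
    return x

# Starting well below 1, growth is nearly exponential, then saturates:
print(logistic_exact(0.01, 1.0, 10.0))  # close to 1
```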

A nice extension is two interacting predator-prey populations modelled via the Lotka-Volterra equations:

dX/dt = a X - b X Y
dY/dt = -c Y + d X Y

where a and c are growth of prey and decay of predator parameters and b and d are some sort of predation factors linking the populations. All kinds of oscillations and interactions appear.
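A quick numerical experiment makes the oscillations visible. This Python sketch integrates the equations with a standard fourth-order Runge-Kutta step (all parameter values are illustrative, not from any particular ecosystem):

```python
def lotka_volterra(x, y, a, b, c, d, dt, steps):
    # Predator-prey equations integrated with 4th-order Runge-Kutta.
    # x: prey, y: predator.  a, c: growth/decay rates; b, d: predation terms.
    def f(x, y):
        return a * x - b * x * y, -c * y + d * x * y
    history = []
    for _ in range(steps):
        k1x, k1y = f(x, y)
        k2x, k2y = f(x + 0.5 * dt * k1x, y + 0.5 * dt * k1y)
        k3x, k3y = f(x + 0.5 * dt * k2x, y + 0.5 * dt * k2y)
        k4x, k4y = f(x + dt * k3x, y + dt * k3y)
        x += dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6.0
        y += dt * (k1y + 2 * k2y + 2 * k3y + k4y) / 6.0
        history.append((x, y))
    return history

traj = lotka_volterra(x=2.0, y=1.0, a=1.0, b=1.0, c=1.0, d=1.0,
                      dt=0.01, steps=2000)
prey = [p for p, _ in traj]
print(min(prey), max(prey))  # prey population cycles rather than settling
```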

Or, turn the logistic equation into a difference equation

X(t+1) = a X(t) (1 - X(t))

and iterate it. All is well if a is near 1 (a = 1.05 gives 5% growth per time step, etc.), but when a is larger than 3, all kinds of strange behavior starts, involving period doubling and chaos. The neatest result from the study of chaos came from looking at the parameter values a(n) at which successive period doublings occur and noticing that lim(n->oo) (a(n) - a(n-1))/(a(n+1) - a(n)) = 4.6692..., a value (Feigenbaum's constant) that is universal for a wide class of nonlinear systems.
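The period doubling is easy to see by iterating the map and recording the values the orbit settles onto. A small Python sketch (the transient length, rounding precision, and sample count are ad hoc choices of mine):

```python
def logistic_map(a, x0=0.5, transient=500, keep=512):
    # Iterate X(t+1) = a*X(t)*(1 - X(t)), discard a transient, then
    # collect the distinct values the orbit visits (rounded so a cycle
    # shows up as a small set).
    x = x0
    for _ in range(transient):
        x = a * x * (1.0 - x)
    orbit = set()
    for _ in range(keep):
        x = a * x * (1.0 - x)
        orbit.add(round(x, 4))
    return sorted(orbit)

print(len(logistic_map(2.8)))  # 1: a stable fixed point
print(len(logistic_map(3.2)))  # 2: period-2 cycle
print(len(logistic_map(3.5)))  # 4: period-4 cycle
print(len(logistic_map(3.9)))  # many distinct values: chaos
```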

Another neat feature of first-order linear differential equations is that they can have complex solutions and thereby bring in oscillatory motions of all kinds.
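For instance, a purely imaginary rate constant in dX/dt = i w X gives the oscillatory solution X(t) = X0 exp(i w t), whose real part is cos(w t). A tiny Python check (the value of w is arbitrary):

```python
import cmath
import math

# dX/dt = i*w*X has solution X(t) = X0 * exp(i*w*t); its real part
# oscillates as cos(w*t) while |X(t)| stays constant.
w = 2.0
X0 = 1.0 + 0.0j
X = lambda t: X0 * cmath.exp(1j * w * t)

print(X(math.pi / w).real)  # cos(pi) = -1
```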


B. Wave Equations

d2W/dt2 = c2 d2W/dx2

(where all those 2 are supposed to be exponents)

This equation has solutions for W which are waving in space and time. Traveling-wave solutions are functions of the single argument (x - c t) or (x + c t), corresponding to waves moving with velocity +c or -c in the x direction. For sinusoidal waves the argument is often written as (k x - w t), where k, termed the wave number, and w (omega), termed the angular frequency, are chosen to scale x and t so that one wavelength and one period each span one cycle: k = 2 pi/wavelength, w = 2 pi/period, and w/k is the wave velocity. The relation between omega and k (here simply w = c k) is called a dispersion relation, and this simplest one satisfies d2w/dk2 = 0.
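It's easy to verify numerically that a sinusoidal traveling wave satisfies the wave equation. This Python sketch compares centered finite-difference second derivatives in t and x (the step size, sample point, and parameter values are arbitrary choices of mine):

```python
import math

# Check numerically that W(x, t) = sin(k*x - w*t), with w = c*k,
# satisfies the wave equation d2W/dt2 = c^2 * d2W/dx2.
c, k = 2.0, 3.0
w = c * k                         # the simple dispersion relation

def W(x, t):
    return math.sin(k * x - w * t)

def d2(f, u, h=1e-4):
    # Centered finite-difference approximation to the second derivative
    return (f(u + h) - 2.0 * f(u) + f(u - h)) / h**2

x0, t0 = 0.7, 0.3                 # an arbitrary sample point
d2W_dt2 = d2(lambda t: W(x0, t), t0)
d2W_dx2 = d2(lambda x: W(x, t0), x0)
print(d2W_dt2, c**2 * d2W_dx2)    # the two sides agree
```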

Exponential functions are very nifty here again as we can see that the left side of our wave equation will equal the right side since exponential functions reproduce themselves under differentiation no matter how many times you differentiate.

The waviness in space brings us oh-so-naturally to Fourier series in as many dimensions as we can manage mathematically and with this comes divs, grads and curls and Laplacians. Separation of variables gives us the chance to separate the space and time parts of waves and find (linear) systems that have pretty spatial patterns that are modulated with a simple sinusoidal time variation.


C. Laplace and Poisson

If we drop one side or the other of the wave equation, we get Laplace's equation

del^2 W = 0

or Poisson's equation

del^2 W = rho(x)

These appear in situations where something is conserved (eqn. of continuity), the flow is incompressible (del dot V = 0), and the flow is irrotational (curl V = 0), so that V can be written as the gradient of a scalar function, W. Then the incompressibility condition becomes del^2 W = 0, to be solved in 2 or 3 dimensions.

Laplace shows up also in electric field situations where a region is devoid of charges and then the electric potential is determined by the boundaries. If sources exist, then these can be handled by defining a position (and time) dependent source density, rho(x,t).

Instead of being initial value problems, here the solution for W is determined by the values of W around the boundary of whatever region is of interest.
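A classic way to see this numerically is Jacobi relaxation: replace each interior grid value by the average of its four neighbors, over and over, with the boundary values held fixed. A minimal Python sketch (grid size, boundary values, and iteration count are arbitrary choices of mine):

```python
# Jacobi relaxation for Laplace's equation del^2 W = 0 on a square grid.
# Boundary values are held fixed (W = 1 on the top edge, W = 0 on the
# other three edges); the interior relaxes toward the solution that
# those boundary values alone determine.
N = 20
W = [[0.0] * N for _ in range(N)]
for j in range(N):
    W[0][j] = 1.0                      # top edge held at 1

for _ in range(2000):                  # plenty of sweeps for a 20x20 grid
    new = [row[:] for row in W]
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            new[i][j] = 0.25 * (W[i-1][j] + W[i+1][j]
                                + W[i][j-1] + W[i][j+1])
    W = new

print(W[N // 2][N // 2])               # interior value, between 0 and 1
```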


D. The Diffusion Equation

One other interesting form appears when the second derivative wrt x is proportional to the first derivative wrt time.

d2W/dx2 = a dW/dt

or, more generally,

Laplacian of W = a dW/dt + rho

The source term may or may not be zero and the time dependent flow of heat or any gradient-dependent flow quantity can be modelled.

Fick's Law of diffusion makes the assumption that the flux of a diffusing substance is proportional to its concentration gradient; combine that with conservation and you get the diffusion equation. A similar argument is made to parameterize eddy diffusion of moisture or heat or whatever as being proportional to a gradient in the particular quantity.
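For a concrete taste, here is the standard explicit finite-difference scheme for the 1-D diffusion equation dW/dt = D d2W/dx2, which is stable as long as D dt/dx^2 <= 1/2 (the parameter values and initial condition below are illustrative choices of mine):

```python
# Explicit finite-difference scheme for the 1-D diffusion equation
# dW/dt = D * d2W/dx2, with W held at 0 at both ends of the rod.
D, dx, dt = 1.0, 0.1, 0.004        # D*dt/dx^2 = 0.4, within the limit
N = 21

# Initial condition: a hot spot in the middle of a cold rod.
W = [0.0] * N
W[N // 2] = 1.0

for _ in range(500):
    new = W[:]
    for i in range(1, N - 1):
        new[i] = W[i] + D * dt / dx**2 * (W[i+1] - 2 * W[i] + W[i-1])
    W = new

print(max(W))  # the spike has spread out and decayed toward zero
```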

*M*O*R*E* to come -- soon