

The differential coefficient

\[ \frac{\partial^n f}{\partial x^p\,\partial y^q\,\partial z^r}, \]

in which p + q + r = n, is formed by differentiating p times with respect to x, q times with respect to y, r times with respect to z, the differentiations being performed in any order. Abbreviated notations are sometimes used in such forms as

or

Differentials of higher orders are introduced by the defining equation

\[ d^n f = \left( dx\,\frac{\partial}{\partial x} + dy\,\frac{\partial}{\partial y} \right)^{\!n} f, \]

in which the expression (dx·∂/∂x + dy·∂/∂y)ⁿ is developed by the binomial theorem in the same way as if dx·∂/∂x and dy·∂/∂y were numbers, and (∂/∂x)ʳ·(∂/∂y)ⁿ⁻ʳƒ is replaced by ∂ⁿƒ/∂xʳ∂yⁿ⁻ʳ. When there are more than two variables the multinomial theorem must be used instead of the binomial theorem.
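For example, when n = 2 the defining equation gives

\[ d^2 f = \frac{\partial^2 f}{\partial x^2}\,dx^2 + 2\,\frac{\partial^2 f}{\partial x\,\partial y}\,dx\,dy + \frac{\partial^2 f}{\partial y^2}\,dy^2. \]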

The problem of forming the second and higher differential coefficients of implicit functions can be solved at once by means of partial differential coefficients. For example, if ƒ(x, y) = 0 is the equation defining y as a function of x, we have

\[ \frac{dy}{dx} = -\,\frac{\partial f/\partial x}{\partial f/\partial y}, \qquad
\frac{d^2y}{dx^2} = -\,\frac{\dfrac{\partial^2 f}{\partial x^2}\left(\dfrac{\partial f}{\partial y}\right)^{2} - 2\,\dfrac{\partial^2 f}{\partial x\,\partial y}\,\dfrac{\partial f}{\partial x}\,\dfrac{\partial f}{\partial y} + \dfrac{\partial^2 f}{\partial y^2}\left(\dfrac{\partial f}{\partial x}\right)^{2}}{\left(\dfrac{\partial f}{\partial y}\right)^{3}}. \]
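For example, taking ƒ(x, y) = x² + y² − 1, so that y is defined implicitly as a function of x, these formulae give

\[ \frac{dy}{dx} = -\,\frac{x}{y}, \qquad \frac{d^2y}{dx^2} = -\,\frac{x^2 + y^2}{y^3} = -\,\frac{1}{y^3}, \]

in agreement with direct differentiation of y = √(1 − x²).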
The differential expression Xdx + Ydy, in which both X and Y are functions of the two variables x and y, is a total differential if there exists a function ƒ of x and y which is such that

\[ \frac{\partial f}{\partial x} = X, \qquad \frac{\partial f}{\partial y} = Y. \]
When this is the case we have the relation

\[ \frac{\partial X}{\partial y} = \frac{\partial Y}{\partial x}. \qquad \text{(ii.)} \]

Conversely, when this equation is satisfied there exists a function ƒ which is such that

\[ \frac{\partial f}{\partial x} = X, \qquad \frac{\partial f}{\partial y} = Y. \]
The expression Xdx + Ydy in which X and Y are connected by the relation (ii.) is often described as a “perfect differential.” The theory of the perfect differential can be extended to functions of n variables, and in this case there are ½n(n−1) such relations as (ii.).
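By way of illustration, if X = 2xy and Y = x², the relation (ii.) is satisfied, since ∂X/∂y = 2x = ∂Y/∂x, and the expression 2xy·dx + x²·dy is the perfect differential of the function ƒ = x²y; on the other hand, for X = y, Y = −x the relation fails, and y·dx − x·dy is not a total differential.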

In the case of a function of two variables x, y an abbreviated notation is often adopted for differential coefficients. The function being denoted by z, we write

\[ p, \quad q, \quad r, \quad s, \quad t \]

for

\[ \frac{\partial z}{\partial x}, \quad \frac{\partial z}{\partial y}, \quad \frac{\partial^2 z}{\partial x^2}, \quad \frac{\partial^2 z}{\partial x\,\partial y}, \quad \frac{\partial^2 z}{\partial y^2}. \]
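Thus, to take a simple illustration, for z = x²y we have p = 2xy, q = x², r = 2y, s = 2x, t = 0.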
Partial differential coefficients of the second order are important in geometry as expressing the curvature of surfaces. When a surface is given by an equation of the form z = ƒ(x, y), the lines of curvature are determined by the equation

\[ \begin{vmatrix} dy^2 & -\,dx\,dy & dx^2 \\ 1+p^2 & pq & 1+q^2 \\ r & s & t \end{vmatrix} = 0, \]

and the principal radii of curvature are the values of R which satisfy the equation

\[ (rt - s^2)\,R^2 - \left\{(1+q^2)r - 2pqs + (1+p^2)t\right\}(1+p^2+q^2)^{1/2}\,R + (1+p^2+q^2)^2 = 0. \]
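By way of illustration, for the paraboloid of revolution z = ½(x² + y²) at the origin we have p = q = 0, r = t = 1, s = 0; the equation for the principal radii then reduces to R² − 2R + 1 = 0, so that both principal radii of curvature are equal to 1, as the symmetry of the surface about its axis requires.
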
44. The problem of change of variables was first considered by Brook Taylor in his Methodus incrementorum. In the case considered by Taylor y is expressed as a function of z, and z as a function of x, and it is desired to express the differential coefficients of y with respect to x without eliminating z. The result can be obtained at once by the rules for differentiating a product and a function of a function. We have

\[ \frac{dy}{dx} = \frac{dy}{dz}\,\frac{dz}{dx}, \]
\[ \frac{d^2y}{dx^2} = \frac{d^2y}{dz^2}\left(\frac{dz}{dx}\right)^{2} + \frac{dy}{dz}\,\frac{d^2z}{dx^2}, \]

and so on.
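Thus, to take a simple illustration, if y = z² and z = sin x, the formulae give dy/dx = 2z cos x = sin 2x and d²y/dx² = 2 cos²x − 2 sin²x = 2 cos 2x, in agreement with direct differentiation of y = sin²x.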

The introduction of partial differential coefficients enables us to deal with more general cases of change of variables than that considered above. If u, v are new variables, and x, y are connected with them by equations of the type

\[ x = f_1(u, v), \qquad y = f_2(u, v), \qquad \text{(i.)} \]
while y is either an explicit or an implicit function of x, we have the problem of expressing the differential coefficients of various orders of y with respect to x in terms of the differential coefficients of v with respect to u. We have

\[ \frac{dy}{dx} = \left(\frac{\partial f_2}{\partial u} + \frac{\partial f_2}{\partial v}\,\frac{dv}{du}\right) \bigg/ \left(\frac{\partial f_1}{\partial u} + \frac{\partial f_1}{\partial v}\,\frac{dv}{du}\right) \]

by the rule of the total differential. In the same way, by means of differentials of higher orders, we may express d²y/dx², and so on.
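By way of illustration, if x = u + v and y = u − v, the formula gives dy/dx = (1 − dv/du)/(1 + dv/du); in particular, when v is the function u² of u, we find dy/dx = (1 − 2u)/(1 + 2u).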

Equations such as (i.) may be interpreted as effecting a transformation by which a point (u, v) is made to correspond to a point (x, y). The whole theory of transformations, and of functions, or differential expressions, which remain invariant under groups of transformations, has been studied exhaustively by Sophus Lie (see, in particular, his Theorie der Transformationsgruppen, Leipzig, 1888–1893). (See also Differential Equations and Groups).

A more general problem of change of variables is presented when it is desired to express the partial differential coefficients of a function V with respect to x, y, . . . in terms of those with respect to u, v, . . ., where u, v, . . . are connected with x, y, . . . by any functional relations. When there are two variables x, y, and u, v are given functions of x, y, we have

\[ \frac{\partial V}{\partial x} = \frac{\partial V}{\partial u}\,\frac{\partial u}{\partial x} + \frac{\partial V}{\partial v}\,\frac{\partial v}{\partial x}, \qquad \frac{\partial V}{\partial y} = \frac{\partial V}{\partial u}\,\frac{\partial u}{\partial y} + \frac{\partial V}{\partial v}\,\frac{\partial v}{\partial y}, \]
and the differential coefficients of higher orders are to be formed by repeated applications of the rule for differentiating a product and the rules of the type

\[ \frac{\partial}{\partial x} = \frac{\partial u}{\partial x}\,\frac{\partial}{\partial u} + \frac{\partial v}{\partial x}\,\frac{\partial}{\partial v}. \]
When x, y are given functions of u, v, . . . we have, instead of the above, such equations as

\[ \frac{\partial V}{\partial u} = \frac{\partial V}{\partial x}\,\frac{\partial x}{\partial u} + \frac{\partial V}{\partial y}\,\frac{\partial y}{\partial u}, \qquad \frac{\partial V}{\partial v} = \frac{\partial V}{\partial x}\,\frac{\partial x}{\partial v} + \frac{\partial V}{\partial y}\,\frac{\partial y}{\partial v}, \]
and ∂V/∂x, ∂V/∂y can be found by solving these equations, provided the Jacobian ∂(x, y)/∂(u, v) is not zero. The generalization of this method for the case of more than two variables need not detain us.
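For example, taking x = u cos v, y = u sin v (so that u, v are polar coordinates, chosen merely for illustration), these equations become

\[ \frac{\partial V}{\partial u} = \cos v\,\frac{\partial V}{\partial x} + \sin v\,\frac{\partial V}{\partial y}, \qquad \frac{\partial V}{\partial v} = -\,u \sin v\,\frac{\partial V}{\partial x} + u \cos v\,\frac{\partial V}{\partial y}; \]

the Jacobian ∂(x, y)/∂(u, v) is equal to u, and on solving we find

\[ \frac{\partial V}{\partial x} = \cos v\,\frac{\partial V}{\partial u} - \frac{\sin v}{u}\,\frac{\partial V}{\partial v}, \qquad \frac{\partial V}{\partial y} = \sin v\,\frac{\partial V}{\partial u} + \frac{\cos v}{u}\,\frac{\partial V}{\partial v}. \]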

In cases like that here considered it is sometimes more convenient not to regard the equations connecting x, y with u, v as effecting a point transformation, but to consider the loci u = const., v = const. as two “families” of curves. Then in any region of the plane of (x, y) in which the Jacobian ∂(x, y)/∂(u, v) does not vanish or become infinite, any point (x, y) is uniquely determined by the values of u and v which belong to the curves of the two families that pass through the point. Such variables as u, v are then described as “curvilinear coordinates” of the point. This method is applicable to any number of variables. When the loci u = const., . . . intersect each other at right angles, the variables are “orthogonal” curvilinear coordinates. Three-dimensional systems of such coordinates have important applications in mathematical physics. Reference may be made to G. Lamé, Leçons sur les coordonnées curvilignes (Paris, 1859), and to G. Darboux, Leçons sur les coordonnées curvilignes et systèmes orthogonaux (Paris, 1898).

When such a coordinate as u is connected with x and y by a functional relation of the form ƒ(x, y, u) = 0 the curves u = const. are a family of curves, and this family may be such that no two curves of the family have a common point. When this is not the case the points in which a curve ƒ(x, y, u) = 0 is intersected by a curve ƒ(x, y, u + Δu) = 0 tend to limiting positions as Δu is diminished indefinitely. The locus of these limiting positions is the “envelope” of the family, and in general it touches all the curves of the family. It is easy to see that, if u, v are the parameters of two families of curves which have envelopes, the Jacobian ∂(x, y)/∂(u, v) vanishes at all points on these envelopes. It is easy to see also that at any point where the reciprocal Jacobian ∂(u, v)/∂(x, y) vanishes, a curve of the family u touches a curve of the family v.
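For example, for the family of circles ƒ(x, y, u) = (x − u)² + y² − 1 = 0, the intersections of neighbouring circles have the limiting positions given by ∂ƒ/∂u = −2(x − u) = 0, that is x = u, y² = 1; the envelope accordingly consists of the two lines y = 1 and y = −1, each of which touches every circle of the family.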

If three variables x, y, z are connected by a functional relation ƒ(x, y, z) = 0, one of them, z say, may be regarded as an implicit function of the other two, and the partial differential coefficients of z with respect to x and y can be formed by the rule of the total differential. We have

\[ \frac{\partial z}{\partial x} = -\,\frac{\partial f/\partial x}{\partial f/\partial z}, \qquad \frac{\partial z}{\partial y} = -\,\frac{\partial f/\partial y}{\partial f/\partial z}, \]
and there is no difficulty in proceeding to express the higher differential coefficients. There arises the problem of expressing the partial differential coefficients of x with respect to y and z in terms of those of z with respect to x and y. The problem is known as that of “changing the dependent variable.” It is solved by applying the rule of the total differential. Similar considerations are applicable to all cases in which n variables are connected by fewer than n equations.
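For example, for the sphere ƒ = x² + y² + z² − 1 = 0 we have ∂z/∂x = −x/z and ∂z/∂y = −y/z; on changing the dependent variable to x we find, in the same way, ∂x/∂y = −y/x and ∂x/∂z = −z/x.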

45. Taylor’s theorem can be extended to functions of several variables. In the case of two variables the general formula, with a remainder after n terms, can be written most simply in the form

\[ f(a+h,\, b+k) = f(a, b) + \left(h\,\frac{\partial}{\partial x} + k\,\frac{\partial}{\partial y}\right) f(a, b) + \frac{1}{2!}\left(h\,\frac{\partial}{\partial x} + k\,\frac{\partial}{\partial y}\right)^{2} f(a, b) + \ldots + \frac{1}{(n-1)!}\left(h\,\frac{\partial}{\partial x} + k\,\frac{\partial}{\partial y}\right)^{n-1} f(a, b) + R_n, \]
in which