
Chapter 2

Rules of calculus.

2.1 Superalgebras.

A (commutative associative) superalgebra is a vector space

A = A_even ⊕ A_odd

with a given direct sum decomposition into even and odd pieces, and a map

A × A → A

which is bilinear, satisfies the associative law for multiplication, and

A_even × A_even → A_even

A_even × A_odd → A_odd

A_odd × A_even → A_odd

A_odd × A_odd → A_even

ωσ = σω if either ω or σ is even,

ωσ = -σω if both ω and σ are odd.

We write these last two conditions as

ωσ = (-1)^{(deg σ)(deg ω)} σω.

Here deg τ = 0 if τ is even, and deg τ = 1 if τ is odd, the degree being taken (mod 2).

2.2 Differential forms.

A linear differential form on a manifold, M, is a rule which assigns to each p ∈ M a linear function on TM_p. So a linear differential form, ω, assigns to each p an element of TM_p^*. We will, as usual, only consider linear differential forms which are smooth.


The superalgebra Ω(M) is the superalgebra generated by the smooth functions on M (taken as even) and by the linear differential forms, taken as odd.

Multiplication of differential forms is usually denoted by ∧. The number of differential factors is called the degree of the form. So functions have degree zero, and linear differential forms have degree one.

In terms of local coordinates, the most general linear differential form has an expression as a_1 dx_1 + ... + a_n dx_n (where the a_i are functions). Expressions of the form

a_{12} dx_1 ∧ dx_2 + a_{13} dx_1 ∧ dx_3 + ... + a_{n-1,n} dx_{n-1} ∧ dx_n

have degree two (and are even). Notice that the multiplication rules require

dx_i ∧ dx_j = -dx_j ∧ dx_i

and, in particular, dx_i ∧ dx_i = 0. So the most general sum of products of two linear differential forms is a differential form of degree two, and can be brought to the above form, locally, after collection of coefficients. Similarly, the most general differential form of degree k ≤ n on an n-dimensional manifold is a sum, locally, with function coefficients, of expressions of the form

dx_{i_1} ∧ ... ∧ dx_{i_k},   i_1 < ... < i_k.

There are (n choose k) such expressions, and they are all even if k is even, and odd if k is odd.
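
The bookkeeping in this normalization (reordering the dx's into increasing order while keeping track of signs, and discarding monomials with a repeated factor) is easy to mechanize. The following small Python sketch, my own illustration rather than part of the text, normalizes a wedge monomial dx_{i_1} ∧ ... ∧ dx_{i_k} given by its index sequence:

    # Normalize a wedge monomial dx_{i1} ∧ ... ∧ dx_{ik}:
    # return (sign, sorted indices), or (0, ()) if an index repeats,
    # since dx_i ∧ dx_i = 0 and dx_i ∧ dx_j = -dx_j ∧ dx_i.
    def normalize(indices):
        idx = list(indices)
        sign = 1
        # bubble sort, flipping the sign once per adjacent transposition
        for i in range(len(idx)):
            for j in range(len(idx) - 1 - i):
                if idx[j] > idx[j + 1]:
                    idx[j], idx[j + 1] = idx[j + 1], idx[j]
                    sign = -sign
        if len(set(idx)) < len(idx):      # a repeated dx_i kills the monomial
            return 0, ()
        return sign, tuple(idx)

    print(normalize((2, 1)))      # (-1, (1, 2)):   dx2 ∧ dx1 = -dx1 ∧ dx2
    print(normalize((3, 1, 2)))   # (1, (1, 2, 3)): an even permutation
    print(normalize((1, 1)))      # (0, ()):        dx1 ∧ dx1 = 0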

2.3 The d operator.

There is a linear operator d acting on differential forms called exterior differentiation, which is completely determined by the following rules: It satisfies Leibniz' rule in the "super" form

d(ωσ) = (dω)σ + (-1)^{deg ω} ω(dσ).

On functions it is given by

df = (∂f/∂x_1) dx_1 + ... + (∂f/∂x_n) dx_n

and, finally,

d(dx_i) = 0.

Since functions and the dx_i generate, this determines d completely. For example, on linear differential forms

ω = a_1 dx_1 + ... + a_n dx_n


we have

dω = da_1 ∧ dx_1 + ... + da_n ∧ dx_n

   = (∂a_1/∂x_1 dx_1 + ... + ∂a_1/∂x_n dx_n) ∧ dx_1 + ... + (∂a_n/∂x_1 dx_1 + ... + ∂a_n/∂x_n dx_n) ∧ dx_n

   = (∂a_2/∂x_1 - ∂a_1/∂x_2) dx_1 ∧ dx_2 + ... + (∂a_n/∂x_{n-1} - ∂a_{n-1}/∂x_n) dx_{n-1} ∧ dx_n.

In particular, equality of mixed derivatives shows that d²f = 0, and hence (since d² satisfies the even Leibniz rule and vanishes on the generators, namely on functions and on the dx_i) that d²ω = 0 for any differential form. Hence the rules to remember about d are:

d(ωσ) = (dω)σ + (-1)^{deg ω} ω(dσ)

d² = 0

df = (∂f/∂x_1) dx_1 + ... + (∂f/∂x_n) dx_n.
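
These rules are easy to test in coordinates. The sympy sketch below (an illustration of mine; the function f and the coefficients a_1, a_2 are arbitrary placeholders, and everything is done in the plane for brevity) checks that d(df) = 0 follows from the equality of mixed partials, and reproduces the antisymmetric coefficient of dω for a general 1-form:

    import sympy as sp

    x, y = sp.symbols('x y')
    f  = sp.Function('f')(x, y)          # an arbitrary smooth function
    a1 = sp.Function('a1')(x, y)         # coefficients of ω = a1 dx + a2 dy
    a2 = sp.Function('a2')(x, y)

    # df = f_x dx + f_y dy, as a coefficient list [coeff of dx, coeff of dy]
    df = [sp.diff(f, x), sp.diff(f, y)]

    # d of a 1-form b1 dx + b2 dy is (∂b2/∂x - ∂b1/∂y) dx∧dy
    def d_oneform(b1, b2):
        return sp.diff(b2, x) - sp.diff(b1, y)

    print(sp.simplify(d_oneform(*df)))   # 0, i.e. d(df) = 0
    print(d_oneform(a1, a2))             # ∂a2/∂x - ∂a1/∂y, the coefficient of dx∧dy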

2.4 Derivations.

A linear operator ∂ : A → A is called an odd derivation if, like d, it satisfies

∂ : A_even → A_odd,   ∂ : A_odd → A_even

and

∂(ωσ) = (∂ω)σ + (-1)^{deg ω} ω(∂σ).

A linear map ∂ : A → A,

∂ : A_even → A_even,   ∂ : A_odd → A_odd

satisfying

∂(ωσ) = (∂ω)σ + ω(∂σ)

is called an even derivation. So the Leibniz rule for derivations, even or odd, is

∂(ωσ) = (∂ω)σ + (-1)^{(deg ∂)(deg ω)} ω(∂σ).

Knowing the action of a derivation on a set of generators of a superalgebra determines it completely. For example, the equations

d(x_i) = dx_i,   d(dx_i) = 0   for all i

imply that

dp = (∂p/∂x_1) dx_1 + ... + (∂p/∂x_n) dx_n

for any polynomial p, and hence determine the value of d on any differential form with polynomial coefficients. The local formula we gave for df, where f is any


differentiable function, was just the natural extension (by continuity, if you like) of the above formula for polynomials.

The sum of two even derivations is an even derivation, and the sum of two odd derivations is an odd derivation.

The composition of two derivations will not, in general, be a derivation, but an instructive computation from the definitions shows that the commutator

[∂_1, ∂_2] := ∂_1∂_2 - (-1)^{(deg ∂_1)(deg ∂_2)} ∂_2∂_1

is again a derivation which is even if both are even or both are odd, and odd if one is even and the other odd.
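
For instance, when ∂_1 and ∂_2 are both odd, so that [∂_1, ∂_2] = ∂_1∂_2 + ∂_2∂_1, the computation runs as follows (with ω homogeneous of degree deg ω). Applying the odd Leibniz rule twice, and remembering that ∂_2ω has degree deg ω + 1, gives

∂_1∂_2(ωσ) = (∂_1∂_2ω)σ + (-1)^{deg ω + 1}(∂_2ω)(∂_1σ) + (-1)^{deg ω}(∂_1ω)(∂_2σ) + ω(∂_1∂_2σ),

and the same expression with the indices 1 and 2 interchanged. Adding the two, the middle terms cancel in pairs, leaving

[∂_1, ∂_2](ωσ) = ([∂_1, ∂_2]ω)σ + ω([∂_1, ∂_2]σ),

which is the Leibniz rule for an even derivation. The remaining cases (both even, or one even and one odd) are handled in exactly the same way.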

A derivation followed by a multiplication is again a derivation: specifically, let ∂ be a derivation (even or odd) and let τ be an even or odd element of A. Consider the map

ω ↦ τ∂ω.

We have

τ∂(ωσ) = (τ∂ω)σ + (-1)^{(deg ∂)(deg ω)} τω(∂σ)

        = (τ∂ω)σ + (-1)^{(deg ∂ + deg τ)(deg ω)} ω(τ∂σ)

so ω ↦ τ∂ω is a derivation whose degree is

deg τ + deg ∂.

2.5 Pullback.

Let φ : M → N be a smooth map. Then the pullback map φ* is a linear map that sends differential forms on N to differential forms on M and satisfies

φ*(ωσ) = (φ*ω)(φ*σ)

φ*dω = dφ*ω

φ*f = f ∘ φ.

The first two equations imply that φ* is completely determined by what it does on functions. The last equation says that on functions, φ* is given by "substitution": In terms of local coordinates on M and on N, φ is given by

φ(x_1, ..., x_m) = (y_1, ..., y_n)

y_i = φ_i(x_1, ..., x_m),   i = 1, ..., n

where the φ_i are smooth functions. The local expression for the pullback of a function f(y_1, ..., y_n) is obtained by substituting the φ_i for the y_i in the expression for f, so as to obtain a function of the x's.

It is important to observe that the pullback on differential forms is defined for any smooth map, not merely for diffeomorphisms. This is the great advantage of the calculus of differential forms.
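
As a concrete illustration (the map φ, the coordinates u, v on M and y_1, y_2 on N, and the function f below are all made up for the check, they are not from the text), here is a sympy verification of the substitution rule together with the compatibility φ*(df) = d(φ*f):

    import sympy as sp

    u, v, y1, y2 = sp.symbols('u v y1 y2')

    # an illustrative choice of f on N and of the map φ : M → N:
    # φ(u, v) = (y1, y2) = (u*v, u + v),   f(y1, y2) = y1**2 * y2 + sin(y2)
    f = y1**2 * y2 + sp.sin(y2)
    phi = {y1: u*v, y2: u + v}

    pull_f = f.subs(phi)                                  # φ*f = f ∘ φ, by substitution

    # φ*(df): substitute into the coefficients of df = f_{y1} dy1 + f_{y2} dy2 and
    # replace dy1 by d(u*v) = v du + u dv, and dy2 by d(u+v) = du + dv
    pull_df_du = sp.diff(f, y1).subs(phi)*v + sp.diff(f, y2).subs(phi)*1
    pull_df_dv = sp.diff(f, y1).subs(phi)*u + sp.diff(f, y2).subs(phi)*1

    # compare with d(φ*f) computed directly on M: both differences vanish
    print(sp.simplify(pull_df_du - sp.diff(pull_f, u)))   # 0
    print(sp.simplify(pull_df_dv - sp.diff(pull_f, v)))   # 0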


2.6 Chain rule.

Suppose that ψ : N → P is a smooth map, so that the composition

ψ ∘ φ : M → P

is again smooth. Then the chain rule says

(ψ ∘ φ)* = φ* ∘ ψ*.

On functions this is essentially a tautology: it is the associativity of composition, f ∘ (ψ ∘ φ) = (f ∘ ψ) ∘ φ. But since pull-back is completely determined by what it does on functions, the chain rule applies to differential forms of any degree.

2.7 Lie derivative.

Let φ_t be a one parameter group of transformations of M. If ω is a differential form, we get a family of differential forms, φ_t*ω, depending differentiably on t, and so we can take the derivative at t = 0:

(d/dt)(φ_t*ω)|_{t=0} = lim_{t→0} (1/t)[φ_t*ω - ω].

Since φ_t*(ωσ) = (φ_t*ω)(φ_t*σ), it follows from the Leibniz argument that

D_φ : ω ↦ (d/dt)(φ_t*ω)|_{t=0}

is an even derivation. We want a formula for this derivation.

Notice that since φ_t*d = dφ_t* for all t, it follows by differentiation that

D_φ d = d D_φ

and hence the formula for D_φ is completely determined by how it acts on functions.

Let X be the vector field generating φ_t. Recall that the geometrical significance of this vector field is as follows: If we fix a point x, then

t ↦ φ_t(x)

is a curve which passes through the point x at t = 0. The tangent to this curve at t = 0 is the vector X(x). In terms of local coordinates, X has coordinates X = (X_1, ..., X_n) where X_i(x) is the derivative of φ_i(t, x_1, ..., x_n) with respect to t at t = 0. The chain rule then gives, for any function f,

D_φ f = (d/dt) f(φ_1(t, x_1, ..., x_n), ..., φ_n(t, x_1, ..., x_n))|_{t=0}

      = X_1 ∂f/∂x_1 + ... + X_n ∂f/∂x_n.


For this reason we use the notation

X = X_1 ∂/∂x_1 + ... + X_n ∂/∂x_n

so that the differential operator

f ↦ Xf

gives the action of D_φ on functions.

As we mentioned, this action of D_φ on functions determines it completely. In particular, D_φ depends only on the vector field X, so we may write

D_φ = L_X

where L_X is the even derivation determined by

L_X f = Xf,   L_X d = dL_X.
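
As a check of these two descriptions (the rotation flow and the test function below are illustrative choices, not from the text), take the one parameter group φ_t(x, y) = (x cos t - y sin t, x sin t + y cos t) on R², whose generator is X = -y ∂/∂x + x ∂/∂y:

    import sympy as sp

    t, x, y = sp.symbols('t x y')

    def f(a, b):                              # an arbitrary test function (illustrative)
        return a**2 * b + sp.exp(b)

    # rotation flow φ_t(x, y) and its infinitesimal generator X = -y ∂/∂x + x ∂/∂y
    phit = (x*sp.cos(t) - y*sp.sin(t), x*sp.sin(t) + y*sp.cos(t))

    # L_X f as (d/dt) f(φ_t(x, y)) evaluated at t = 0 ...
    lie_from_flow = sp.diff(f(*phit), t).subs(t, 0)

    # ... and as the first-order differential operator X applied to f
    Xf = -y*sp.diff(f(x, y), x) + x*sp.diff(f(x, y), y)

    print(sp.simplify(lie_from_flow - Xf))    # 0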

2.8 Weil's formula.

But we want a more explicit formula for L_X. For this it is useful to introduce an odd derivation associated to X called the interior product and denoted by i(X). It is defined as follows: First consider the case where

X = ∂/∂x_j

and define its interior product by

i(∂/∂x_j)f = 0

for all functions while

i(∂/∂x_j)dx_k = 0,   k ≠ j

and

i(∂/∂x_j)dx_j = 1.

The fact that it is a derivation then gives an easy rule for calculating i(∂/∂x_j) when applied to any differential form: Write the differential form as

ω + dx_j ∧ σ

where the expressions for ω and σ do not involve dx_j. Then

i(∂/∂x_j)[ω + dx_j ∧ σ] = σ.


The operator

X_j i(∂/∂x_j)

which means first apply i(∂/∂x_j) and then multiply by the function X_j, is again an odd derivation, and so we can make the definition

i(X) := X_1 i(∂/∂x_1) + ... + X_n i(∂/∂x_n).   (2.1)

It is easy to check that this does not depend on the local coordinate system used.

Notice that we can write

Xf = i(X)df.

In particular we have

L_X dx_j = d(L_X x_j)

         = dX_j

         = d i(X)dx_j.

We can combine these two formulas as follows: Since i(X)f=0 for any function f we have

L_X f = d i(X)f + i(X)df.

Since d(dx_j) = 0 we have

L_X dx_j = d i(X)dx_j + i(X)d(dx_j).

Hence

L_X = d i(X) + i(X)d = [d, i(X)]   (2.2)

when applied to functions or to the forms dx_j. But the right hand side of the preceding equation is an even derivation, being the commutator of two odd derivations. So if the left and right hand sides agree on functions and on the differential forms dx_j, they agree everywhere. This equation, (2.2), known as Weil's formula, is a basic formula in differential calculus.
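
Weil's formula can also be verified directly in two variables. The sympy sketch below uses an encoding of my own (forms on R² as coefficient lists, with arbitrary coefficients a_1, a_2 and components X_1, X_2, none of which are from the text); it computes (d i(X) + i(X)d)θ for a general 1-form θ and compares it with L_X θ obtained from the rules L_X f = Xf, L_X dx_j = dX_j and the even derivation property:

    import sympy as sp

    x, y = sp.symbols('x y')
    a1, a2 = sp.Function('a1')(x, y), sp.Function('a2')(x, y)   # θ = a1 dx + a2 dy
    X1, X2 = sp.Function('X1')(x, y), sp.Function('X2')(x, y)   # X = X1 ∂/∂x + X2 ∂/∂y

    # exterior derivative of a function, and of a 1-form (coefficient of dx∧dy)
    def d_fun(f):        return [sp.diff(f, x), sp.diff(f, y)]
    def d_oneform(th):   return sp.diff(th[1], x) - sp.diff(th[0], y)

    # interior products with X: i(X)θ is a function, i(X)(b dx∧dy) = b(X1 dy - X2 dx)
    def iX_oneform(th):  return X1*th[0] + X2*th[1]
    def iX_twoform(b):   return [-b*X2, b*X1]

    theta = [a1, a2]

    # right hand side of Weil's formula: (d i(X) + i(X) d) θ
    cartan = [c + e for c, e in zip(d_fun(iX_oneform(theta)), iX_twoform(d_oneform(theta)))]

    # L_X θ from the rules in the text: L_X(a_j dx_j) = (X a_j) dx_j + a_j dX_j
    def Xf(f):           return X1*sp.diff(f, x) + X2*sp.diff(f, y)
    lie = [Xf(a1) + a1*sp.diff(X1, x) + a2*sp.diff(X2, x),
           Xf(a2) + a1*sp.diff(X1, y) + a2*sp.diff(X2, y)]

    print([sp.simplify(c - l) for c, l in zip(cartan, lie)])    # [0, 0]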

We can use the interior product to consider differential forms of degree k as k-multilinear functions on the tangent space at each point. To illustrate, let σ be a differential form of degree two. Then for any vector field, X, i(X)σ is a linear differential form, and hence can be evaluated on any vector field, Y, to produce a function. So we define

σ(X, Y) := [i(X)σ](Y).


We can use this to express the exterior derivative in terms of ordinary derivatives and the Lie bracket: If θ is a linear differential form, we have

dθ(X, Y) = [i(X)dθ](Y)

i(X)dθ = L_X θ - d(i(X)θ)

d(i(X)θ)(Y) = Y[θ(X)]

[L_X θ](Y) = L_X[θ(Y)] - θ(L_X(Y))

           = X[θ(Y)] - θ([X, Y])

where we have introduced the notation L_X Y =: [X, Y] which is legitimate since on functions we have

(L_X Y)f = L_X(Yf) - Y(L_X f) = X(Yf) - Y(Xf)

so L_X Y as an operator on functions is exactly the commutator of X and Y. (See below for a more detailed geometrical interpretation of L_X Y.) Putting the previous pieces together gives

dθ(X, Y) = Xθ(Y) - Yθ(X) - θ([X, Y])   (2.3)

with similar expressions for differential forms of higher degree.
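
Formula (2.3) can also be checked symbolically in the plane (the coefficient functions t_1, t_2 and the components of X and Y below are arbitrary, introduced only for the check and not taken from the text):

    import sympy as sp

    x, y = sp.symbols('x y')
    t1, t2 = sp.Function('t1')(x, y), sp.Function('t2')(x, y)   # θ = t1 dx + t2 dy
    X1, X2 = sp.Function('X1')(x, y), sp.Function('X2')(x, y)   # components of X
    Y1, Y2 = sp.Function('Y1')(x, y), sp.Function('Y2')(x, y)   # components of Y

    def apply_field(V1, V2, f):                  # Vf = V1 ∂f/∂x + V2 ∂f/∂y
        return V1*sp.diff(f, x) + V2*sp.diff(f, y)

    # dθ(X, Y), with dθ = (∂t2/∂x - ∂t1/∂y) dx∧dy and (dx∧dy)(X, Y) = X1 Y2 - X2 Y1
    lhs = (sp.diff(t2, x) - sp.diff(t1, y)) * (X1*Y2 - X2*Y1)

    # [X, Y] as the commutator of the two operators, component by component
    bx = apply_field(X1, X2, Y1) - apply_field(Y1, Y2, X1)
    by = apply_field(X1, X2, Y2) - apply_field(Y1, Y2, X2)

    # X θ(Y) - Y θ(X) - θ([X, Y])
    rhs = (apply_field(X1, X2, t1*Y1 + t2*Y2)
           - apply_field(Y1, Y2, t1*X1 + t2*X2)
           - (t1*bx + t2*by))

    print(sp.simplify(lhs - rhs))                # 0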

2.9 Integration.

Let

ω = f dx_1 ∧ ... ∧ dx_n

be a form of degree n on R^n. (Recall that the most general differential form of degree n is an expression of this type.) Then its integral is defined by

∫_M ω := ∫_M f dx_1 ... dx_n

where M is any (measurable) subset. This, of course, is subject to the condition that the right hand side converges if M is unbounded. There is a lot of hidden subtlety built into this definition having to do with the notion of orientation. But for the moment this is a good working definition.

The change of variables formula says that if φ : M → R^n is a smooth map which is one-to-one and whose Jacobian determinant is everywhere positive, then

∫_M φ*ω = ∫_{φ(M)} ω.
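
As an illustration (a made-up computation, not from the text), take ω = dx ∧ dy, M = [0, 1] × [0, 2π] with coordinates (r, t), and φ(r, t) = (r cos t, r sin t), so that φ(M) is the unit disc. Then φ*ω = r dr ∧ dt, and the left hand side of the formula gives the area π of the disc:

    import sympy as sp

    r, t = sp.symbols('r t')

    # φ(r, t) = (r cos t, r sin t); the Jacobian determinant gives φ*(dx∧dy) = r dr∧dt
    jac = sp.Matrix([[sp.cos(t), -r*sp.sin(t)],
                     [sp.sin(t),  r*sp.cos(t)]]).det()
    print(sp.simplify(jac))                                   # r

    # ∫_M φ*ω over [0,1] × [0,2π] equals the area of the unit disc φ(M)
    print(sp.integrate(jac, (r, 0, 1), (t, 0, 2*sp.pi)))      # pi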

2.10 Stokes theorem.

Let U be a region in R^n with a chosen orientation and smooth boundary. We then orient the boundary according to the rule that an outward pointing normal


vector, together with a positive frame on the boundary, gives a positive frame in R^n. If σ is an (n-1)-form, then

∫_{∂U} σ = ∫_U dσ.
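
In the plane this is Green's theorem, and a quick sympy check is possible (an illustration of mine, not from the text): take U to be the unit disc and σ = x dy, so that dσ = dx ∧ dy, and parametrize the boundary circle by (cos t, sin t):

    import sympy as sp

    t, r, th = sp.symbols('t r th')

    # boundary integral: pull x dy back to the circle x = cos t, y = sin t
    boundary = sp.integrate(sp.cos(t)*sp.diff(sp.sin(t), t), (t, 0, 2*sp.pi))

    # interior integral: dσ = dx∧dy, written in polar coordinates as r dr∧dth
    interior = sp.integrate(r, (r, 0, 1), (th, 0, 2*sp.pi))

    print(boundary, interior)        # pi pi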

A manifold is called orientable if we can choose an atlas consisting of charts such that the Jacobian of the transition maps φ_α ∘ φ_β^{-1} is always positive. Such a choice of an atlas is called an orientation. (Not all manifolds are orientable.) If we have chosen an orientation, then relative to the charts of our orientation, the transition laws for an n-form (where n = dim M) and for a density are the same. In other words, given an orientation, we can identify densities with n-forms and n-forms with densities. Thus we may integrate n-forms. The change of variables formula then holds for orientation preserving diffeomorphisms, as does Stokes theorem.

2.11 Lie derivatives of vector fields.

Let Y be a vector field and φ_t a one parameter group of transformations whose "infinitesimal generator" is some other vector field X. We can consider the "pulled back" vector field φ_t*Y defined by

(φ_t*Y)(x) = dφ_{-t}{Y(φ_t x)}.

In words, we evaluate the vector field Y at the point φ_t(x), obtaining a tangent vector at φ_t(x), and then apply the differential of the (inverse) map φ_{-t} to obtain a tangent vector at x.

If we differentiate the one parameter family of vector fields φ_t*Y with respect to t and set t = 0 we get a vector field which we denote by L_X Y:

L_X Y := (d/dt)(φ_t*Y)|_{t=0}.

If ω is a linear differential form, then we may compute i(Y)ω, which is a function whose value at any point is obtained by evaluating the linear function ω(x) on the tangent vector Y(x). Thus

i(φ_t*Y)(φ_t*ω)(x) = ⟨dφ_t^*ω(φ_t x), dφ_{-t}Y(φ_t x)⟩ = ⟨ω(φ_t x), Y(φ_t x)⟩ = {i(Y)ω}(φ_t x).

In other words,

φ_t*{i(Y)ω} = i(φ_t*Y)φ_t*ω.

We have verified this when ω is a differential form of degree one. It is trivially true when ω is a differential form of degree zero, i.e. a function, since then both sides are zero. But then, by the derivation property, we conclude that it is true for forms of all degrees. We may rewrite the result in shorthand form as

φ_t* i(Y) = i(φ_t*Y) φ_t*.


Since φ_t*d = dφ_t* we conclude from Weil's formula that

φ_t* L_Y = L_{φ_t*Y} φ_t*.

Until now the subscript t was superfluous, the formulas being true for any fixed diffeomorphism. Now we differentiate the preceding equations with respect to t and set t = 0. We obtain, using Leibniz's rule,

L_X i(Y) = i(L_X Y) + i(Y) L_X

and

L_X L_Y = L_{L_X Y} + L_Y L_X.

This last equation says that the Lie derivative (on forms) with respect to the vector field L_X Y is just the commutator of L_X with L_Y:

L_{L_X Y} = [L_X, L_Y].

For this reason we write

[X, Y] := L_X Y

and call it the Lie bracket (or commutator) of the two vector fields X and Y. The equation for interior product can then be written as

i([X, Y]) = [L_X, i(Y)].

The Lie bracket is antisymmetric in X and Y. We may multiply Y by a function g to obtain a new vector field gY. From the definitions we have

φ_t*(gY) = (φ_t*g)(φ_t*Y).

Differentiating at t = 0 and using Leibniz's rule we get

[X, gY] = (Xg)Y + g[X, Y]   (2.4)

where we use the alternative notation Xg for L_X g. The antisymmetry then implies that for any differentiable function f we have

[fX, Y] = -(Yf)X + f[X, Y].   (2.5)

From this equation and from Weil's formula (applied to differential forms of degree greater than zero) we see that the Lie derivative with respect to X at a point x depends on more than the value of the vector field X at x.
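
Equation (2.4) can be verified by treating the vector fields as first order differential operators acting on a test function (all component and coefficient functions below are arbitrary, introduced only for this check):

    import sympy as sp

    x, y = sp.symbols('x y')
    X1, X2 = sp.Function('X1')(x, y), sp.Function('X2')(x, y)
    Y1, Y2 = sp.Function('Y1')(x, y), sp.Function('Y2')(x, y)
    g, f   = sp.Function('g')(x, y), sp.Function('f')(x, y)

    def act(V1, V2, h):                       # the operator V1 ∂/∂x + V2 ∂/∂y applied to h
        return V1*sp.diff(h, x) + V2*sp.diff(h, y)

    def bracket(A1, A2, B1, B2, h):           # [A, B]h = A(Bh) - B(Ah)
        return act(A1, A2, act(B1, B2, h)) - act(B1, B2, act(A1, A2, h))

    lhs = bracket(X1, X2, g*Y1, g*Y2, f)                                  # [X, gY]f
    rhs = act(X1, X2, g)*act(Y1, Y2, f) + g*bracket(X1, X2, Y1, Y2, f)    # (Xg)Yf + g[X,Y]f

    print(sp.simplify(lhs - rhs))             # 0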

2.12 Jacobi's identity.

From the fact that [X, Y] acts as the commutator of X and Y it follows that for any three vector fields X, Y and Z we have

[X, [Y, Z]] + [Z, [X, Y]] + [Y, [Z, X]] = 0.
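
With the same operator encoding as above (arbitrary component functions, in two variables for brevity, all introduced only for the check), sympy confirms the identity when the three iterated brackets are applied to a test function:

    import sympy as sp

    x, y = sp.symbols('x y')
    comp = lambda name: (sp.Function(name + '1')(x, y), sp.Function(name + '2')(x, y))
    X, Y, Z = comp('X'), comp('Y'), comp('Z')
    f = sp.Function('f')(x, y)

    def act(V, h):                        # V1 ∂h/∂x + V2 ∂h/∂y
        return V[0]*sp.diff(h, x) + V[1]*sp.diff(h, y)

    def bracket(A, B):                    # components of [A, B]: A(B^i) - B(A^i)
        return (act(A, B[0]) - act(B, A[0]), act(A, B[1]) - act(B, A[1]))

    jacobi = (act(bracket(X, bracket(Y, Z)), f)
              + act(bracket(Z, bracket(X, Y)), f)
              + act(bracket(Y, bracket(Z, X)), f))

    print(sp.simplify(jacobi))            # 0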