is in $C^2(\mathbb{R}^\ell)$. QED.

THEOREM 5.4 Let $P^t$ be a Markovian semigroup on $C(\mathbb{R}^\ell)$, not necessarily commuting with translations, such that $C^2_{\mathrm{com}}(\mathbb{R}^\ell) \subseteq D(A)$, where $A$ is the infinitesimal generator of $P^t$. If for all $x$ in $\mathbb{R}^\ell$ and all $\varepsilon > 0$

\[
p^t(x, \{y : |y - x| \ge \varepsilon\}) = o(t), \tag{5.5}
\]

24 CHAPTER 5

then

\[
Af(x) = \sum_{i=1}^{\ell} b^i(x) \frac{\partial}{\partial x^i} f(x) + \sum_{i,j=1}^{\ell} a^{ij}(x) \frac{\partial^2}{\partial x^i \partial x^j} f(x) \tag{5.6}
\]

for all $f$ in $C^2_{\mathrm{com}}(\mathbb{R}^\ell)$, where the $a^{ij}$ and $b^i$ are real and continuous, and for each $x$ the matrix $a^{ij}(x)$ is of positive type.

A matrix $a^{ij}$ is of positive type (positive definite, positive semi-definite, non-negative definite, etc.) in case for all complex $\zeta_i$,

\[
\sum_{i,j=1}^{\ell} \bar{\zeta}_i \, a^{ij} \zeta_j \ge 0.
\]
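In small cases the condition can be checked directly. The sketch below uses a hypothetical $2 \times 2$ matrix, not one from the text: it evaluates the Hermitian form for sample complex vectors, and, for a real symmetric $2 \times 2$ matrix, tests positive type via the trace and determinant.

```python
# Check the positive-type condition  sum_ij  conj(z_i) a[i][j] z_j >= 0
# for a real symmetric 2x2 matrix (hypothetical example values).

def hermitian_form(a, z):
    """Evaluate sum_{i,j} conj(z_i) * a[i][j] * z_j for complex z."""
    n = len(z)
    return sum(z[i].conjugate() * a[i][j] * z[j]
               for i in range(n) for j in range(n))

def is_positive_type_2x2(a):
    """A real symmetric 2x2 matrix is of positive type iff
    its trace and determinant are both non-negative."""
    tr = a[0][0] + a[1][1]
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return tr >= 0 and det >= 0

# A singular (rank-one) matrix: of positive type but not definite,
# matching the remark below that A need not be elliptic.
a = [[1.0, 1.0],
     [1.0, 1.0]]

print(is_positive_type_2x2(a))                     # True
print(hermitian_form(a, [1 + 2j, -1j]).real >= 0)  # True: the form is >= 0
print(hermitian_form(a, [1, -1]))                  # 0.0 (the null direction)
```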

The operator $A$ is not necessarily elliptic since the matrix $a^{ij}(x)$ may be singular. If $P^t$ commutes with translations then $a^{ij}$ and $b^i$ are constants, of course.
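The form (5.6) can be illustrated numerically in one dimension: for the semigroup whose transition density $p^t(x, dy)$ is normal with mean $x + bt$ and variance $2at$, the generator is $b f' + a f''$, so the difference quotient $(P^t f(x) - f(x))/t$ computed by quadrature for small $t$ should be close to it. The coefficients and test point below are illustrative choices, not values from the text.

```python
import math

def semigroup_quotient(f, x, t, a, b, span=8.0, n=4000):
    """Approximate (P^t f(x) - f(x)) / t when p^t(x, dy) is the normal
    density with mean x + b*t and variance 2*a*t, using a midpoint
    Riemann sum in the standardized variable z = (y - mean)/sd."""
    mu, sd = x + b * t, math.sqrt(2.0 * a * t)
    h = 2.0 * span / n
    total = 0.0
    for k in range(n):
        z = -span + (k + 0.5) * h
        total += f(mu + sd * z) * math.exp(-0.5 * z * z) * h
    integral = total / math.sqrt(2.0 * math.pi)
    return (integral - f(x)) / t

# Illustrative diffusion and drift coefficients (not from the text).
a, b, x = 0.7, 0.3, 0.5
approx = semigroup_quotient(math.sin, x, t=1e-2, a=a, b=b)
exact = b * math.cos(x) - a * math.sin(x)   # b f'(x) + a f''(x) for f = sin
print(abs(approx - exact) < 1e-2)  # True
```

The remaining discrepancy is of order $t$, the error made by stopping the difference quotient at a finite $t$.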

Proof. Let $f \in C^2_{\mathrm{com}}(\mathbb{R}^\ell)$ and suppose that $f$ together with its first and second order partial derivatives vanishes at $x$. Let $g \in C^2_{\mathrm{com}}(\mathbb{R}^\ell)$ be such that $g(y) = |y - x|^2$ in a neighborhood of $x$ and $g \ge 0$. Let $\varepsilon > 0$ and let $U = \{y : |f(y)| \le \varepsilon g(y)\}$, so that $U$ is a neighborhood of $x$. By (5.5), $p^t(x, \mathbb{R}^\ell \setminus U) = o(t)$ and so

\[
Af(x) = \lim_{t \to 0} \frac{1}{t} \int f(y)\, p^t(x, dy)
= \lim_{t \to 0} \frac{1}{t} \int_U f(y)\, p^t(x, dy)
\le \varepsilon \lim_{t \to 0} \frac{1}{t} \int_U g(y)\, p^t(x, dy) = \varepsilon A g(x).
\]

Since $\varepsilon$ is arbitrary, $Af(x) = 0$. This implies that $Af(x)$ is of the form (5.6) for certain real numbers $a^{ij}(x)$, $b^i(x)$, and we can assume that the $a^{ij}(x)$ are symmetric. (There is no zero-order term since $P^t$ is Markovian.) If we apply $A$ to functions in $C^2_{\mathrm{com}}(\mathbb{R}^\ell)$ that in a neighborhood of $x$ agree with $y^i - x^i$ and $(y^i - x^i)(y^j - x^j)$, we see that $b^i$ and $a^{ij}$ are continuous. If $f$ is in $C^2_{\mathrm{com}}(\mathbb{R}^\ell)$ and $f(x) = 0$ then


\[
Af^2(x) = \lim_{t \to 0} \frac{1}{t} \int f^2(y)\, p^t(x, dy) \ge 0.
\]

DERIVATION OF THE WIENER PROCESS 25

Therefore

\[
Af^2(x) = \sum_{i,j=1}^{\ell} a^{ij}(x) \frac{\partial^2 f^2}{\partial x^i \partial x^j}(x)
= 2 \sum_{i,j=1}^{\ell} a^{ij}(x) \frac{\partial f}{\partial x^i}(x) \frac{\partial f}{\partial x^j}(x) \ge 0.
\]

We can choose

\[
\frac{\partial f}{\partial x^i}(x) = \xi^i
\]

to be arbitrary real numbers, and since $a^{ij}(x)$ is real and symmetric, $a^{ij}(x)$ is of positive type. QED.

THEOREM 5.5 Let $P^t$ be a Markovian semigroup on $C(\mathbb{R}^\ell)$ commuting with translations, and let $A$ be its infinitesimal generator. Then

\[
C^2(\mathbb{R}^\ell) \subseteq D(A) \tag{5.7}
\]

and $P^t$ is determined by $A$ on $C^2_{\mathrm{com}}(\mathbb{R}^\ell)$.

Proof. The inclusion (5.7) follows from Theorem 5.3. The proof of that theorem shows that $A$ is continuous from $C^2_{\mathrm{com}}(\mathbb{R}^\ell)$ into $C(\mathbb{R}^\ell)$, so that $A$ on $C^2_{\mathrm{com}}(\mathbb{R}^\ell)$ determines $A$ on $C^2(\mathbb{R}^\ell)$ by continuity. Since $P^t$ commutes with translations, $P^t$ leaves $C^2(\mathbb{R}^\ell)$ invariant.

Let $\lambda > 0$. We shall show that $(\lambda - A)C^2(\mathbb{R}^\ell)$ is dense in $C(\mathbb{R}^\ell)$. Suppose not. Then there is a non-zero continuous linear functional $z$ on $C(\mathbb{R}^\ell)$ such that $(z, (\lambda - A)f) = 0$ for all $f$ in $C^2(\mathbb{R}^\ell)$. Since $C^2(\mathbb{R}^\ell)$ is dense in $C(\mathbb{R}^\ell)$, there is a $g$ in $C^2(\mathbb{R}^\ell)$ with $(z, g) \ne 0$. Then

\[
\frac{d}{dt}(z, P^t g) = (z, AP^t g) = (z, \lambda P^t g) = \lambda (z, P^t g)
\]

since $P^t g$ is again in $C^2(\mathbb{R}^\ell)$. Therefore

\[
(z, P^t g) = e^{\lambda t}(z, g)
\]

is unbounded, which is a contradiction. It follows that if $Q^t$ is another such semigroup with infinitesimal generator $B$, and $B = A$ on $C^2_{\mathrm{com}}(\mathbb{R}^\ell)$,


then $(\lambda - B)^{-1} = (\lambda - A)^{-1}$ for $\lambda > 0$. But these are the Laplace transforms of the semigroups $Q^t$ and $P^t$, and by the uniqueness theorem for Laplace transforms, $Q^t = P^t$. QED.
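The final step is transparent in the scalar model $P^t = e^{ta}$ acting on numbers, where the resolvent $(\lambda - A)^{-1} = 1/(\lambda - a)$ is literally the Laplace transform $\int_0^\infty e^{-\lambda t} P^t \, dt$. A quick quadrature check (the values of $\lambda$ and $a$ are arbitrary illustrative choices):

```python
import math

def laplace_of_semigroup(lam, gen, T=40.0, n=20000):
    """Midpoint-rule approximation of the Laplace transform
    integral_0^T exp(-lam*t) * P^t dt for the scalar semigroup
    P^t = exp(t*gen); T is large enough that the tail is negligible."""
    h = T / n
    return sum(math.exp((gen - lam) * (k + 0.5) * h) for k in range(n)) * h

lam, gen = 2.0, -0.5            # illustrative values; need lam > gen
lt = laplace_of_semigroup(lam, gen)
resolvent = 1.0 / (lam - gen)   # (lambda - A)^{-1} in the scalar model
print(abs(lt - resolvent) < 1e-4)  # True
```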

Theorem 5.1 follows from Theorems 5.3, 5.4, and 5.5 and the well-known formula for the fundamental solution of the diffusion equation.

References

[13]. G. A. Hunt, Semi-groups of measures on Lie groups, Transactions of the American Mathematical Society 81 (1956), 264–293.

(Hunt treats non-local processes as well, on arbitrary Lie groups.)

Banach spaces, the principle of uniform boundedness, the Banach-Steinhaus theorem, semigroups and infinitesimal generators are all discussed in detail in:

[14]. Einar Hille and Ralph S. Phillips, “Functional Analysis and Semi-

Groups”, revised edition, American Math. Soc. Colloquium Publications,

vol. XXXI, 1957.

Chapter 6

Gaussian processes

Gaussian random variables were discussed by Gauss in 1809 and the central limit theorem was stated by Laplace in 1812. Laplace had already considered Gaussian random variables around 1780, and for this reason Frenchmen call Gaussian random variables “Laplacian”. However, the Gaussian measure and an important special case of the central limit theorem were discovered by de Moivre in 1733. The main tool in de Moivre's work was Stirling's formula, which, except for the fact that the constant occurring in it is $\sqrt{2\pi}$, was discovered by de Moivre. In statistical mechanics the Gaussian distribution is called “Maxwellian”. Another name for it is “normal”.

A Gaussian measure on $\mathbb{R}^\ell$ is a measure that is the transform of the measure with density

\[
\frac{1}{(2\pi)^{\ell/2}} \, e^{-\frac{1}{2}|x|^2}
\]

under an affine transformation. It is called singular in case the affine transformation is singular, which is the case if and only if the measure is singular with respect to Lebesgue measure.
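Concretely, if $x$ has the standard density above and $y = Mx + m$, then $y$ is Gaussian with mean $m$ and covariance $MM^{\mathrm T}$, singular precisely when $M$ is. A stdlib-only sampling sketch in two dimensions ($M$, $m$, and the sample size are arbitrary illustrative choices):

```python
import random

random.seed(0)

# Affine image of the standard 2-D Gaussian: y = M x + m.
M = [[1.0, 0.0],
     [0.8, 0.6]]          # hypothetical affine part (non-singular)
m = [1.0, -2.0]           # hypothetical mean

def sample():
    x = [random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)]
    return [M[i][0] * x[0] + M[i][1] * x[1] + m[i] for i in range(2)]

N = 50000
ys = [sample() for _ in range(N)]
mean = [sum(y[i] for y in ys) / N for i in range(2)]
cov = [[sum((y[i] - mean[i]) * (y[j] - mean[j]) for y in ys) / N
        for j in range(2)] for i in range(2)]

# The covariance of the image measure should be M M^T.
MMT = [[sum(M[i][k] * M[j][k] for k in range(2)) for j in range(2)]
       for i in range(2)]
print([[round(c, 1) for c in row] for row in cov])   # close to MMT
print([[round(v, 2) for v in row] for row in MMT])   # [[1.0, 0.8], [0.8, 1.0]]
```

Replacing $M$ by a rank-one matrix would concentrate the image measure on a line, giving a singular Gaussian measure.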

A set of random variables is called Gaussian in case the distribution of each finite subset is Gaussian. A set of linear combinations, or limits in measure of linear combinations, of Gaussian random variables is Gaussian. Two (jointly) Gaussian random variables are independent if and only if they are uncorrelated; i.e., their covariance

\[
r(x, y) = \mathrm{E}(x - \mathrm{E}x)(y - \mathrm{E}y)
\]

is zero (where $\mathrm{E}$ denotes the expectation).


We define the mean $m$ and covariance $r$ of a probability measure $\mu$ on $\mathbb{R}^\ell$ as follows, provided the integrals exist:

\[
m^i = \int x^i \, \mu(dx),
\]
\[
r^{ij} = \int x^i x^j \, \mu(dx) - m^i m^j = \int (x^i - m^i)(x^j - m^j)\, \mu(dx),
\]

where $x$ has the components $x^i$. The covariance matrix $r$ is of positive

type. Let $\mu$ be a probability measure on $\mathbb{R}^\ell$, $\hat{\mu}$ its inverse Fourier transform

\[
\hat{\mu}(\xi) = \int e^{i\xi \cdot x} \, \mu(dx).
\]

Then $\mu$ is Gaussian if and only if

\[
\hat{\mu}(\xi) = e^{-\frac{1}{2} \sum r^{ij} \xi_i \xi_j + i \sum m^i \xi_i}
\]

in which case $r$ is the covariance and $m$ the mean. If $r$ is nonsingular and $r^{-1}$ denotes the inverse matrix, then the Gaussian measure with mean $m$ and covariance $r$ has the density

\[
\frac{1}{(2\pi)^{\ell/2} (\det r)^{1/2}} \, e^{-\frac{1}{2} \sum (r^{-1})_{ij} (x^i - m^i)(x^j - m^j)}.
\]
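As a sanity check on the density formula, it can be coded directly for $\ell = 2$ with the inverse and determinant written out by hand; for diagonal $r$ the joint density must factor into two one-dimensional Gaussian densities. The numbers below are illustrative:

```python
import math

def gauss2_density(x, m, r):
    """Density of the 2-D Gaussian with mean m and nonsingular
    covariance r, per the displayed formula with l = 2."""
    det = r[0][0] * r[1][1] - r[0][1] * r[1][0]
    rinv = [[ r[1][1] / det, -r[0][1] / det],
            [-r[1][0] / det,  r[0][0] / det]]
    d = [x[0] - m[0], x[1] - m[1]]
    quad = sum(rinv[i][j] * d[i] * d[j] for i in range(2) for j in range(2))
    return math.exp(-0.5 * quad) / (2.0 * math.pi * math.sqrt(det))

def gauss1_density(x, m, v):
    """One-dimensional Gaussian density with mean m and variance v."""
    return math.exp(-0.5 * (x - m) ** 2 / v) / math.sqrt(2.0 * math.pi * v)

# Diagonal covariance: the joint density factors (illustrative numbers).
m, r = [1.0, -0.5], [[2.0, 0.0], [0.0, 0.5]]
x = [0.3, 0.2]
joint = gauss2_density(x, m, r)
product = gauss1_density(x[0], m[0], r[0][0]) * gauss1_density(x[1], m[1], r[1][1])
print(abs(joint - product) < 1e-12)  # True
```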

If $r$ is of positive type there is a unique Gaussian measure with covariance $r$ and mean $m$.

A set of complex random variables is called Gaussian if and only if the real and imaginary parts are (jointly) Gaussian. We define the covariance of complex random variables by

\[
r(x, y) = \mathrm{E}(\bar{x} - \mathrm{E}\bar{x})(y - \mathrm{E}y).
\]

Let $T$ be a set. A complex function $r$ on $T \times T$ is called of positive type in case for all $t_1, \ldots, t_n$ in $T$ the matrix $r(t_i, t_j)$ is of positive type. Let $x$ be a stochastic process indexed by $T$. We call $r(t, s) = r\bigl(x(t), x(s)\bigr)$ the covariance of the process, $m(t) = \mathrm{E}x(t)$ the mean of the process (provided the integrals exist). The covariance is of positive type.

The following theorem is immediate (given the basic existence theorem for stochastic processes with prescribed finite joint distributions).

THEOREM 6.1 Let $T$ be a set, $m$ a function on $T$, $r$ a function of positive type on $T \times T$. Then there is a Gaussian stochastic process indexed by $T$ with mean $m$ and covariance $r$. Any two such are equivalent.
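In the finite case the existence half of Theorem 6.1 is constructive: for times $t_1, \ldots, t_n$, factor the matrix $r(t_i, t_j) = LL^{\mathrm T}$ (possible since it is of positive type) and set $x = Lz + m$ with $z$ a vector of independent standard Gaussians; then $x$ has covariance $r$. The sketch below performs the factorization for $r(s, t) = \min(s, t)$, a function of positive type that is, up to the factor $\sigma^2$, the covariance of the Wiener process started at 0; the time points are arbitrary.

```python
import math

def cholesky(a):
    """Lower-triangular L with L L^T = a, for a symmetric matrix
    of positive type (strictly positive definite here)."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(a[i][i] - s)
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]
    return L

# Covariance r(s, t) = min(s, t) on a few time points.
times = [0.5, 1.0, 2.0, 3.5]
r = [[min(s, t) for t in times] for s in times]
L = cholesky(r)

# L L^T reproduces r, so x = L z (z standard Gaussian) has covariance r.
LLT = [[sum(L[i][k] * L[j][k] for k in range(len(times)))
        for j in range(len(times))] for i in range(len(times))]
print(all(abs(LLT[i][j] - r[i][j]) < 1e-12
          for i in range(4) for j in range(4)))  # True
```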


Reference

[15]. J. L. Doob, “Stochastic Processes”, John Wiley & Sons, Inc., New York, 1953. (Gaussian processes are discussed on pp. 71–78.)

Chapter 7

The Wiener integral

The differences of the Wiener process

\[
w(t) - w(s), \qquad 0 \le s \le t < \infty
\]

form a Gaussian stochastic process, indexed by pairs of positive numbers $s$ and $t$ with $s \le t$. This difference process has mean 0 and covariance

\[
\mathrm{E}\bigl(w(t) - w(s)\bigr)\bigl(w(t') - w(s')\bigr) = \sigma^2 \, \bigl|[s, t] \cap [s', t']\bigr|
\]

where $|\ \ |$ denotes Lebesgue measure, and $\sigma^2$ is the variance parameter of the Wiener process.
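The right-hand side is just $\sigma^2$ times the length of the overlap of the two time intervals, which is $\max(0, \min(t, t') - \max(s, s'))$. A small sketch (the function name and the default $\sigma^2 = 1$ are our own choices):

```python
def diff_covariance(s, t, s2, t2, sigma2=1.0):
    """Covariance of w(t) - w(s) and w(t2) - w(s2):
    sigma^2 * |[s, t] intersect [s2, t2]|."""
    return sigma2 * max(0.0, min(t, t2) - max(s, s2))

print(diff_covariance(0.0, 2.0, 1.0, 3.0))              # 1.0 (overlap [1, 2])
print(diff_covariance(0.0, 1.0, 1.0, 2.0))              # 0.0 (disjoint increments)
print(diff_covariance(0.0, 3.0, 1.0, 2.0, sigma2=2.0))  # 2.0
```

The middle case is the independence of increments over disjoint intervals.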

We can extend the difference process to all pairs of real numbers $s$ and $t$. We can arbitrarily assign a distribution to $w(0)$. The resulting stochastic process $w(t)$, $-\infty < t < \infty$, is called the two-sided Wiener process. It is Gaussian if and only if $w(0)$ is Gaussian (e.g., $w(0) = x_0$ where $x_0$ is a fixed point), but in any case the differences are Gaussian.

If we know that a Brownian particle is at $x_0$ at the present moment, $w(0) = x_0$, then $w(t)$ for $t > 0$ is the position of the particle at time $t$ in the future and $w(t)$ for $t < 0$ is the position of the particle at time $t$ in the past. A movie of Brownian motion looks, statistically, the same if it is run backwards.

We recall that, with probability one, the sample paths of the Wiener process are continuous but not differentiable. Nevertheless, integrals of the form

\[
\int_{-\infty}^{\infty} f(t)\, dw(t)
\]