
Let $S_n$ be a simple symmetric random walk. This means that $Y_k = S_k - S_{k-1}$ equals $+1$ with probability $\frac{1}{2}$, equals $-1$ with probability $\frac{1}{2}$, and is independent of $Y_j$ for $j < k$. We notice that $E S_n = 0$ while $E S_n^2 = \sum_{i=1}^n E Y_i^2 + \sum_{i \ne j} E[Y_i Y_j] = n$, using the fact that $E[Y_i Y_j] = (E Y_i)(E Y_j) = 0$.

Define $X_t^n = S_{nt}/\sqrt{n}$ if $nt$ is an integer and by linear interpolation for other $t$. If $nt$ is an integer, $E X_t^n = 0$ and $E(X_t^n)^2 = t$. It turns out $X_t^n$ does not converge for any $\omega$.
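As a quick numerical illustration (the code and all names here are ours, not part of the development), one can sample $X_t^n$ for large $n$ and check the two moments just computed:

```python
import numpy as np

rng = np.random.default_rng(0)

def scaled_walk_at(t, n, num_paths):
    """Sample X_t^n = S_{nt} / sqrt(n) over many independent paths."""
    steps = rng.choice([-1, 1], size=(num_paths, int(n * t)))  # the Y_k
    return steps.sum(axis=1) / np.sqrt(n)                      # S_{nt} / sqrt(n)

x = scaled_walk_at(t=0.5, n=10_000, num_paths=20_000)
print(x.mean())  # should be near E X_t^n = 0
print(x.var())   # should be near E (X_t^n)^2 = t = 0.5
```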

However there is another kind of convergence, called weak convergence, that takes place. There exists a process $Z_t$ such that for each $k$, each $t_1 < t_2 < \cdots < t_k$, and each $a_1 < b_1, a_2 < b_2, \ldots, a_k < b_k$, we have

(1) The paths of $Z_t$ are continuous as a function of $t$.

(2) $P(X_{t_1}^n \in [a_1, b_1], \ldots, X_{t_k}^n \in [a_k, b_k]) \to P(Z_{t_1} \in [a_1, b_1], \ldots, Z_{t_k} \in [a_k, b_k])$.
See Note 1 for more discussion of weak convergence.
The limit $Z_t$ is called a Brownian motion starting at 0. It has the following properties.

(1) $E Z_t = 0$.

(2) $E Z_t^2 = t$.

(3) $Z_t - Z_s$ is independent of $\mathcal{F}_s = \sigma(Z_r, r \le s)$.

(4) $Z_t - Z_s$ has the distribution of a normal random variable with mean 0 and variance $t - s$. This means
$$P(Z_t - Z_s \in [a, b]) = \int_a^b \frac{1}{\sqrt{2\pi(t-s)}}\, e^{-y^2/2(t-s)}\, dy.$$
(This result follows from the central limit theorem.)

(5) The map $t \to Z_t(\omega)$ is continuous for almost all $\omega$.
See Note 2 for a few remarks on this definition.

It is common to use $B_t$ ("B" for Brownian) or $W_t$ ("W" for Wiener, who was the first person to prove rigorously that Brownian motion exists). We will most often use $W_t$.
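Under the definition above, a Brownian path can be approximated on a grid by summing independent $N(0, dt)$ increments. The following sketch (NumPy assumed; all names are ours) checks properties (1), (2), and the variance in (4) by simulation:

```python
import numpy as np

rng = np.random.default_rng(1)

def brownian_paths(T, num_steps, num_paths):
    """Approximate Brownian paths on [0, T] as cumulative sums of
    independent N(0, dt) increments (property (4) on a fine grid)."""
    dt = T / num_steps
    dW = rng.normal(0.0, np.sqrt(dt), size=(num_paths, num_steps))
    return np.hstack([np.zeros((num_paths, 1)), dW.cumsum(axis=1)])

W = brownian_paths(T=1.0, num_steps=1000, num_paths=50_000)
s_idx, t_idx = 250, 1000                  # grid indices of s = 0.25, t = 1.0
print(W[:, t_idx].mean())                 # (1): E W_t = 0
print(W[:, t_idx].var())                  # (2): E W_t^2 = t = 1
print((W[:, t_idx] - W[:, s_idx]).var())  # (4): variance t - s = 0.75
```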

We will use Brownian motion extensively and develop some of its properties. As one might imagine for a limit of a simple random walk, the paths of Brownian motion have a huge number of oscillations. It turns out that the function $t \to W_t(\omega)$ is continuous, but it is not differentiable; in fact one cannot define a derivative at any value of $t$. Another bizarre property: if one looks at the set of times at which $W_t(\omega)$ is equal to 0, this is a set which is uncountable, but contains no intervals. There is nothing special about 0; the same is true for the set of times at which $W_t(\omega)$ is equal to $a$ for any level $a$.

In what follows, one of the crucial properties of a Brownian motion is that it is a
martingale with continuous paths. Let us prove this.

Proposition 11.1. $W_t$ is a martingale with respect to $\mathcal{F}_t$ and $W_t$ has continuous paths.

Proof. As part of the definition of a Brownian motion, $W_t$ has continuous paths. $W_t$ is $\mathcal{F}_t$ measurable by the definition of $\mathcal{F}_t$. Since the distribution of $W_t$ is that of a normal random variable with mean 0 and variance $t$, then $E|W_t| < \infty$ for all $t$. (In fact, $E|W_t|^n < \infty$ for all $n$.)

The key property is to show $E[W_t \mid \mathcal{F}_s] = W_s$:
$$E[W_t \mid \mathcal{F}_s] = E[W_t - W_s \mid \mathcal{F}_s] + E[W_s \mid \mathcal{F}_s] = E[W_t - W_s] + W_s = W_s.$$
We used here the facts that $W_t - W_s$ is independent of $\mathcal{F}_s$ and that $E[W_t - W_s] = 0$ because $W_t$ and $W_s$ have mean 0.

We will also need

Proposition 11.2. $W_t^2 - t$ is a martingale with continuous paths with respect to $\mathcal{F}_t$.

Proof. That $W_t^2 - t$ is integrable and is $\mathcal{F}_t$ measurable is as in the above proof. We calculate
$$E[W_t^2 - t \mid \mathcal{F}_s] = E[((W_t - W_s) + W_s)^2 \mid \mathcal{F}_s] - t$$
$$= E[(W_t - W_s)^2 \mid \mathcal{F}_s] + 2E[(W_t - W_s)W_s \mid \mathcal{F}_s] + E[W_s^2 \mid \mathcal{F}_s] - t$$
$$= E[(W_t - W_s)^2] + 2W_s E[W_t - W_s \mid \mathcal{F}_s] + W_s^2 - t.$$
We used the facts that $W_s$ is $\mathcal{F}_s$ measurable and that $(W_t - W_s)^2$ is independent of $\mathcal{F}_s$ because $W_t - W_s$ is. The second term on the last line is equal to $2W_s E[W_t - W_s] = 0$. The first term, because $W_t - W_s$ is normal with mean 0 and variance $t - s$, is equal to $t - s$. Substituting, the last line is equal to
$$(t - s) + 0 + W_s^2 - t = W_s^2 - s$$

47
as required.
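A conditional expectation identity $E[X \mid \mathcal{F}_s] = Y$ implies $E[(X - Y)\, g(W_s)] = 0$ for bounded measurable $g$; checking this for one test function is only a necessary condition, but it makes a quick numerical sanity check of Propositions 11.1 and 11.2. A sketch, assuming NumPy, with all names ours:

```python
import numpy as np

rng = np.random.default_rng(2)

num_paths, s, t = 200_000, 0.3, 1.0
W_s = rng.normal(0.0, np.sqrt(s), num_paths)             # W_s ~ N(0, s)
W_t = W_s + rng.normal(0.0, np.sqrt(t - s), num_paths)   # independent increment

g = np.tanh(W_s)  # a bounded, F_s-measurable test variable

# Proposition 11.1: E[(W_t - W_s) g(W_s)] should be 0
print(np.mean((W_t - W_s) * g))
# Proposition 11.2: E[(W_t^2 - t - (W_s^2 - s)) g(W_s)] should be 0
print(np.mean((W_t**2 - t - (W_s**2 - s)) * g))
```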


Note 1. A sequence of random variables $X_n$ converges weakly to $X$ if $P(a < X_n < b) \to P(a < X < b)$ for all $a, b \in [-\infty, \infty]$ such that $P(X = a) = P(X = b) = 0$. $a$ and $b$ can be infinite. If $X_n$ converges to a normal random variable, then $P(X = a) = P(X = b) = 0$ for all $a$ and $b$. This is the type of convergence that takes place in the central limit theorem. It will not be true in general that $X_n$ converges to $X$ almost surely.

For a sequence of random vectors $(X_1^n, \ldots, X_k^n)$ to converge to a random vector $(X_1, \ldots, X_k)$, one can give an analogous definition. But saying that the normalized random walks $X^n(t)$ above converge weakly to $Z_t$ actually says more than (2). A result from probability theory says that $X_n$ converges to $X$ weakly if and only if $E[f(X_n)] \to E[f(X)]$ whenever $f$ is a bounded continuous function on $\mathbb{R}$. We use this to define weak convergence for stochastic processes. Let $C([0, \infty))$ be the collection of all continuous functions from $[0, \infty)$ to the reals. This is a metric space, so the notion of a function from $C([0, \infty))$ to $\mathbb{R}$ being continuous makes sense. We say that the processes $X^n$ converge weakly to the process $Z$, and mean by this that $E[F(X^n)] \to E[F(Z)]$ whenever $F$ is a bounded continuous function on $C([0, \infty))$. One example of such a function $F$ would be $F(f) = \sup_{0 \le t < \infty} |f(t)|$ if $f \in C([0, \infty))$; another would be $F(f) = \int_0^1 f(t)\, dt$.

The reason one wants to show that $X^n$ converges weakly to $Z$ instead of just showing (2) is that weak convergence can be shown to imply that $Z$ has continuous paths.
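The statement $E[f(X_n)] \to E[f(X)]$ for bounded continuous $f$ can be watched numerically in the central limit theorem setting. A sketch (NumPy assumed; names ours), using $f = \cos$, for which $E\cos(Z) = e^{-1/2}$ when $Z$ is standard normal:

```python
import numpy as np

rng = np.random.default_rng(3)

def Ef_of_walk(n, num_samples, f):
    """Monte Carlo estimate of E f(S_n / sqrt(n)) for the symmetric walk."""
    steps = rng.choice([-1, 1], size=(num_samples, n))
    return f(steps.sum(axis=1) / np.sqrt(n)).mean()

f = np.cos  # bounded and continuous
for n in (4, 16, 64, 256):
    print(n, Ef_of_walk(n, 100_000, f))
print(np.exp(-0.5))  # E cos(Z) for Z ~ N(0,1), the weak limit value
```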

Note 2. First of all, there is some redundancy in the definition: one can show that parts of the definition are implied by the remaining parts, but we won't worry about this. Second, we actually want to let $\mathcal{F}_t$ be the completion of $\sigma(Z_s : s \le t)$, that is, we throw in all the null sets into each $\mathcal{F}_t$. One can prove that the resulting $\mathcal{F}_t$ are right continuous, and hence the filtration $\mathcal{F}_t$ satisfies the "usual" conditions. Finally, the "almost all" in (5) means that $t \to Z_t(\omega)$ is continuous for all $\omega$, except for a set of $\omega$ of probability zero.




12. Stochastic integrals.
If one wants to consider the (deterministic) integral $\int_0^t f(s)\, dg(s)$, where $f$ and $g$ are continuous and $g$ is continuously differentiable, we can define it analogously to the usual Riemann integral as the limit of Riemann sums $\sum_{i=1}^n f(s_i)[g(s_i) - g(s_{i-1})]$, where $s_1 < s_2 < \cdots < s_n$ is a partition of $[0, t]$. This is known as the Riemann-Stieltjes integral. One can show (using the mean value theorem, for example) that
$$\int_0^t f(s)\, dg(s) = \int_0^t f(s) g'(s)\, ds.$$

If we were to take $f(s) = 1_{[0,a]}(s)$ (which is not continuous, but that is a minor matter here), one would expect the following:
$$\int_0^t 1_{[0,a]}(s)\, dg(s) = \int_0^t 1_{[0,a]}(s) g'(s)\, ds = \int_0^a g'(s)\, ds = g(a) - g(0).$$
Note that although we use the fact that $g$ is differentiable in the intermediate stages, the first and last terms make sense for any $g$.
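For a concrete check of the identity above, one can compare a Riemann-Stieltjes sum against the ordinary integral $\int_0^t f(s) g'(s)\, ds$ for a smooth $g$. A small sketch, assuming NumPy; the particular $f$ and $g$ are our choices:

```python
import numpy as np

def rs_sum(f, g, t, n):
    """Riemann-Stieltjes sum: sum_i f(s_i)[g(s_i) - g(s_{i-1})] on [0, t]."""
    s = np.linspace(0.0, t, n + 1)
    return np.sum(f(s[1:]) * np.diff(g(s)))

f = np.sin
g = lambda s: s**2  # so g'(s) = 2s
print(rs_sum(f, g, 1.0, 100_000))
# compare: int_0^1 sin(s) * 2s ds = 2(sin 1 - cos 1)
print(2 * (np.sin(1.0) - np.cos(1.0)))
```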
We now want to replace $g$ by a Brownian path and $f$ by a random integrand. The expression $\int f(s)\, dW(s)$ does not make sense as a Riemann-Stieltjes integral because it is a fact that $W(s)$ is not differentiable as a function of $s$. We need to define the expression by some other means. We will show that it can be defined as the limit in $L^2$ of Riemann sums. The resulting integral is called a stochastic integral.
Let us consider a very special case first. Suppose $f$ is continuous and deterministic (i.e., does not depend on $\omega$). Suppose we take a Riemann sum approximation
$$I_n = \sum_{i=0}^{2^n - 1} f\big(\tfrac{i}{2^n}\big)\big[W\big(\tfrac{i+1}{2^n}\big) - W\big(\tfrac{i}{2^n}\big)\big].$$

Since $W_t$ has zero expectation for each $t$, $E I_n = 0$. Let us calculate the second moment:
$$E I_n^2 = E\Big(\sum_i f\big(\tfrac{i}{2^n}\big)\big[W\big(\tfrac{i+1}{2^n}\big) - W\big(\tfrac{i}{2^n}\big)\big]\Big)^2 \tag{12.1}$$
$$= E \sum_{i=0}^{2^n - 1} f\big(\tfrac{i}{2^n}\big)^2 \big[W\big(\tfrac{i+1}{2^n}\big) - W\big(\tfrac{i}{2^n}\big)\big]^2 + E \sum_{i \ne j} f\big(\tfrac{i}{2^n}\big) f\big(\tfrac{j}{2^n}\big) \big[W\big(\tfrac{i+1}{2^n}\big) - W\big(\tfrac{i}{2^n}\big)\big] \big[W\big(\tfrac{j+1}{2^n}\big) - W\big(\tfrac{j}{2^n}\big)\big].$$
The first sum is equal to
$$\sum_i f\big(\tfrac{i}{2^n}\big)^2 \frac{1}{2^n} \to \int_0^1 f(t)^2\, dt,$$
since the second moment of $W\big(\tfrac{i+1}{2^n}\big) - W\big(\tfrac{i}{2^n}\big)$ is $1/2^n$. Using the independence and the fact that $W_t$ has mean zero,
$$E\big[W\big(\tfrac{i+1}{2^n}\big) - W\big(\tfrac{i}{2^n}\big)\big]\big[W\big(\tfrac{j+1}{2^n}\big) - W\big(\tfrac{j}{2^n}\big)\big] = E\big[W\big(\tfrac{i+1}{2^n}\big) - W\big(\tfrac{i}{2^n}\big)\big]\, E\big[W\big(\tfrac{j+1}{2^n}\big) - W\big(\tfrac{j}{2^n}\big)\big] = 0,$$
and so the second sum on the right hand side of (12.1) is zero. This calculation is the key to the stochastic integral.
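This calculation can be watched numerically: sampling $I_n$ many times, the empirical mean should be near $0$ and the empirical second moment near $\int_0^1 f(t)^2\, dt$. A sketch (NumPy assumed; the particular $f$ and all names are our choices):

```python
import numpy as np

rng = np.random.default_rng(4)

def In_samples(f, n, num_paths):
    """Sample I_n = sum_i f(i/2^n)[W((i+1)/2^n) - W(i/2^n)]."""
    m = 2**n
    dW = rng.normal(0.0, np.sqrt(1.0 / m), size=(num_paths, m))
    grid = np.arange(m) / m                 # the points i/2^n
    return (f(grid) * dW).sum(axis=1)

f = lambda u: np.sin(2 * np.pi * u)
I = In_samples(f, n=8, num_paths=100_000)
print(I.mean())  # E I_n = 0
print(I.var())   # near int_0^1 f(t)^2 dt = 1/2 for this f
```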
We now turn to the construction. Let $W_t$ be a Brownian motion. We will only consider integrands $H_s$ such that $H_s$ is $\mathcal{F}_s$ measurable for each $s$ (see Note 1). We will construct $\int_0^t H_s\, dW_s$ for all $H$ with
$$E \int_0^t H_s^2\, ds < \infty. \tag{12.2}$$


Before we proceed we will need to define the quadratic variation of a continuous martingale. We will use the following theorem without proof because in our applications we can construct the desired increasing process directly. We often say a process is a continuous process if its paths are continuous, and similarly a continuous martingale is a martingale with continuous paths.

Theorem 12.1. Suppose $M_t$ is a continuous martingale such that $E M_t^2 < \infty$ for all $t$. There exists one and only one increasing process $A_t$ that is adapted to $\mathcal{F}_t$, has continuous paths, and $A_0 = 0$ such that $M_t^2 - A_t$ is a martingale.
The simplest example of such a martingale is Brownian motion. If $W_t$ is a Brownian motion, we saw in Proposition 11.2 that $W_t^2 - t$ is a martingale. So in this case $A_t = t$ almost surely, for all $t$. Hence $\langle W \rangle_t = t$.

We use the notation $\langle M \rangle_t$ for the increasing process given in Theorem 12.1 and call it the quadratic variation process of $M$. We will see later that in the case of stochastic integrals, where
$$N_t = \int_0^t H_s\, dW_s,$$
it turns out that $\langle N \rangle_t = \int_0^t H_s^2\, ds$.
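For Brownian motion the identity $\langle W \rangle_t = t$ can be seen concretely: the sum of squared increments over a partition of $[0, t]$ concentrates at $t$ as the mesh shrinks. A small sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(5)

t = 2.0
for m in (10, 100, 1000, 10_000):
    dW = rng.normal(0.0, np.sqrt(t / m), size=m)  # one path's increments
    print(m, np.sum(dW**2))  # concentrates at <W>_t = t = 2 as m grows
```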

We will use the following frequently, and in fact, these are the only two properties of Brownian motion that play a significant role in the construction.

Lemma 12.1. (a) $E[W_b - W_a \mid \mathcal{F}_a] = 0$.
(b) $E[W_b^2 - W_a^2 \mid \mathcal{F}_a] = E[(W_b - W_a)^2 \mid \mathcal{F}_a] = b - a$.



Proof. (a) This is $E[W_b - W_a] = 0$ by the independence of $W_b - W_a$ from $\mathcal{F}_a$ and the fact that $W_b$ and $W_a$ have mean zero.

(b) $(W_b - W_a)^2$ is independent of $\mathcal{F}_a$, so the conditional expectation is the same as $E[(W_b - W_a)^2]$. Since $W_b - W_a$ is a $N(0, b - a)$, the second equality in (b) follows.

To prove the first equality in (b), we write
$$E[W_b^2 - W_a^2 \mid \mathcal{F}_a] = E[((W_b - W_a) + W_a)^2 \mid \mathcal{F}_a] - E[W_a^2 \mid \mathcal{F}_a]$$
$$= E[(W_b - W_a)^2 \mid \mathcal{F}_a] + 2E[W_a(W_b - W_a) \mid \mathcal{F}_a] + E[W_a^2 \mid \mathcal{F}_a] - E[W_a^2 \mid \mathcal{F}_a]$$
$$= E[(W_b - W_a)^2 \mid \mathcal{F}_a] + 2W_a E[W_b - W_a \mid \mathcal{F}_a],$$
and the first equality follows by applying (a).


We construct the stochastic integral in three steps. We say an integrand $H_s = H_s(\omega)$ is elementary if
$$H_s(\omega) = G(\omega) 1_{(a,b]}(s)$$
where $0 \le a < b$ and $G$ is bounded and $\mathcal{F}_a$ measurable. We say $H$ is simple if it is a finite linear combination of elementary processes, that is,
$$H_s(\omega) = \sum_{i=1}^n G_i(\omega) 1_{(a_i, b_i]}(s). \tag{12.3}$$
We first construct the stochastic integral for $H$ elementary; the work here is showing the stochastic integral is a martingale. We next construct the integral for $H$ simple and here the difficulty is calculating the second moment. Finally we consider the case of general $H$.
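Anticipating the construction, the integral of an elementary $G 1_{(a,b]}$ will be $G \cdot (W_b - W_a)$, and for a simple integrand one sums over the pieces. The sketch below (NumPy assumed; the helper name and the particular $G_i$ are our choices) evaluates such a sum along one sampled path, taking care that each $G_i$ looks only at the path up to time $a_i$:

```python
import numpy as np

rng = np.random.default_rng(6)

# One Brownian path on a fine grid over [0, 1].
times = np.linspace(0.0, 1.0, 1001)
W = np.concatenate([[0.0], rng.normal(0.0, np.sqrt(np.diff(times))).cumsum()])

def simple_integral(G_fns, intervals, W, times):
    """sum_i G_i (W_{b_i} - W_{a_i}), where G_i = G_fns[i] applied to the
    path up to time a_i, so that G_i is F_{a_i} measurable."""
    total = 0.0
    for g, (a, b) in zip(G_fns, intervals):
        ia, ib = np.searchsorted(times, [a, b])
        total += g(W[: ia + 1]) * (W[ib] - W[ia])
    return total

# H_s = tanh(W_{1/4}) on (1/4, 1/2] plus sign(W_{1/2}) on (1/2, 1]
G_fns = [lambda w: np.tanh(w[-1]), lambda w: np.sign(w[-1])]
print(simple_integral(G_fns, [(0.25, 0.5), (0.5, 1.0)], W, times))
```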
