
will consider will be continuous.
If one is slightly more careful, one sees that processes whose paths are functions which
are continuous from the left at each time point are also predictable. This gives an indication
of where the name comes from. If H_s has paths which are left continuous, then
H_t = lim_{n→∞} H_{t-1/n}, and we can "predict" the value of H_t from the values at times
that come before t. If H_t is only right continuous and a path has a jump at time t, this
is not possible.

Note 2. Let us consider the case a < s < t < b; again, similar arguments take care of the
other five cases. We need to show

E[G^2 (W_t - W_a)^2 - G^2 (t - a) | F_s] = G^2 (W_s - W_a)^2 - G^2 (s - a).    (12.4)
The left hand side is equal to G^2 E[(W_t - W_a)^2 - (t - a) | F_s]. We write this as

G^2 E[((W_t - W_s) + (W_s - W_a))^2 - (t - a) | F_s]
    = G^2 ( E[(W_t - W_s)^2 | F_s] + 2E[(W_t - W_s)(W_s - W_a) | F_s]
            + E[(W_s - W_a)^2 | F_s] - (t - a) )
    = G^2 ( E[(W_t - W_s)^2] + 2(W_s - W_a)E[W_t - W_s | F_s] + (W_s - W_a)^2 - (t - a) )
    = G^2 ( (t - s) + 0 + (W_s - W_a)^2 - (t - a) ).
The last expression is equal to the right hand side of (12.4).
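The identity (12.4) can be sanity-checked numerically. The sketch below is a Monte Carlo check of its unconditional consequence, with G taken to be a constant for simplicity (an illustrative assumption): the process G^2 (W_t - W_a)^2 - G^2 (t - a) has expectation 0 at both times s and t, consistent with the martingale property.

```python
import numpy as np

# Monte Carlo check of (12.4) in unconditional form: with G constant
# (a simplifying assumption for illustration), the expectation of
# G^2 (W_t - W_a)^2 - G^2 (t - a) is 0 at both times s and t.
rng = np.random.default_rng(0)
a, s, t, G = 0.2, 0.5, 1.0, 3.0
n = 200_000

W_a = rng.normal(0.0, np.sqrt(a), n)            # W_a ~ N(0, a)
W_s = W_a + rng.normal(0.0, np.sqrt(s - a), n)  # independent increments
W_t = W_s + rng.normal(0.0, np.sqrt(t - s), n)

at_t = np.mean(G**2 * (W_t - W_a) ** 2 - G**2 * (t - a))
at_s = np.mean(G**2 * (W_s - W_a) ** 2 - G**2 * (s - a))
print(at_t, at_s)  # both near 0
```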
Note 3. A definition from measure theory says that if µ is a measure, then ‖f‖_2, the L^2
norm of f with respect to the measure µ, is defined as

‖f‖_2 = ( ∫ f(x)^2 µ(dx) )^{1/2}.

The space L^2 is defined to be the set of functions f for which ‖f‖_2 < ∞. (A technical
proviso: one has to identify as equal functions which differ only on a set of measure 0.) If
one defines a distance between two functions f and g by d(f, g) = ‖f - g‖_2, this is a metric
on the space L^2, and a theorem from measure theory says that L^2 is complete with respect
to this metric. Another theorem from measure theory says that the collection of simple
functions (functions of the form Σ_{i=1}^n c_i 1_{A_i}) is dense in L^2 with respect to this
metric.
Let us define a norm on stochastic processes; this is essentially an L^2 norm. Define

‖N‖ = ( E sup_{0≤t<∞} N_t^2 )^{1/2}.

One can show that this is a norm, and hence that the triangle inequality holds. Moreover, the
space of processes N such that ‖N‖ < ∞ is complete with respect to this norm. This means
that if N^n is a Cauchy sequence, i.e., if given ε there exists n_0 such that ‖N^n - N^m‖ < ε
whenever n, m ≥ n_0, then the Cauchy sequence converges, that is, there exists N with
‖N‖ < ∞ such that ‖N^n - N‖ → 0.
We can define another norm on stochastic processes. Define

‖H‖_2 = ( E ∫_0^∞ H_s^2 ds )^{1/2}.

This can be viewed as a standard L^2 norm, namely, the L^2 norm with respect to the measure
µ defined on P by

µ(A) = E ∫_0^∞ 1_A(s, ω) ds.

Since the set of simple functions with respect to µ is dense in L^2, this says that if H is
measurable with respect to P, then there exist simple processes H_s^n that are also measurable
with respect to P such that ‖H^n - H‖_2 → 0.
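The approximation by simple processes can be seen already in the simplest possible case. The sketch below takes the deterministic integrand H_s = s on [0, 1] (so measurability with respect to P is trivial) and approximates it by processes that are constant on each interval [k/n, (k+1)/n); all names are illustrative, and the ‖·‖_2 norm here reduces to the ordinary L^2([0,1]) norm since H is non-random.

```python
import math
import numpy as np

# Toy case of the density claim: H_s = s on [0, 1], approximated by the
# simple process H^n that is constant on each interval [k/n, (k+1)/n),
# taking the value at the left endpoint.  The L^2 error shrinks like 1/n.
def l2_error(n, m=100_000):
    s = (np.arange(m) + 0.5) / m        # midpoint grid on [0, 1]
    H = s
    H_n = np.floor(n * s) / n           # left-endpoint value on each interval
    return math.sqrt(np.mean((H_n - H) ** 2))

errs = [l2_error(n) for n in (2, 4, 8, 16)]
print(errs)  # decreasing, roughly 1/(n*sqrt(3))
```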

Note 4. We have ‖N^n - N‖ → 0, where the norm here is the one described in Note 3. Each
N^n is a stochastic integral of the type described in Step 2 of the construction, hence each
N_t^n is a martingale. Let s < t and A ∈ F_s. Since E[N_t^n | F_s] = N_s^n, then

E[N_t^n; A] = E[N_s^n; A].    (12.5)

By Cauchy-Schwarz,

|E[N_t^n; A] - E[N_t; A]| ≤ E[|N_t^n - N_t|; A]
    ≤ ( E[(N_t^n - N_t)^2] )^{1/2} ( E[1_A^2] )^{1/2}
    ≤ ‖N^n - N‖ → 0.    (12.6)

We have a similar limit when t is replaced by s, so taking the limit in (12.5) yields

E[N_t; A] = E[N_s; A].

Since N_s is F_s measurable and has the same expectation over sets A ∈ F_s as N_t does, then
by Proposition 4.3 E[N_t | F_s] = N_s, or N_t is a martingale.

Suppose ‖N^n - N‖ → 0. Given ε > 0 there exists n_0 such that ‖N^n - N‖ < ε if
n ≥ n_0. Take ε = 1 and choose n ≥ n_0. By the triangle inequality,

‖N‖ ≤ ‖N^n‖ + ‖N^n - N‖ ≤ ‖N^n‖ + 1 < ∞

since ‖N^n‖ is finite for each n.

That N_t^2 - ⟨N⟩_t is a martingale is similar to the proof that N_t is a martingale, but
slightly more delicate. We leave the proof to the reader, but note that in place of (12.6)
one writes

|E[(N_t^n)^2; A] - E[(N_t)^2; A]| ≤ E[|(N_t^n)^2 - (N_t)^2|] ≤ E[|N_t^n - N_t| |N_t^n + N_t|].    (12.7)

By Cauchy-Schwarz this is less than

( E[(N_t^n - N_t)^2] )^{1/2} ( E[(N_t^n + N_t)^2] )^{1/2}.

Since ( E[(N_t^n + N_t)^2] )^{1/2} ≤ ( E[(N_t^n)^2] )^{1/2} + ( E[(N_t)^2] )^{1/2} is bounded
independently of n, we see that the left hand side of (12.7) tends to 0.
Note 5. We have ‖N^n - N‖ → 0, where the norm is described in Note 3. This means that

E[sup_t |N_t^n - N_t|^2] → 0.

A result from measure theory implies that there exists a subsequence n_k such that

sup_t |N_t^{n_k} - N_t|^2 → 0,  a.s.

So except for a set of ω's of probability 0, N_t^{n_k}(ω) converges to N_t(ω) uniformly. Each
N_t^{n_k}(ω) is continuous by Step 2, and the uniform limit of continuous functions is
continuous; therefore N_t(ω) is a continuous function of t. Incidentally, this is the primary
reason we considered Doob's inequalities.

Note 6. If M_t is a continuous martingale, E[M_b - M_a | F_a] = E[M_b | F_a] - M_a =
M_a - M_a = 0. This is the analogue of Lemma 12.1(a). To show the analogue of Lemma
12.1(b),

E[(M_b - M_a)^2 | F_a] = E[M_b^2 | F_a] - 2E[M_b M_a | F_a] + E[M_a^2 | F_a]
    = E[M_b^2 | F_a] - 2M_a E[M_b | F_a] + M_a^2
    = E[M_b^2 - ⟨M⟩_b | F_a] + E[⟨M⟩_b - ⟨M⟩_a | F_a] + ⟨M⟩_a - M_a^2
    = E[⟨M⟩_b - ⟨M⟩_a | F_a],

since M_t^2 - ⟨M⟩_t is a martingale. That

E[M_b^2 - M_a^2 | F_a] = E[⟨M⟩_b - ⟨M⟩_a | F_a]

is just a rewriting of

E[M_b^2 - ⟨M⟩_b | F_a] = M_a^2 - ⟨M⟩_a = E[M_a^2 - ⟨M⟩_a | F_a].

With these two properties in place of Lemma 12.1, replacing W_s by M_s and ds by
d⟨M⟩_s, the construction of the stochastic integral ∫_0^t H_s dM_s goes through exactly as
above.
Note 7. If we let T_K = inf{t > 0 : ∫_0^t H_s^2 d⟨M⟩_s ≥ K}, the first time the integral is
larger than or equal to K, and we let H_s^K = H_s 1_{(s≤T_K)}, then ∫_0^∞ (H_s^K)^2 d⟨M⟩_s ≤ K
and there is no difficulty defining N_t^K = ∫_0^t H_s^K dM_s for every t. One can show that if
t ≤ T_{K_1} and t ≤ T_{K_2}, then N_t^{K_1} = N_t^{K_2} a.s. If ∫_0^t H_s^2 d⟨M⟩_s is finite for every t,
then T_K → ∞ as K → ∞. If we call the common value N_t, this allows one to define the
stochastic integral N_t for each t in the case where the integral ∫_0^t H_s^2 d⟨M⟩_s is finite for
every t, even if the expectation of the integral is not.

We can do something similar if M_t is a martingale but where we do not have E⟨M⟩_∞ <
∞. Let S_K = inf{t : |M_t| ≥ K}, the first time |M_t| is larger than or equal to K. If we let
M_t^K = M_{t∧S_K}, where t ∧ S_K = min(t, S_K), then one can show M^K is a martingale bounded
in absolute value by K. So we can define J_t^K = ∫_0^t H_s dM_s^K for every t, using the paragraph
above to handle the wider class of H's, if necessary. Again, one can show that if t ≤ S_{K_1} and
t ≤ S_{K_2}, then the value of the stochastic integral will be the same no matter whether we use
M^{K_1} or M^{K_2} as our martingale. We use the common value as a definition of the stochastic
integral J_t. We have S_K → ∞ as K → ∞, so we have a definition of J_t for each t.
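The stopping-time truncation S_K can be illustrated in discrete time. In the sketch below, a simple symmetric random walk stands in for the continuous martingale M_t (an illustrative assumption, not the setting of the notes): the walk stopped at the first time it reaches ±K is bounded in absolute value by K, and it is still a martingale, so its expectation stays at 0.

```python
import numpy as np

# Stop a simple random walk (a discrete-time martingale standing in for M_t)
# at S_K = first time |M| >= K.  The stopped process M^K is bounded by K
# and remains a martingale, so its final mean stays near 0.
rng = np.random.default_rng(1)
K, steps, n_paths = 5, 200, 20_000
M = np.cumsum(rng.choice([-1, 1], size=(n_paths, steps)), axis=1)

hit = np.abs(M) >= K
first = np.where(hit.any(axis=1), hit.argmax(axis=1), steps - 1)
# freeze each path at the first time it reaches +-K
idx = np.minimum(np.arange(steps)[None, :], first[:, None])
MK = M[np.arange(n_paths)[:, None], idx]

print(np.abs(MK).max(), MK[:, -1].mean())  # bounded by K; mean near 0
```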

Note 8. We only outline how the proof goes. To show

∫_0^t H_s dN_s = ∫_0^t H_s K_s dW_s,    (12.8)

one shows that (12.8) holds for H_s simple and then takes limits. To show this, it suffices
to look at H_s elementary and use linearity. To show (12.8) for H_s elementary, first prove this
in the case when K_s is elementary, use linearity to extend it to the case when K is simple, and
then take limits to obtain it for arbitrary K. Thus one reduces the proof to showing (12.8)
when both H and K are elementary. In this situation, one can explicitly write out both sides
of the equation and see that they are equal.
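At the discrete level the identity (12.8) is literal bookkeeping, which is what makes the outline work. The sketch below (with arbitrary stand-in integrands, an illustrative choice) builds the increments of N from K against W and checks that summing H against dN agrees with summing HK against dW up to floating-point rounding; the limits in Note 8 carry this over to the stochastic integrals themselves.

```python
import numpy as np

# Discrete analogue of (12.8): if dN = K dW step by step, then summing H
# against dN is the same as summing H*K against dW (up to float rounding).
rng = np.random.default_rng(4)
n = 1_000
dW = rng.normal(0.0, np.sqrt(1.0 / n), n)
H = rng.uniform(-1.0, 1.0, n)  # stand-ins for the integrands
K = rng.uniform(-1.0, 1.0, n)

dN = K * dW
lhs = np.sum(H * dN)
rhs = np.sum(H * K * dW)
print(abs(lhs - rhs))  # essentially 0
```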




13. Ito's formula.
Suppose W_t is a Brownian motion and f : R → R is a C^2 function, that is, f and its
first two derivatives are continuous. Ito's formula, which is sometimes known as the change
of variables formula, says that

f(W_t) - f(W_0) = ∫_0^t f'(W_s) dW_s + (1/2) ∫_0^t f''(W_s) ds.

Compare this with the fundamental theorem of calculus:

f(t) - f(0) = ∫_0^t f'(s) ds.

In Ito's formula we have a second order term to carry along.
The idea behind the proof is quite simple. By Taylor's theorem,

f(W_t) - f(W_0) = Σ_{i=0}^{n-1} [f(W_{(i+1)t/n}) - f(W_{it/n})]
    ≈ Σ_{i=0}^{n-1} f'(W_{it/n})(W_{(i+1)t/n} - W_{it/n})
      + (1/2) Σ_{i=0}^{n-1} f''(W_{it/n})(W_{(i+1)t/n} - W_{it/n})^2.

The first sum on the right is approximately the stochastic integral and the second is
approximately the quadratic variation.
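The Taylor-expansion heuristic can be checked on a simulated path. The sketch below (with the arbitrary choice f(x) = sin(x), an illustrative example) discretizes a Brownian path on a fine grid and compares f(W_t) - f(W_0) with the first-order sum plus half the second-order sum.

```python
import numpy as np

# Discrete check of the Taylor-expansion argument for f(x) = sin(x):
# f(W_t) - f(W_0) is close to sum f'(W)(dW) + (1/2) sum f''(W)(dW)^2,
# matching the two terms in Ito's formula.
rng = np.random.default_rng(2)
t, n = 1.0, 100_000
dW = rng.normal(0.0, np.sqrt(t / n), n)
W = np.concatenate([[0.0], np.cumsum(dW)])

lhs = np.sin(W[-1]) - np.sin(W[0])
rhs = np.sum(np.cos(W[:-1]) * dW) + 0.5 * np.sum(-np.sin(W[:-1]) * dW**2)
print(lhs, rhs)  # nearly equal
```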
For a more general semimartingale X_t = M_t + A_t, Ito's formula reads

Theorem 13.1. If f ∈ C^2, then

f(X_t) - f(X_0) = ∫_0^t f'(X_s) dX_s + (1/2) ∫_0^t f''(X_s) d⟨M⟩_s.


Let us look at an example. Let W_t be Brownian motion, X_t = σW_t - σ^2 t/2, and
f(x) = e^x. Then ⟨X⟩_t = ⟨σW⟩_t = σ^2 t, f'(x) = f''(x) = e^x, and

e^{σW_t - σ^2 t/2} = 1 + ∫_0^t e^{X_s} σ dW_s - (1/2) ∫_0^t e^{X_s} σ^2 ds    (13.1)
                       + (1/2) ∫_0^t e^{X_s} σ^2 ds
                   = 1 + ∫_0^t e^{X_s} σ dW_s.

This example will be revisited many times later on.
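Since the ds terms cancel in (13.1), e^{σW_t - σ^2 t/2} is a stochastic integral starting at 1, hence a martingale with constant mean 1. The sketch below checks this by Monte Carlo; the values of σ and t are arbitrary choices for illustration.

```python
import numpy as np

# Check that e^{sigma W_t - sigma^2 t / 2} has mean 1, consistent with its
# representation as 1 plus a stochastic integral (a martingale started at 1).
rng = np.random.default_rng(3)
sigma, t, n = 0.5, 1.0, 500_000
W_t = rng.normal(0.0, np.sqrt(t), n)
mean = np.exp(sigma * W_t - 0.5 * sigma**2 * t).mean()
print(mean)  # close to 1
```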
