
An example in the Markov chain setting might help. No knowledge of Markov chains
is necessary to understand this. Suppose we have a Markov chain with 3 states, A, B, and
C. Suppose we have a probability P and three different Markov chains. The first, called
X_n^A, represents the position at time n for the chain started at A. So X_0^A = A, and X_1^A can
be one of A, B, C, X_2^A can be one of A, B, C, and so on. Similarly we have X_n^B, the chain
started at B, and X_n^C. Define Ω = {(AAA), (AAB), (ABA), . . . , (BAA), (BAB), . . .}.
So Ω denotes the possible sequences of states for times n = 0, 1, 2. If ω = ABA, set
Y_0(ω) = A, Y_1(ω) = B, Y_2(ω) = A, and similarly for all the other 26 values of ω. Define
P_A(AAA) = P(X_0^A = A, X_1^A = A, X_2^A = A). Similarly define P_A(AAB), . . .. Define
P_B(AAA) = P(X_0^B = A, X_1^B = A, X_2^B = A) (this will be 0 because we know X_0^B = B),
and similarly for the other values of ω. We also define P_C. So we now have one process,
Y_n, and three probabilities P_A, P_B, P_C. As you can see, there really isn't all that much
going on here.
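To make this bookkeeping concrete, here is a minimal Python sketch (not from the text); the transition matrix Q below is a hypothetical choice used only for illustration, as is the convention that the chain started at a state sits at that state at time 0.

    from itertools import product

    states = ['A', 'B', 'C']
    # Hypothetical transition matrix: Q[i][j] = P(next state = j | current state = i).
    Q = {'A': {'A': 0.5, 'B': 0.3, 'C': 0.2},
         'B': {'A': 0.2, 'B': 0.5, 'C': 0.3},
         'C': {'A': 0.3, 'B': 0.2, 'C': 0.5}}

    # Omega: all sequences of states for times n = 0, 1, 2 (27 of them).
    Omega = [''.join(w) for w in product(states, repeat=3)]

    def P_started_at(start):
        """Probability measure on Omega for the chain started at `start`."""
        def P(omega):
            if omega[0] != start:          # e.g. P_B(AAA) = 0
                return 0.0
            p = 1.0
            for cur, nxt in zip(omega, omega[1:]):
                p *= Q[cur][nxt]
            return p
        return P

    P_A, P_B, P_C = (P_started_at(s) for s in states)

    # The single coordinate process Y_n(omega) is just omega[n].
    print(P_A('ABA'))                      # = Q[A][B] * Q[B][A]
    print(P_B('AAA'))                      # = 0, since this chain started at B
    print(sum(P_A(w) for w in Omega))      # = 1.0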
Here is another formulation of the Markov property.
Proposition 17.2. If s < t and f is bounded or nonnegative, then

E^x[f(X_t) | F_s] = E^{X_s}[f(X_{t−s})],   a.s.

The right hand side is to be interpreted as follows. Define φ(x) = E^x f(X_{t−s}). Then
E^{X_s} f(X_{t−s}) means φ(X_s(ω)). One often writes P_t f(x) for E^x f(X_t). We prove this in
Note 3.
This formula generalizes: If s < t < u, then

E^x[f(X_t)g(X_u) | F_s] = E^{X_s}[f(X_{t−s})g(X_{u−s})],

and so on for functions of X at more times.
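As a sanity check (not part of the text), here is a small Monte Carlo sketch of Proposition 17.2 in the Brownian motion case, where under P^x the process X_t is a Brownian motion started at x. The choices f = cos, x, s, t below are arbitrary; conditioning on F_s amounts to fixing X_s and averaging over the independent increment X_t − X_s.

    import numpy as np

    rng = np.random.default_rng(0)
    x, s, t = 1.0, 0.5, 2.0
    f = np.cos                     # any bounded f

    # phi(y) = E^y[f(X_{t-s})] for Brownian motion: average f(y + W_{t-s}).
    def phi(y, n=200_000):
        return f(y + np.sqrt(t - s) * rng.standard_normal(n)).mean()

    # Sample one value of X_s, then evaluate E^x[f(X_t) | F_s] on that sample by
    # averaging f over the independent increment of variance t - s.
    Xs = x + np.sqrt(s) * rng.standard_normal()
    lhs = f(Xs + np.sqrt(t - s) * rng.standard_normal(200_000)).mean()
    print(lhs, phi(Xs))            # should agree up to Monte Carlo error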

Using Proposition 17.1, the statement and proof of Proposition 17.2 can be extended
to stopping times.

Proposition 17.3. If T is a bounded stopping time, then

E^x[f(X_{T+t}) | F_T] = E^{X_T}[f(X_t)].


We can also establish the Markov property and strong Markov property in the
context of solutions of stochastic differential equations. If we let X_t^x denote the solution
to

    X_t^x = x + ∫_0^t σ(X_s^x) dW_s + ∫_0^t b(X_s^x) ds,

so that X_t^x is the solution of the SDE started at x, we can define new probabilities by

    P^x(X_{t_1} ∈ A_1, . . . , X_{t_n} ∈ A_n) = P(X_{t_1}^x ∈ A_1, . . . , X_{t_n}^x ∈ A_n).



This is similar to what we did in defining P^x for Brownian motion, but here we do not
have translation invariance. One can show that when there is uniqueness for the solution
to the SDE, the family (P^x, X_t) satisfies the Markov and strong Markov property. The
statement is precisely the same as the statement of Proposition 17.3.
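A minimal Euler–Maruyama sketch of this setup, with placeholder coefficients σ and b chosen only for illustration (they are not from the text); it simulates X_t^x and estimates a probability of the form P^x(X_t ∈ A) empirically.

    import numpy as np

    def euler_maruyama(x, sigma, b, t=1.0, n_steps=500, rng=None):
        """Approximate X_t^x solving dX = sigma(X) dW + b(X) dt, X_0 = x."""
        rng = rng or np.random.default_rng()
        dt = t / n_steps
        X = x
        for _ in range(n_steps):
            dW = np.sqrt(dt) * rng.standard_normal()
            X = X + sigma(X) * dW + b(X) * dt
        return X

    # Placeholder coefficients (Ornstein-Uhlenbeck-like), purely for illustration.
    sigma = lambda x: 0.3
    b = lambda x: -0.5 * x

    # "P^x(X_1 in A)" estimated empirically: simulate many copies started at x = 2.
    rng = np.random.default_rng(1)
    samples = np.array([euler_maruyama(2.0, sigma, b, rng=rng) for _ in range(2000)])
    print((samples > 1.0).mean())   # estimate of P^{2.0}(X_1 > 1.0)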

Note 1. We want to show G_N = F_N. Since G_N is the smallest σ-field with respect to which
X_N is measurable for all adapted sequences X_k and it is easy to see that F_N is a σ-field, to
show G_N ⊂ F_N, it suffices to show that X_N is measurable with respect to F_N whenever X_k
is adapted. Therefore we need to show that for such a sequence X_k and any real number a,
the event (X_N > a) ∈ F_N.

Now (X_N > a) ∩ (N = j) = (X_j > a) ∩ (N = j). The event (X_j > a) ∈ F_j
since X is an adapted sequence. Since N is a stopping time, then (N ≤ j) ∈ F_j and
(N ≤ j − 1)^c ∈ F_{j−1} ⊂ F_j, and so the event (N = j) = (N ≤ j) ∩ (N ≤ j − 1)^c is in F_j. If
j ≤ k, then (N = j) ∈ F_j ⊂ F_k. Therefore

    (X_N > a) ∩ (N ≤ k) = ∪_{j=0}^k ((X_N > a) ∩ (N = j)) ∈ F_k,

which proves that (X_N > a) ∈ F_N.

To show F_N ⊂ G_N, we suppose that A ∈ F_N. Let X_k = 1_{A ∩ (N ≤ k)}. Since A ∈ F_N,
then A ∩ (N ≤ k) ∈ F_k, so X_k is F_k measurable. But X_N = 1_{A ∩ (N ≤ N)} = 1_A, so A = (X_N >
0) ∈ G_N. We have thus shown that F_N ⊂ G_N, and combining with the previous paragraph,
we conclude F_N = G_N.

Note 2. Let T_n be defined by T_n(ω) = (k + 1)/2^n if T(ω) ∈ [k/2^n, (k + 1)/2^n). It is easy
to check that T_n is a stopping time. Let f be continuous and A ∈ F_T. Then A ∈ F_{T_n} as
well. We have

    E[f(X_{T_n+t} − X_{T_n}); A] = Σ_k E[f(X_{k/2^n + t} − X_{k/2^n}); A ∩ (T_n = k/2^n)]
                                 = Σ_k E[f(X_{k/2^n + t} − X_{k/2^n})] P(A ∩ (T_n = k/2^n))
                                 = E f(X_t) P(A).

Let n → ∞, so

    E[f(X_{T+t} − X_T); A] = E f(X_t) P(A).

Taking limits, this equation holds for all bounded f.

If we take A = Ω and f = 1_B, we see that X_{T+t} − X_T has the same distribution as X_t,
which is that of a mean 0 variance t normal random variable. If we let A ∈ F_T be arbitrary
and f = 1_B, we see that

    P(X_{T+t} − X_T ∈ B, A) = P(X_t ∈ B) P(A) = P(X_{T+t} − X_T ∈ B) P(A),

which implies that X_{T+t} − X_T is independent of F_T.
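Here is a rough simulation sketch of this conclusion (not from the text). The stopping time T is taken to be the first time a discretized Brownian path reaches a level, capped at a fixed horizon so that it is bounded; the values of level, horizon and extra_t are arbitrary, and discretization introduces a small bias.

    import numpy as np

    rng = np.random.default_rng(2)
    dt, level, horizon, extra_t = 0.01, 0.5, 1.0, 0.25   # arbitrary illustration values
    n_paths, n_extra = 5000, round(extra_t / dt)

    post_increments = []
    for _ in range(n_paths):
        W, elapsed = 0.0, 0.0
        # T = first time the discretized path reaches `level`, capped at `horizon`,
        # so that T is a bounded stopping time.
        while W < level and elapsed < horizon:
            W += np.sqrt(dt) * rng.standard_normal()
            elapsed += dt
        W_T = W
        # Continue the same path for a further extra_t units of time.
        for _ in range(n_extra):
            W += np.sqrt(dt) * rng.standard_normal()
        post_increments.append(W - W_T)

    post_increments = np.array(post_increments)
    # X_{T+t} - X_T should be roughly N(0, extra_t), whatever happened before T.
    print(post_increments.mean(), post_increments.var())   # ~ 0 and ~ 0.25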

Note 3. Before proving Proposition 17.2, recall from undergraduate analysis that every
bounded function is the limit of linear combinations of functions e^{iux}, u ∈ R. This follows
from using the inversion formula for Fourier transforms. There are various slightly different
formulas for the Fourier transform. We use f̂(u) = ∫ e^{iux} f(x) dx. If f is smooth enough and
has compact support, then one can recover f by the formula

    f(x) = (1/2π) ∫ e^{−iux} f̂(u) du.

We can first approximate this improper integral by

    (1/2π) ∫_{−N}^{N} e^{−iux} f̂(u) du

by taking N larger and larger. For each N we can approximate (1/2π) ∫_{−N}^{N} e^{−iux} f̂(u) du by using
Riemann sums. Thus we can approximate f(x) by a linear combination of terms of the form
e^{iu_j x}. Finally, bounded functions can be approximated by smooth functions with compact
support.
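A small numerical sketch of this approximation chain (truncate the inversion integral, then use a Riemann sum); it is not from the text, and it uses a Gaussian as a rapidly decaying stand-in for a smooth compactly supported f.

    import numpy as np

    f = lambda x: np.exp(-x**2 / 2)        # smooth, rapidly decaying test function

    # Forward transform f_hat(u) = int e^{iux} f(x) dx, by a Riemann sum on a wide grid.
    x = np.linspace(-10, 10, 4001)
    dx = x[1] - x[0]
    f_hat = lambda u: np.sum(np.exp(1j * u * x) * f(x)) * dx

    # Inversion, truncated to [-N, N] and discretized: a linear combination of e^{-i u_j x0}.
    N, M = 8.0, 400
    u = np.linspace(-N, N, M)
    du = u[1] - u[0]
    x0 = 0.7
    approx = np.sum(np.exp(-1j * u * x0) * np.array([f_hat(uj) for uj in u])) * du / (2 * np.pi)
    print(approx.real, f(x0))              # the two values should be close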
Proof. Let f(x) = e^{iux}. Then

    E^x[e^{iuX_t} | F_s] = e^{iuX_s} E^x[e^{iu(X_t − X_s)} | F_s]
                         = e^{iuX_s} e^{−u^2(t−s)/2}.

On the other hand,

    φ(y) = E^y[f(X_{t−s})] = E[e^{iu(W_{t−s} + y)}] = e^{iuy} e^{−u^2(t−s)/2}.

So φ(X_s) = E^x[e^{iuX_t} | F_s]. Using linearity and taking limits, we have the lemma for all f.


18. Martingale representation theorem.
In this section we want to show that every random variable that is F_t measurable
can be written as a stochastic integral with respect to Brownian motion. In the next section
we use this to show that under the model of geometric Brownian motion the market is complete.
This means that no matter what option one comes up with, one can exactly replicate the
result (no matter what the market does) by buying and selling shares of stock.
In mathematical terms, we let F_t be the σ-field generated by W_s, s ≤ t. From (16.2)
we see that F_t is also the same as the σ-field generated by S_s, s ≤ t, so it doesn't matter
which one we work with. We want to show that if V is F_t measurable, then there exists
H_s adapted such that

    V = V_0 + ∫ H_s dW_s,                                   (18.1)

where V_0 is a constant.
Our goal is to prove

Theorem 18.1. If V is F_t measurable and E V^2 < ∞, then there exists a constant c and
an adapted integrand H_s with E ∫_0^t H_s^2 ds < ∞ such that

    V = c + ∫_0^t H_s dW_s.


Before we prove this, let us explain why this is called a martingale representation
theorem. Suppose M_s is a martingale adapted to F_s, where the F_s are the σ-fields generated
by a Brownian motion. Suppose also that E M_t^2 < ∞. Set V = M_t. By Theorem 18.1, we
can write

    M_t = V = c + ∫_0^t H_s dW_s.

The stochastic integral is a martingale, so for r ≤ t,

    M_r = E[M_t | F_r] = c + E[ ∫_0^t H_s dW_s | F_r ] = c + ∫_0^r H_s dW_s.

We already knew that stochastic integrals were martingales; what this says is the converse:
every martingale can be represented as a stochastic integral. Don't forget that we need
E M_t^2 < ∞ and M_s adapted to the σ-fields of a Brownian motion.
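A concrete example of such a representation, included here only as an illustration and using Itô's formula: take V = W_t^2. Since W_t^2 = 2 ∫_0^t W_s dW_s + t, we have c = t = E V and H_s = 2W_s. For r ≤ t, the martingale M_r = E[W_t^2 | F_r] = W_r^2 + (t − r) is then recovered as c + ∫_0^r 2W_s dW_s = t + (W_r^2 − r).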
In Note 1 we show that if every martingale can be represented as a stochastic
integral, then every random variable V that is F_t measurable can, too, provided E V^2 < ∞.
There are several proofs of Theorem 18.1. Unfortunately, they are all technical. We
outline one proof here, giving details in the notes. We start with the following, proved in
Note 2.

Proposition 18.2. Suppose

    V^n = c_n + ∫_0^t H_s^n dW_s,

    c_n → c,

    E|V^n − V|^2 → 0,

and for each n the process H^n is adapted with E ∫_0^t (H_s^n)^2 ds < ∞. Then there exist a
constant c and an adapted H_s with E ∫_0^t H_s^2 ds < ∞ so that

    V = c + ∫_0^t H_s dW_s.

What this proposition says is that if we can represent a sequence of random variables V^n
and V^n → V, then we can represent V.
Let R be the collection of random variables that can be represented as stochastic
integrals. By this we mean

    R = {V : E V^2 < ∞, V is F_t measurable, V = c + ∫_0^t H_s dW_s
            for some adapted H with E ∫_0^t H_s^2 ds < ∞}.

Next we show R contains a particular collection of random variables. (The proof is
in Note 3.)
Proposition 18.3. If g is bounded, the random variable g(W_t) is in R.
An almost identical proof shows that if f is bounded, then

    f(W_t − W_s) = c + ∫_s^t H_r dW_r

for some c and H_r.
Proposition 18.4. If t_0 ≤ t_1 ≤ · · · ≤ t_n ≤ t and f_1, . . . , f_n are bounded functions, then
f_1(W_{t_1} − W_{t_0}) f_2(W_{t_2} − W_{t_1}) · · · f_n(W_{t_n} − W_{t_{n−1}}) is in R.
See Note 4 for the proof.
We now ¬nish the proof of Theorem 18.1. We have shown that a large class of
random variables is contained in R.
Proof of Theorem 18.1. We have shown that random variables of the form

    f_1(W_{t_1} − W_{t_0}) f_2(W_{t_2} − W_{t_1}) · · · f_n(W_{t_n} − W_{t_{n−1}})           (18.2)

are in R. Clearly if V_i ∈ R for i = 1, . . . , m, and a_i are constants, then a_1 V_1 + · · · + a_m V_m is
also in R. Finally, from measure theory we know that if E V^2 < ∞ and V is F_t measurable,
we can find a sequence V_k such that E|V_k − V|^2 → 0 and each V_k is a linear combination
of random variables of the form given in (18.2). Now apply Proposition 18.2.
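The proof above is non-constructive. For V = g(W_t) with g smooth and bounded there is a classical explicit choice, H_s = ∂_y E[g(y + W_{t−s})] evaluated at y = W_s; this fact is not proved in these notes and is included only as a numerical illustration, with g = tanh as an arbitrary test function.

    import numpy as np

    rng = np.random.default_rng(3)
    g = np.tanh                              # bounded smooth test function (illustration only)
    g_prime = lambda x: 1.0 - np.tanh(x) ** 2
    t, n_steps = 1.0, 2000
    dt = t / n_steps
    Z = rng.standard_normal(4000)            # reused draws for the inner expectations

    def H(s, y):
        # H_s = d/dy E[g(y + W_{t-s})], computed as E[g'(y + sqrt(t-s) * Z)].
        return g_prime(y + np.sqrt(t - s) * Z).mean()

    c = g(np.sqrt(t) * Z).mean()             # c = E[g(W_t)]

    # Simulate one Brownian path and accumulate the stochastic integral sum H dW.
    W, integral = 0.0, 0.0
    for i in range(n_steps):
        s = i * dt
        dW = np.sqrt(dt) * rng.standard_normal()
        integral += H(s, W) * dW
        W += dW

    print(g(W), c + integral)                # should agree up to discretization/MC error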


Note 1. Suppose we know that every martingale M_s adapted to F_s with E M_t^2 < ∞ can be
represented as M_r = c + ∫_0^r H_s dW_s for some suitable H. If V is F_t measurable with E V^2 < ∞,
let M_r = E[V | F_r]. We know this is a martingale, so

    M_r = c + ∫_0^r H_s dW_s

for suitable H. Applying this with r = t,

    V = E[V | F_t] = M_t = c + ∫_0^t H_s dW_s.

Note 2. We prove Proposition 18.2. By our assumptions,

    E|(V^n − c_n) − (V^m − c_m)|^2 → 0

as n, m → ∞. So

    E ( ∫_0^t (H_s^n − H_s^m) dW_s )^2 → 0.
