Stochastic Integration and Differential Equations, 2nd ed. (Protter)
DOCUMENT INFORMATION

Basic information

Title: Stochastic Integration and Differential Equations
Publisher: Springer Berlin Heidelberg, New York, Hong Kong, London, Milan, Paris, Tokyo
Subject: Stochastic Integration and Differential Equations
Type: Textbook
Year: 1990
Pages: 430
Size: 18.55 MB

Contents


Applications of Mathematics 21

Stochastic Modelling and Applied Probability

Stochastic Mechanics, Random Media, Signal Processing and Image Synthesis, Mathematical Economics and Finance, Stochastic Optimization, Stochastic Control, Stochastic Models in Life Sciences

Applications of Mathematics

1 Fleming/Rishel, Deterministic and Stochastic Optimal Control (1975)
2 Marchuk, Methods of Numerical Mathematics (1975, 2nd ed. 1982)
3 Balakrishnan, Applied Functional Analysis (1976, 2nd ed. 1981)
4 Borovkov, Stochastic Processes in Queueing Theory (1976)
5 Liptser/Shiryaev, Statistics of Random Processes I: General Theory (1977, 2nd ed. 2001)
6 Liptser/Shiryaev, Statistics of Random Processes II: Applications (1978, 2nd ed. 2001)
7 Vorob'ev, Game Theory: Lectures for Economists and Systems Scientists (1977)
8 Shiryaev, Optimal Stopping Rules (1978)
9 Ibragimov/Rozanov, Gaussian Random Processes (1978)
10 Wonham, Linear Multivariable Control: A Geometric Approach (1979, 2nd ed. 1985)
11 Hida, Brownian Motion (1980)
12 Hestenes, Conjugate Direction Methods in Optimization (1980)
13 Kallianpur, Stochastic Filtering Theory (1980)
14 Krylov, Controlled Diffusion Processes (1980)
15 Prabhu, Stochastic Storage Processes: Queues, Insurance Risk, and Dams (1980)
16 Ibragimov/Has'minskii, Statistical Estimation: Asymptotic Theory (1981)
17 Cesari, Optimization: Theory and Applications (1982)
18 Elliott, Stochastic Calculus and Applications (1982)
19 Marchuk/Shaidourov, Difference Methods and Their Extrapolations (1983)
20 Hijab, Stabilization of Control Systems (1986)
21 Protter, Stochastic Integration and Differential Equations (1990, 2nd ed. 2003)
22 Benveniste/Métivier/Priouret, Adaptive Algorithms and Stochastic Approximations (1990)
23 Kloeden/Platen, Numerical Solution of Stochastic Differential Equations (1992, corr. 3rd printing 1999)
24 Kushner/Dupuis, Numerical Methods for Stochastic Control Problems in Continuous Time (1992)
25 Fleming/Soner, Controlled Markov Processes and Viscosity Solutions (1993)
26 Baccelli/Brémaud, Elements of Queueing Theory (1994, 2nd ed. 2003)
27 Winkler, Image Analysis, Random Fields and Dynamic Monte Carlo Methods (1995, 2nd ed. 2003)
28 Kalpazidou, Cycle Representations of Markov Processes (1995)
29 Elliott/Aggoun/Moore, Hidden Markov Models: Estimation and Control (1995)
30 Hernandez-Lerma/Lasserre, Discrete-Time Markov Control Processes (1995)
31 Devroye/Györfi/Lugosi, A Probabilistic Theory of Pattern Recognition (1996)
32 Maitra/Sudderth, Discrete Gambling and Stochastic Games (1996)
33 Embrechts/Klüppelberg/Mikosch, Modelling Extremal Events for Insurance and Finance (1997, corr. 4th printing 2003)
34 Duflo, Random Iterative Models (1997)
35 Kushner/Yin, Stochastic Approximation Algorithms and Applications (1997)
36 Musiela/Rutkowski, Martingale Methods in Financial Modelling (1997)
37 Yin, Continuous-Time Markov Chains and Applications (1998)
38 Dembo/Zeitouni, Large Deviations Techniques and Applications (1998)
39 Karatzas, Methods of Mathematical Finance (1998)
40 Fayolle/Iasnogorodski/Malyshev, Random Walks in the Quarter-Plane (1999)
41 Aven/Jensen, Stochastic Models in Reliability (1999)
42 Hernandez-Lerma/Lasserre, Further Topics on Discrete-Time Markov Control Processes (1999)
43 Yong/Zhou, Stochastic Controls: Hamiltonian Systems and HJB Equations (1999)
44 Serfozo, Introduction to Stochastic Networks (1999)
45 Steele, Stochastic Calculus and Financial Applications (2001)
46 Chen/Yao, Fundamentals of Queueing Networks: Performance, Asymptotics, and Optimization (2001)
47 Kushner, Heavy Traffic Analysis of Controlled Queueing and Communications Networks (2001)
48 Fernholz, Stochastic Portfolio Theory (2002)
49 Kabanov/Pergamenshchikov, Two-Scale Stochastic Systems (2003)
50 Han, Information-Spectrum Methods in Information Theory (2003)

(continued after index)

Philip E. Protter

Stochastic Integration and Differential Equations

Second Edition

Springer

Center for Applied Mathematical Sciences
University of Southern California
1042 West 36th Place, Denney Research Building 308
Los Angeles, CA 90089, USA

Laboratoire de Probabilités et Modèles Aléatoires
Université de Paris VI
175, rue du Chevaleret
75013 Paris, France

Mathematics Subject Classification (2000): Primary: 60H05, 60H10, 60H20; Secondary: 60G07, 60G17, 60G44, 60G51

Cover pattern by courtesy of Rick Durrett (Cornell University, Ithaca)

Cataloging-in-Publication Data applied for

A catalog record for this book is available from the Library of Congress

Bibliographic information published by Die Deutsche Bibliothek

Die Deutsche Bibliothek lists this publication in the Deutsche Nationalbibliografie;

detailed bibliographic data is available in the Internet at http://dnb.ddb.de

ISSN 0172-4568

ISBN 3-540-00313-4 Springer-Verlag Berlin Heidelberg New York

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilm or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer-Verlag. Violations are liable for prosecution under the German Copyright Law.

Springer-Verlag Berlin Heidelberg New York

a member of BertelsmannSpringer Science + Business Media GmbH

© Springer-Verlag Berlin Heidelberg 2004

Printed in Germany

The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Cover design: Erich Kirchner, Heidelberg

Typesetting by the author using a Springer TEX macro package

Printed on acid-free paper


To Diane and Rachel


Preface to the Second Edition

It has been thirteen years since the first edition was published, with its subtitle "a new approach." While the book has had some success, there are still almost no other books that use the same approach. (See however the recent book by K. Bichteler [15].) There are nevertheless of course other extant books, many of them quite good, although the majority still are devoted primarily to the case of continuous sample paths, and others treat stochastic integration as one of many topics. Examples of alternative texts which have appeared since the first edition of this book are: [32], [44], [87], [110], [186], [180], [208], [216], and [226]. While the subject has not changed much, there have been new developments, and subjects we thought unimportant in 1990 and did not include, we now think important enough either to include or to expand in this book.

The most obvious changes in this edition are that we have added exercises at the end of each chapter, and we have also added Chap. VI which introduces the expansion of filtrations. However we have also completely rewritten Chap. III. In the first edition we followed an elementary approach which was P. A. Meyer's original approach before the methods of Doléans-Dade. In order to remain friends with Freddy Delbaen, and also because we now agree with him, we have instead used the modern approach of predictability rather than naturality. However we benefited from the new proof of the Doob-Meyer Theorem due to R. Bass, which ultimately uses only Doob's quadratic martingale inequality, and in passing reveals the role played by totally inaccessible stopping times. The treatment of Girsanov's theorem now includes the case where the two probability measures are not necessarily equivalent, and we include the Kazamaki-Novikov theorems. We have also added a section on compensators, with examples. In Chap. IV we have expanded our treatment of martingale representation to include the Jacod-Yor Theorem, and this has allowed us to use the Émery-Azéma martingales as a class of examples of martingales with the martingale representation property. Also, largely because of the Delbaen-Schachermayer theory of the fundamental theorems of mathematical finance, we have included the topic of sigma martingales. In Chap. V we added a section which includes some useful results about the solutions of stochastic differential equations, inspired by the review of the first edition by E. Pardoux [191]. We have also made small changes throughout the book; for instance we have included specific examples of Lévy processes and their corresponding Lévy measures, in Sect. 4 of Chap. I.

The exercises are gathered at the end of the chapters, in no particular order. Some of the (presumed) harder problems we have designated with a star (*), and occasionally we have used two stars (**). While of course many of the problems are of our own creation, a significant number are theorems or lemmas taken from research papers, or taken from other books. We do not attempt to ascribe credit, other than listing the sources in the bibliography, primarily because they have been gathered over the past decade and often we don't remember from where they came. We have tried systematically to refrain from relegating a needed lemma as an exercise; thus in that sense the exercises are independent from the text, and (we hope) serve primarily to illustrate the concepts and possible applications of the theorems.

Last, we have the pleasant task of thanking the numerous people who helped with this book, either by suggesting improvements, finding typos and mistakes, alerting me to references, or by reading chapters and making comments. We wish to thank patient students both at Purdue University and Cornell University who have been subjected to preliminary versions over the years, and the following individuals: C. Benes, R. Cont, F. Diener, M. Diener, R. Durrett, T. Fujiwara, K. Giesecke, L. Goldberg, R. Haboush, J. Jacod, H. Kraft, K. Lee, J. Ma, J. Mitro, J. Rodriguez, K. Schürger, D. Sezer, J. A. Trujillo Ferreras, R. Williams, M. Yor, and Yong Zeng. Th. Jeulin, K. Shimbo, and Yan Zeng gave extraordinary help, and my editor C. Byrne gives advice and has patience that is impressive. Over the last decade I have learned much from many discussions with Darrell Duffie, Jean Jacod, Tom Kurtz, and Denis Talay, and this no doubt is reflected in this new edition. Finally, I wish to give a special thanks to M. Kozdron who hastened the appearance of this book through his superb help with TeX, as well as his own advice on all aspects of the book.

Ithaca, NY

August 2003

Philip Protter


Preface to the First Edition

The idea of this book began with an invitation to give a course at the Third Chilean Winter School in Probability and Statistics, at Santiago de Chile, in July, 1984. Faced with the problem of teaching stochastic integration in only a few weeks, I realized that the work of C. Dellacherie [42] provided an outline for just such a pedagogic approach. I developed this into a series of lectures (Protter [201]), using the work of K. Bichteler [14], E. Lenglart [145] and P. Protter [202], as well as that of Dellacherie. I then taught from these lecture notes, expanding and improving them, in courses at Purdue University, the University of Wisconsin at Madison, and the University of Rouen in France. I take this opportunity to thank these institutions and Professor Rolando Rebolledo for my initial invitation to Chile.

This book assumes the reader has some knowledge of the theory of stochastic processes, including elementary martingale theory. While we have recalled the few necessary martingale theorems in Chap. I, we have not provided proofs, as there are already many excellent treatments of martingale theory readily available (e.g., Breiman [23], Dellacherie-Meyer [45, 46], or Ethier-Kurtz [71]). There are several other texts on stochastic integration, all of which adopt to some extent the usual approach and thus require the general theory. The books of Elliott [63], Kopp [130], Métivier [158], Rogers-Williams [210] and to a much lesser extent Letta [148] are examples. The books of McKean [153], Chung-Williams [32], and Karatzas-Shreve [121] avoid the general theory by limiting their scope to Brownian motion (McKean) and to continuous semimartingales.

Our hope is that this book will allow a rapid introduction to some of the deepest theorems of the subject, without first having to be burdened with the beautiful but highly technical "general theory of processes."

Many people have aided in the writing of this book, either through discussions or by reading one of the versions of the manuscript. I would like to thank J. Azéma, M. Barlow, A. Bose, M. Brown, C. Constantini, C. Dellacherie, D. Duffie, M. Émery, N. Falkner, E. Goggin, D. Gottlieb, A. Gut, S. He, J. Jacod, T. Kurtz, J. de Sam Lazaro, R. Léandre, E. Lenglart, G. Letta, S. Levantal, P. A. Meyer, E. Pardoux, H. Rubin, T. Sellke, R. Stockbridge, C. Stricker, P. Sundar, and M. Yor. I would especially like to thank J. San Martin for his careful reading of the manuscript in several of its versions.

Svante Janson read the entire manuscript in several versions, giving me support, encouragement, and wonderful suggestions, all of which improved the book. He also found, and helped to correct, several errors. I am extremely grateful to him, especially for his enthusiasm and generosity.

The National Science Foundation provided partial support throughout the writing of this book.

I wish to thank Judy Snider for her cheerful and excellent typing of several versions of this book.

Philip Protter


3 Elementary Examples of Semimartingales 54

Contents

5 Compensators 118

6 The Fundamental Theorem of Local Martingales 124

7 Classical Semimartingales 127

8 Girsanov's Theorem 131

9 The Bichteler-Dellacherie Theorem 143

Bibliographic Notes 147

Exercises for Chapter III 147

IV General Stochastic Integration and Local Times 153

1 Introduction 153

2 Stochastic Integration for Predictable Integrands 153

3 Martingale Representation 178

4 Martingale Duality and the Jacod-Yor Theorem on Martingale Representation 193

5 Examples of Martingale Representation 200

6 Stochastic Integration Depending on a Parameter 205

7 Local Times 210

8 Azéma's Martingale 227

9 Sigma Martingales 233

Bibliographic Notes 235

Exercises for Chapter IV 236

V Stochastic Differential Equations 243

1 Introduction 243

2 The Hp Norms for Semimartingales 244

3 Existence and Uniqueness of Solutions 249

4 Stability of Stochastic Differential Equations 257

5 Fisk-Stratonovich Integrals and Differential Equations 270

6 The Markov Nature of Solutions 291

7 Flows of Stochastic Differential Equations: Continuity and Differentiability 301

8 Flows as Diffeomorphisms: The Continuous Case 310

9 General Stochastic Exponentials and Linear Equations 321

10 Flows as Diffeomorphisms: The General Case 328

11 Eclectic Useful Results on Stochastic Differential Equations 338

Bibliographic Notes 347

Exercises for Chapter V 349

VI Expansion of Filtrations 355

1 Introduction 355

2 Initial Expansions 356

3 Progressive Expansions 369

4 Time Reversal 377

Bibliographic Notes 383

Exercises for Chapter VI 384

References 389

Subject Index 407


Introduction

In this book we present a new approach to the theory of modern stochastic integration. The novelty is that we define a semimartingale as a stochastic process which is a "good integrator" on an elementary class of processes, rather than as a process that can be written as the sum of a local martingale and an adapted process with paths of finite variation on compacts. This approach has the advantage over the customary approach of not requiring a close analysis of the structure of martingales as a prerequisite. This is a significant advantage because such an analysis of martingales itself requires a highly technical body of knowledge known as "the general theory of processes." Our approach has a further advantage of giving traditionally difficult and non-intuitive theorems (such as Stricker's Theorem) transparently simple proofs. We have tried to capitalize on the natural advantage of our approach by systematically choosing the simplest, least technical proofs and presentations. As an example we have used K. M. Rao's proofs of the Doob-Meyer decomposition theorems in Chap. III, rather than the more abstract but less intuitive Doléans-Dade measure approach.

In Chap. I we present preliminaries, including the Poisson process, Brownian motion, and Lévy processes. Naturally our treatment presents those properties of these processes that are germane to stochastic integration.

In Chap. II we define a semimartingale as a good integrator and establish many of its properties and give examples. By restricting the class of integrands to adapted processes having left continuous paths with right limits, we are able to give an intuitive Riemann-type definition of the stochastic integral as the limit of sums. This is sufficient to prove many theorems (and treat many applications) including a change of variables formula ("Itô's formula").

Chapter III is devoted to developing a minimal amount of "general theory" in order to prove the Bichteler-Dellacherie Theorem, which shows that our "good integrator" definition of a semimartingale is equivalent to the usual one as a process X having a decomposition X = M + A, into the sum of a local martingale M and an adapted process A having paths of finite variation on compacts. Nevertheless most of the theorems covered en route (Doob-Meyer, Meyer-Girsanov) are themselves key results in the theory. The core of the whole treatment is the Doob-Meyer decomposition theorem. We have followed the relatively recent proof due to R. Bass, which is especially simple for the case where the martingale jumps only at totally inaccessible stopping times, and in all cases uses no mathematical tool deeper than Doob's quadratic martingale inequality. This allows us to avoid the detailed treatment of natural processes which was ubiquitous in the first edition, although we still use natural processes from time to time, as they do simplify some proofs.

Using the results of Chap. III we extend the stochastic integral by continuity to predictable integrands in Chap. IV, thus making the stochastic integral a Lebesgue-type integral. We use predictable integrands to develop a theory of martingale representation. The theory we develop is an L2 theory, but we also prove that the dual of the martingale space H1 is BMO and then prove the Jacod-Yor Theorem on martingale representation, which in turn allows us to present a class of examples having both jumps and martingale representation. We also use predictable integrands to give a presentation of semimartingale local times.

Chapter V serves as an introduction to the enormous subject of stochastic differential equations. We present theorems on the existence and uniqueness of solutions as well as stability results. Fisk-Stratonovich equations are presented, as well as the Markov nature of the solutions when the differentials have Markov-type properties. The last part of the chapter is an introduction to the theory of flows, followed by moment estimates on the solutions, and other minor but useful results. Throughout Chap. V we have tried to achieve a balance between maximum generality and the simplicity of the proofs.

Chapter VI provides an introduction to the theory of the expansion of filtrations (known as "grossissements de filtrations" in the French literature). We present first a theory of initial expansions, which includes Jacod's Theorem. Jacod's Theorem gives a sufficient condition for semimartingales to remain semimartingales in the expanded filtration. We next present the more difficult theory of progressive expansion, which involves expanding filtrations to turn a random time into a stopping time, and then analyzing what happens to the semimartingales of the first filtration when considered in the expanded filtration. Last, we give an application of these ideas to time reversal.


Preliminaries

1 Basic Definitions and Notation

We assume as given a complete probability space (Ω, F, P). In addition we are given a filtration (F_t)_{0≤t≤∞}. By a filtration we mean a family of σ-algebras (F_t)_{0≤t≤∞} that is increasing, i.e., F_s ⊂ F_t if s ≤ t. For convenience, we will usually write 𝔽 for the filtration (F_t)_{0≤t≤∞}.

Definition. A filtered complete probability space (Ω, F, 𝔽, P) is said to satisfy the usual hypotheses if

(i) F_0 contains all the P-null sets of F;
(ii) F_t = ∩_{u>t} F_u, all t, 0 ≤ t < ∞; that is, the filtration 𝔽 is right continuous.

We always assume that the usual hypotheses hold.
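To make the increasing property F_s ⊂ F_t concrete in a discrete setting, each σ-algebra on a finite sample space can be represented by its generating partition; F_s ⊂ F_t then means that the partition at time t refines the partition at time s. The following Python sketch is our own illustration (not from the text), with uniform refinement chosen purely for readability:

```python
# A filtration on the finite sample space {0,...,7}: each F_n is generated
# by a partition, and the partitions refine as n grows (F_s ⊂ F_t for s <= t).
filtration = [
    [[0, 1, 2, 3, 4, 5, 6, 7]],        # F_0: the trivial sigma-algebra
    [[0, 1, 2, 3], [4, 5, 6, 7]],      # F_1
    [[0, 1], [2, 3], [4, 5], [6, 7]],  # F_2
]

def refines(fine, coarse):
    """Every cell of `fine` lies inside some cell of `coarse`."""
    return all(any(set(c).issubset(set(d)) for d in coarse) for c in fine)

# the increasing property of the filtration, checked cell by cell
for s in range(len(filtration) - 1):
    assert refines(filtration[s + 1], filtration[s])
```

Here a finer partition generates a larger σ-algebra, which is exactly the containment F_s ⊂ F_t of the definition above.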

Definition. A random variable T: Ω → [0, ∞] is a stopping time if the event {T ≤ t} ∈ F_t, every t, 0 ≤ t ≤ ∞.

One important consequence of the right continuity of the filtration is the following theorem.

Theorem 1. The event {T < t} ∈ F_t, 0 < t < ∞, if and only if T is a stopping time.

Proof. Since {T ≤ t} = ∩_{t<u<t+ε} {T < u}, any ε > 0, we have {T ≤ t} ∈ ∩_{u>t} F_u = F_t, so T is a stopping time. For the converse, {T < t} = ∪_{ε>0} {T ≤ t − ε}, and {T ≤ t − ε} ∈ F_{t−ε}, hence also in F_t.

A stochastic process X on (Ω, F, P) is a collection of ℝ-valued or ℝ^d-valued random variables (X_t)_{0≤t<∞}. The process X is said to be adapted if X_t ∈ F_t (that is, is F_t measurable) for each t. We must take care to be precise about the concept of equality of two stochastic processes.

Definition. Two stochastic processes X and Y are modifications if X_t = Y_t a.s., each t. Two processes X and Y are indistinguishable if a.s., for all t, X_t = Y_t.

If X and Y are indistinguishable, however, then there exists one null set N such that if ω ∉ N, then X_t(ω) = Y_t(ω) for all t. In other words, the functions t ↦ X_t(ω) and t ↦ Y_t(ω) are the same for all ω ∉ N, where P(N) = 0. The set N is in F_t, all t, since F_0 contains all the P-null sets of F. The functions t ↦ X_t(ω) mapping [0, ∞) into ℝ are called the sample paths of the stochastic process X.

Definition. A stochastic process X is said to be càdlàg if it a.s. has sample paths which are right continuous, with left limits. Similarly, a stochastic process X is said to be càglàd if it a.s. has sample paths which are left continuous, with right limits. (The nonsensical words càdlàg and càglàd are acronyms from the French for continu à droite, limites à gauche, and continu à gauche, limites à droite, respectively.)

Theorem 2. Let X and Y be two stochastic processes, with X a modification of Y. If X and Y have right continuous paths a.s., then X and Y are indistinguishable.

Proof. Let A be the null set where the paths of X are not right continuous, and let B be the analogous set for Y. Let N_t = {ω : X_t(ω) ≠ Y_t(ω)}, and let N = ∪_{t∈Q} N_t, where Q denotes the rationals in [0, ∞). Then P(N) = 0. Let M = A ∪ B ∪ N; then P(M) = 0. We have X_t(ω) = Y_t(ω) for all t ∈ Q, ω ∉ M. If t is not rational, let t_n decrease to t through Q. For ω ∉ M, X_{t_n}(ω) = Y_{t_n}(ω), each n, and X_t(ω) = lim_{n→∞} X_{t_n}(ω) = lim_{n→∞} Y_{t_n}(ω) = Y_t(ω). Since P(M) = 0, X and Y are indistinguishable.

Corollary. Let X and Y be two stochastic processes which are càdlàg. If X is a modification of Y, then X and Y are indistinguishable.

Càdlàg processes provide natural examples of stopping times.

Definition. Let X be a stochastic process and let A be a Borel set in ℝ. Define

T(ω) = inf{t > 0 : X_t ∈ A}.

Then T is called a hitting time of A for X.
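For a discrete-time path, the hitting time of an open set such as A = (a, ∞) can be computed directly from the trajectory. The following Python sketch is our own illustration (the random-walk model and the names are ours, not the book's); it checks the two defining features of a hitting time: the path stays outside A strictly before T, and T is determined by the path up to time T alone, so extending the path beyond T cannot change it:

```python
import random

def hitting_time(path, a):
    """First index n > 0 with path[n] in the open set (a, infinity); None if never."""
    for n in range(1, len(path)):
        if path[n] > a:
            return n
    return None

rng = random.Random(7)
# simulate a simple symmetric random walk S_0 = 0, S_n = S_{n-1} +/- 1
path = [0]
for _ in range(500):
    path.append(path[-1] + rng.choice([-1, 1]))

T = hitting_time(path, 4)
if T is not None:
    assert path[T] > 4                       # the path is in A at time T
    assert all(x <= 4 for x in path[1:T])    # and outside A strictly before T
    # stopping-time property: T depends only on the path up to time T,
    # so any continuation of the path after T yields the same T
    assert hitting_time(path[: T + 1] + [0] * 100, 4) == T
```

The last assertion is the discrete analogue of {T ≤ t} ∈ F_t: whether the walk has entered A by step n is decidable from the first n steps alone.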

Theorem 3. Let X be an adapted càdlàg stochastic process, and let A be an open set. Then the hitting time of A is a stopping time.

Proof. By Theorem 1 it suffices to show that {T < t} ∈ F_t, 0 < t < ∞. But

{T < t} = ∪_{s<t, s∈Q} {X_s ∈ A},

since A is open and X has right continuous paths. Since {X_s ∈ A} = X_s^{-1}(A) ∈ F_s, the result follows.


Theorem 4. Let X be an adapted càdlàg stochastic process, and let A be a closed set. Then the random variable

T(ω) = inf{t > 0 : X_t(ω) ∈ A or X_{t−}(ω) ∈ A}

is a stopping time.

Proof. By X_{t−}(ω) we mean lim_{s→t, s<t} X_s(ω). Let A_n = {x : d(x, A) < 1/n}, where d(x, A) denotes the distance from a point x to A. Then A_n is an open set and

{T ≤ t} = {X_t ∈ A} ∪ (∩_n ∪_{s<t, s∈Q} {X_s ∈ A_n}),

which belongs to F_t.

It is a very deep result that the hitting time of a Borel set is a stopping time. We do not have need of this result.

The next theorem collects elementary facts about stopping times; we leave the proof to the reader.

Theorem 5. Let S, T be stopping times. Then the following are stopping times: S ∧ T = min(S, T), S ∨ T = max(S, T), S + T, and αS for α > 1.

The stopping time σ-algebra defined next formalizes the notion of events that are observable before a random time.

Definition. Let T be a stopping time. The stopping time σ-algebra F_T is

F_T = {Λ ∈ F : Λ ∩ {T ≤ t} ∈ F_t, all t ≥ 0}.

Theorem 6. Let T be a stopping time. Then

F_T = σ{X_T ; X all adapted càdlàg processes}.

Proof. Let G = σ{X_T ; X all adapted càdlàg processes}, and let Λ ∈ F_T. Then X_t = 1_Λ 1_{[T,∞)}(t) is a càdlàg process, and X_T = 1_Λ.¹ Hence Λ ∈ G, and F_T ⊂ G.

For the converse, let X be an adapted càdlàg process. Being càdlàg and adapted, X is progressively measurable: for each t the mapping (s, ω) ↦ X_s(ω) on [0, t] × Ω is measurable from B([0, t]) ⊗ F_t into (ℝ, B), where B are the Borel sets of ℝ. Therefore {X_T ∈ B} ∩ {T ≤ t} is in F_t, and this implies X_T ∈ F_T. Therefore G ⊂ F_T.

¹ 1_Λ is the indicator function of Λ: 1_Λ(ω) = 1 if ω ∈ Λ, and 1_Λ(ω) = 0 if ω ∉ Λ.

We leave it to the reader to check that if S ≤ T a.s., then F_S ⊂ F_T, and the less obvious (and less important) fact that F_S ∩ F_T = F_{S∧T}.

If X and Y are càdlàg, then X_t = Y_t a.s., each t, implies that X and Y are indistinguishable, as we have already noted. Since fixed times are stopping times, obviously if X_T = Y_T a.s. for each finite stopping time T, then X and Y are indistinguishable. If X is càdlàg, let ΔX denote the process ΔX_t = X_t − X_{t−}. Then ΔX is not càdlàg, though it is adapted and for a.a. ω, t ↦ ΔX_t = 0 except for at most countably many t. We record here a useful result.

Theorem 7. Let X be adapted and càdlàg. If ΔX_T 1_{T<∞} = 0 a.s. for each stopping time T, then ΔX is indistinguishable from the zero process.

Proof. It suffices to prove the result on [0, t_0] for 0 < t_0 < ∞. The set {t : |ΔX_t| > 0} is countable a.s. since X is càdlàg. Moreover, for each n define inductively T^{n,0} = 0 and T^{n,j+1} = inf{t > T^{n,j} : |ΔX_t| > 1/n}; these are stopping times, and by hypothesis ΔX_{T^{n,j}} 1_{T^{n,j}<∞} = 0 a.s. Then T^{n,j+1} > T^{n,j} a.s. on {T^{n,j} < ∞}. Moreover,

{t : |ΔX_t| > 0} ⊂ ∪_n ∪_j [T^{n,j}],

where the right side of the equality is a countable union. The result follows.

Corollary. Let X and Y be adapted and càdlàg. If for each stopping time T, ΔX_T 1_{T<∞} = ΔY_T 1_{T<∞} a.s., then ΔX and ΔY are indistinguishable.

2 Martingales

A much more general version of Theorem 7 is true, but it is a very deep result which uses Meyer's "section theorems," and we will not have need of it. See, for example, Dellacherie [41] or Dellacherie-Meyer [45].

A fundamental theorem of measure theory that we will need from time to time is known as the Monotone Class Theorem. Actually there are several such theorems, but the one given here is sufficient for our needs.

Definition. A monotone vector space H on a space Ω is defined to be the collection of bounded, real-valued functions f on Ω satisfying the three conditions:

(i) H is a vector space over ℝ;
(ii) 1_Ω ∈ H (i.e., constant functions are in H); and
(iii) if (f_n)_{n≥1} ⊂ H, and 0 ≤ f_1 ≤ f_2 ≤ ⋯ ≤ f_n ≤ ⋯, and lim_{n→∞} f_n = f, and f is bounded, then f ∈ H.

Definition. A collection M of real functions defined on a space Ω is said to be multiplicative if f, g ∈ M implies that fg ∈ M.

For a collection of real-valued functions M defined on Ω, we let σ{M} denote the space of functions defined on Ω which are measurable with respect to the σ-algebra on Ω generated by {f^{-1}(A); A ∈ B(ℝ), f ∈ M}.

Theorem 8 (Monotone Class Theorem). Let M be a multiplicative class of bounded real-valued functions defined on a space Ω, and let A = σ{M}. If H is a monotone vector space containing M, then H contains all bounded, A measurable functions.

The filtration 𝔽 = (F_t)_{0≤t≤∞} is assumed to be right continuous.

Definition. A real-valued, adapted process X = (X_t)_{0≤t<∞} is called a martingale (resp. supermartingale, submartingale) with respect to the filtration 𝔽 if

(i) X_t ∈ L1(dP); that is, E{|X_t|} < ∞;
(ii) if s ≤ t, then E{X_t | F_s} = X_s, a.s. (resp. E{X_t | F_s} ≤ X_s, resp. ≥ X_s).
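A standard example of a martingale with jumps is the compensated Poisson process M_t = N_t − λt. As a sanity check of property (ii), a short Monte Carlo sketch in Python (our own illustration; the parameters, sample sizes, and tolerances are ours) verifies that the increment M_t − M_s has mean zero and, being independent of F_s, is uncorrelated with M_s:

```python
import random

rng = random.Random(42)
lam, s, t = 2.0, 1.0, 3.0  # intensity and two fixed times s < t

def poisson_count(rate, horizon, rng):
    """Number of arrivals of a rate-`rate` Poisson process on [0, horizon]."""
    total, count = 0.0, 0
    while True:
        total += rng.expovariate(rate)
        if total > horizon:
            return count
        count += 1

n_paths = 20000
ms_vals, increments = [], []
for _ in range(n_paths):
    n_s = poisson_count(lam, s, rng)        # N_s
    n_inc = poisson_count(lam, t - s, rng)  # N_t - N_s, independent of F_s
    ms_vals.append(n_s - lam * s)           # M_s = N_s - lam*s
    increments.append(n_inc - lam * (t - s))  # M_t - M_s

mean_inc = sum(increments) / n_paths
# E{M_t - M_s} = 0: the compensated increment has mean zero
assert abs(mean_inc) < 0.1
# the increment is independent of the past, hence uncorrelated with M_s
cov = sum(a * b for a, b in zip(ms_vals, increments)) / n_paths
assert abs(cov) < 0.1
```

This only probes the martingale property through its first two moments, of course; the conditional-expectation statement (ii) itself is a pathwise requirement.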


Note that martingales are only defined on [0, ∞); that is, for finite t and not t = ∞. It is often possible to extend the definition to t = ∞.

Definition. A martingale X is said to be closed by a random variable Y if E{|Y|} < ∞ and X_t = E{Y | F_t}, 0 ≤ t < ∞.

A random variable Y closing a martingale is not necessarily unique. We give a sufficient condition for a martingale to be closed (as well as a construction for closing it) in Theorem 12.

Theorem 9. Let X be a supermartingale. The function t ↦ E{X_t} is right continuous if and only if there exists a modification Y of X which is càdlàg. Such a modification is unique.

By uniqueness we mean up to indistinguishability. Our standing assumption that the "usual hypotheses" are satisfied is used implicitly in the statement of Theorem 9. Also, note that the process Y is, of course, also a supermartingale. Theorem 9 is proved using Doob's upcrossing inequalities. If X is a martingale then t ↦ E{X_t} is constant, and hence it has a right continuous modification.

Corollary. If X = (X_t)_{0≤t<∞} is a martingale then there exists a unique modification Y of X which is càdlàg.

Since all martingales have right continuous modifications, we will always assume that we are taking the right continuous version, without any special mention. Note that it follows from this corollary and Theorem 2 that a right continuous martingale is càdlàg.

Theorem 10 (Martingale Convergence Theorem). Let X be a right continuous supermartingale with sup_{0≤t<∞} E{|X_t|} < ∞. Then the random variable Y = lim_{t→∞} X_t exists a.s., and E{|Y|} < ∞. Moreover if X is a martingale closed by a random variable Z, then Y also closes X and Y = E{Z | F_∞}.¹

Theorem 11. Let (U_α)_{α∈A} be a family of random variables. The following are equivalent:

(i) the family (U_α)_{α∈A} is uniformly integrable;
(ii) sup_{α∈A} E{|U_α|} < ∞, and for every ε > 0 there exists δ > 0 such that Λ ∈ F, P(Λ) ≤ δ imply E{|U_α 1_Λ|} < ε;
(iii) there exists a positive, increasing, convex function G(x) defined on [0, ∞) such that lim_{x→∞} G(x)/x = +∞ and sup_α E{G(|U_α|)} < ∞.

The assumption that G is convex is not needed for the implications (iii) ⇒ (ii) and (iii) ⇒ (i).

¹ F_∞ denotes the smallest σ-algebra generated by (F_t), all t, 0 ≤ t < ∞.

Theorem 12. Let X be a right continuous martingale which is uniformly integrable. Then Y = lim_{t→∞} X_t exists a.s., E{|Y|} < ∞, and Y closes X as a martingale.

Theorem 13. Let X be a (right continuous) martingale. Then (X_t)_{t≥0} is uniformly integrable if and only if Y = lim_{t→∞} X_t exists a.s., E{|Y|} < ∞, and (X_t)_{0≤t≤∞} is a martingale, where X_∞ = Y.

If X is a uniformly integrable martingale, then X_t converges to X_∞ = Y in L1 as well as almost surely. The next theorem we use only once (in the proof of Theorem 28), but we give it here for completeness. The notation (X_n)_{n≤0} refers to a process indexed by the non-positive integers: …, X_{−2}, X_{−1}, X_0.

Theorem 14 (Backwards Convergence Theorem). Let (X_n)_{n≤0} be a martingale. Then lim_{n→−∞} X_n = E{X_0 | ∩_{n=−∞}^{0} F_n} a.s. and in L1.

A less probabilistic interpretation of martingales uses Hilbert space theory. Let Y ∈ L2(Ω, F, P). Since F_t ⊂ F, the spaces L2(Ω, F_t, P) form a family of Hilbert subspaces of L2(Ω, F, P). Let π_t Y denote the Hilbert space projection of Y onto L2(Ω, F_t, P).

Theorem 15. Let Y ∈ L2(Ω, F, P). The process X_t = π_t Y is a uniformly integrable martingale.

Proof. It suffices to show E{Y | Ft} = ΠtY. The random variable E{Y | Ft} is the unique Ft measurable r.v. such that ∫_A Y dP = ∫_A E{Y | Ft} dP for any event A ∈ Ft. We have ∫_A Y dP = ∫_A ΠtY dP + ∫_A (Y − ΠtY) dP. But ∫_A (Y − ΠtY) dP = ∫ 1_A (Y − ΠtY) dP. Since 1_A ∈ L²(Ω, Ft, P), and Y − ΠtY is in the orthocomplement of L²(Ω, Ft, P), we have ∫ 1_A (Y − ΠtY) dP = 0, and thus by uniqueness E{Y | Ft} = ΠtY. Since ‖ΠtY‖_{L²} ≤ ‖Y‖_{L²}, by part (iii) of Theorem 11 we have that X is uniformly integrable (take G(x) = x²).

The next theorem is one of the most useful martingale theorems for our purposes.

Theorem 16 (Doob's Optional Sampling Theorem). Let X be a right continuous martingale which is closed by a random variable X∞. Let S and T be two stopping times such that S ≤ T a.s. Then X_S and X_T are integrable and

X_S = E{X_T | F_S} a.s.

Theorem 16 has a similar version for supermartingales.


Theorem 17. Let X be a right continuous supermartingale (resp. martingale), and let S and T be two bounded stopping times such that S ≤ T a.s. Then X_S and X_T are integrable and

X_S ≥ E{X_T | F_S} (resp. X_S = E{X_T | F_S}) a.s.

If T is a stopping time, then so is t ∧ T = min(t, T), for each t ≥ 0.

Definition. Let X be a stochastic process and let T be a random time. X^T is said to be the process stopped at T if X^T_t = X_{t∧T}.

Note that if X is adapted and càdlàg and if T is a stopping time, then X^T is also adapted. A martingale stopped at a stopping time is still a martingale, as the next theorem shows.

Theorem 18. Let X be a uniformly integrable right continuous martingale, and let T be a stopping time. Then X^T = (X_{t∧T})_{0≤t≤∞} is also a uniformly integrable right continuous martingale.

Proof. X^T is clearly right continuous. By Theorem 16,

X_{t∧T} = E{X∞ | F_{t∧T}},

so X^T = (X_{t∧T}) is a martingale for the filtration (Gt)_{0≤t≤∞} given by Gt = F_{t∧T}. However for H ∈ Ft we have H ∩ {T > t} ∈ F_{t∧T}. Thus one verifies that E{X_{t∧T} | Fs} = X_{s∧T} for s ≤ t, so that X^T is a uniformly integrable right continuous martingale for (Ft)_{0≤t≤∞} as well.

Corollary. Let Y be an integrable random variable and let S, T be stopping times. Then

E{E{Y | F_T} | F_S} = E{E{Y | F_S} | F_T} = E{Y | F_{S∧T}} a.s.

Proof. Let Yt = E{Y | Ft}. Then Y^T is a uniformly integrable martingale and

E{E{Y | F_T} | F_S} = E{Y_T | F_S} = Y_{S∧T}.

Interchanging the roles of T and S yields

E{E{Y | F_S} | F_T} = Y_{S∧T}.

Finally, E{Y | F_{S∧T}} = Y_{S∧T}.

The next inequality is elementary, but indispensable.

Theorem 19 (Jensen's Inequality). Let φ : ℝ → ℝ be convex, and let X and φ(X) be integrable random variables. For any σ-algebra G,

φ(E{X | G}) ≤ E{φ(X) | G} a.s.

Corollary 1. Let X be a martingale, and let φ be convex such that φ(Xt) is integrable, 0 ≤ t < ∞. Then φ(X) is a submartingale. In particular, if M is a martingale, then |M| is a submartingale.

Corollary 2. Let X be a submartingale and let φ be convex, non-decreasing, and such that φ(Xt), 0 ≤ t < ∞, is integrable. Then φ(X) is also a submartingale.

We end our review of martingale theory with Doob's inequalities; the most important case is p = 2.

Theorem 20. Let X be a positive submartingale. For all p > 1, with q conjugate to p (i.e., 1/p + 1/q = 1), we have

‖ sup_{s≤t} Xs ‖_{L^p} ≤ q ‖Xt‖_{L^p}.

We let X* denote sup_s |Xs|. Note that if M is a martingale with M∞ ∈ L², then |M| is a positive submartingale, and taking p = 2 we have

E{(M*)²} ≤ 4 E{(M∞)²}.

An elementary but useful result concerning martingales is the following.

Theorem 21. Let X = (Xt)_{0≤t≤∞} be an adapted process with càdlàg paths. Suppose E{|X_T|} < ∞ and E{X_T} = 0 for any stopping time T, finite or not. Then X is a uniformly integrable martingale.

Proof. Let 0 ≤ s < t ≤ ∞, and let A ∈ Fs. Let

T_u(ω) = u if ω ∈ A, and T_u(ω) = ∞ if ω ∉ A.

Then T_u is a stopping time for each u ≥ s. Moreover

0 = E{X_{T_u}} = E{X_u 1_A} + E{X∞ 1_{A^c}},

since E{X_{T_u}} = 0 by hypothesis, for u ≥ s. Thus for A ∈ Fs and s < t,

E{X_t 1_A} = −E{X∞ 1_{A^c}} = E{X_s 1_A},

and X is a martingale, 0 ≤ t ≤ ∞.

Definition. A martingale X with X0 = 0 and E{Xt²} < ∞ for each t > 0 is called a square integrable martingale. If E{X∞²} < ∞ as well, then X is called an L² martingale.

Clearly, any L² martingale is also a square integrable martingale. See also Sect. 3 of Chap. IV.

3 The Poisson Process and Brownian Motion

The Poisson process and Brownian motion are the two fundamental examples in the theory of continuous time stochastic processes. The Poisson process is the simpler of the two, and we begin with it. We recall that we assume given a filtered probability space (Ω, F, 𝔽, P) satisfying the usual hypotheses. Let (Tn)n≥0 be a strictly increasing sequence of positive random variables. We always take T0 = 0 a.s. Recall that the indicator function 1_{t≥Tn} equals 1 if t ≥ Tn and 0 otherwise.

Definition. The process N = (Nt)_{0≤t≤∞} defined by

Nt = Σ_{n≥1} 1_{t≥Tn},

with values in ℕ ∪ {∞}, where ℕ = {0, 1, 2, …}, is called the counting process associated to the sequence (Tn)n≥1.

If we set T = sup_n Tn, then

{Nt ≥ n} = {Tn ≤ t},

as well as

[Tn, Tn+1) = {N = n}, and [T, ∞) = {N = ∞}.


The random variable T is the explosion time of N. If T = ∞ a.s., then N is a counting process without explosions. For T = ∞, note that for 0 ≤ s < t < ∞ we have

Nt − Ns = Σ_{n≥1} 1_{s<Tn≤t}.

The increment Nt − Ns counts the number of random times Tn that occur between the fixed times s and t.

As we have defined it, a counting process need not be adapted to the filtration 𝔽. Indeed, we have the following.

Theorem 22. A counting process N is adapted if and only if the associated random variables (Tn)n≥1 are stopping times.

Proof. If the (Tn)n≥0 are stopping times (with T0 = 0 a.s.), then the event

{Nt ≥ n} = {Tn ≤ t} ∈ Ft

for each n. Thus Nt is Ft measurable and N is adapted. If N is adapted, then {Tn ≤ t} = {Nt ≥ n} ∈ Ft, each t, and therefore Tn is a stopping time.

Note that a counting process without explosions has right continuous paths with left limits; hence a counting process without explosions is càdlàg.
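The correspondence between the arrival times and the counting process, and the identity {Nt ≥ n} = {Tn ≤ t} used in the proof of Theorem 22, can be made concrete in a short Python sketch. This is not part of the text; the arrival times below are arbitrary illustrative values.

```python
import bisect

def counting_process(arrival_times):
    """N(t) = number of arrival times T_n with T_n <= t (arrival_times sorted)."""
    def N(t):
        # bisect_right counts how many T_n satisfy T_n <= t
        return bisect.bisect_right(arrival_times, t)
    return N

# Hypothetical arrival times T_1 < T_2 < T_3 (illustrative values only)
T = [0.7, 1.3, 2.9]
N = counting_process(T)

assert N(0.5) == 0 and N(1.0) == 1 and N(3.0) == 3
# The identity {N_t >= n} = {T_n <= t} from the proof of Theorem 22:
for n in (1, 2, 3):
    for t in (0.5, 1.0, 1.5, 3.0):
        assert (N(t) >= n) == (T[n - 1] <= t)
```

The paths t ↦ N(t) produced this way are right continuous step functions, as the note above asserts.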

Definition. An adapted counting process N is a Poisson process if
(i) for any s, t, 0 ≤ s < t < ∞, Nt − Ns is independent of Fs;
(ii) for any s, t, u, v, 0 ≤ s < t < ∞, 0 ≤ u < v < ∞, with t − s = v − u, the distribution of Nt − Ns is the same as that of Nv − Nu.

Properties (i) and (ii) are known respectively as increments independent of the past, and stationary increments.

Theorem 23. Let N be a Poisson process. Then

P(Nt = n) = e^{−λt} (λt)^n / n!,

n = 0, 1, 2, …, for some λ ≥ 0. That is, Nt has the Poisson distribution with parameter λt. Moreover, N is continuous in probability (that is, for t ≥ 0, lim_{u→t} Nu = Nt, where the limit is taken in probability) and does not have explosions.

Proof. The proof of Theorem 23 is standard and is often given in more elementary courses (cf., e.g., Çinlar [33, page 71]). We sketch it here.

Step 1. For all t ≥ 0, P(Nt = 0) = e^{−λt}, for some constant λ ≥ 0.

Writing φα(t) = E{α^{Nt}} for 0 < α < 1, we have

φα(t) = P(Nt = 0) + α P(Nt = 1) + Σ_{n=2}^∞ α^n P(Nt = n),

and ψ(α) = φα′(0), the derivative of φα at 0. Therefore

ψ(α) = lim_{t↓0} (φα(t) − 1)/t = lim_{t↓0} [ (P(Nt = 0) − 1)/t + α P(Nt = 1)/t ] = −λ + λα.

Therefore φα(t) = e^{−λt+λαt}, hence

φα(t) = e^{−λt} Σ_{n=0}^∞ (λαt)^n / n! = Σ_{n=0}^∞ α^n e^{−λt} (λt)^n / n!.

Equating coefficients of the two infinite series yields

P(Nt = n) = e^{−λt} (λt)^n / n!.

Definition. The parameter λ associated to a Poisson process by Theorem 23 is called the intensity, or arrival rate, of the process.

Corollary. A Poisson process N with intensity λ satisfies

E{Nt} = λt, Var(Nt) = λt.

The proof is trivial and we omit it.

There are other, equivalent definitions of the Poisson process. For example, a counting process N without explosion can be seen to be a Poisson process if for all s, t, 0 ≤ s < t < ∞, E{Nt} < ∞ and

E{Nt − Ns | Fs} = λ(t − s).

Theorem 24. Let N be a Poisson process with intensity λ. Then Nt − λt and (Nt − λt)² − λt are martingales.

Proof. Since λt is non-random, the process Nt − λt has mean zero and independent increments. Therefore

E{Nt − λt | Fs} = E{(Nt − λt) − (Ns − λs) | Fs} + Ns − λs = Ns − λs,

for 0 ≤ s < t < ∞. The analogous statement holds for (Nt − λt)² − λt.
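The moment formulas of the corollary and the compensated process Nt − λt of Theorem 24 lend themselves to a quick Monte Carlo sanity check. The following Python sketch is not from the text; the intensity, horizon, sample size, and tolerances are arbitrary choices.

```python
import random

def poisson_value(lam, t, rng):
    """Sample N_t by summing i.i.d. Exp(lam) interarrival times T_{n+1} - T_n."""
    n, s = 0, rng.expovariate(lam)
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    return n

rng = random.Random(42)
lam, t, reps = 2.0, 3.0, 20000
samples = [poisson_value(lam, t, rng) for _ in range(reps)]

mean = sum(samples) / reps
var = sum((x - mean) ** 2 for x in samples) / reps
m2 = sum((x - lam * t) ** 2 for x in samples) / reps  # E{(N_t - lam*t)^2}

assert abs(mean - lam * t) < 0.1   # E{N_t} = lam * t
assert abs(var - lam * t) < 0.3    # Var(N_t) = lam * t
assert abs(m2 - lam * t) < 0.3     # consistent with (N_t - lam*t)^2 - lam*t having mean zero
```

Simulating from the interarrival times rather than from the Poisson distribution directly mirrors the definition of N as a counting process.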


Definition. Let H be a stochastic process. The natural filtration of H, denoted 𝔽⁰ = (Ft⁰)_{0≤t<∞}, is defined by Ft⁰ = σ{Hs; s ≤ t}. That is, 𝔽⁰ is the smallest filtration that makes H adapted.

Note that natural filtrations are not assumed to contain all the P-null sets.

Define the maps πt : Ω → Γ, where Γ is the space of paths, by

πt(ω) = (N_{s∧t}(ω))_{s≥0}.

Thus the range of πt is contained in the set of functions constant after t. The σ-algebra Ft⁰ is also generated by the single function space-valued random variable πt.

Let Λ be an event in ∩_n F⁰_{t+1/n}. Then for each n there exists a measurable set Λn in the product σ-algebra on the path space such that Λ = {π_{t+1/n} ∈ Λn}. Next set Wn = {πt = π_{t+1/n}}. For each ω there exists an n such that s ↦ Ns(ω) is constant on [t, t + 1/n]; therefore Ω = ∪_{n≥1} Wn, where (Wn) is an increasing sequence of events. Therefore

Λ = lim_n (Wn ∩ {π_{t+1/n} ∈ Λn}).

We next turn our attention to the Brownian motion process. Recall that we are assuming as given a filtered probability space (Ω, F, 𝔽, P) that satisfies the usual hypotheses.

Definition. An adapted process B = (Bt)_{0≤t<∞} taking values in ℝⁿ is called an n-dimensional Brownian motion if
(i) for 0 ≤ s < t < ∞, Bt − Bs is independent of Fs (increments are independent of the past);
(ii) for 0 ≤ s < t, Bt − Bs is a Gaussian random variable with mean zero and variance matrix (t − s)C, for a given, non-random matrix C.


The Brownian motion starts at x if P(B0 = x) = 1. The existence of Brownian motion is proved using a path-space construction, together with Kolmogorov's Extension Theorem. It is simple to check that a Brownian motion is a martingale as long as E{|B0|} < ∞. Therefore by Theorem 9 there exists a version which has right continuous paths a.s. Actually, more is true.

Theorem 26. Let B be a Brownian motion. Then there exists a modification of B which has continuous paths a.s.

Theorem 26 is often proved in textbooks on probability theory (e.g., Breiman [23]). It can also be proved as an elementary consequence of Kolmogorov's Lemma (Theorem 72 of Chap. IV). We will always assume that we are using the version of Brownian motion with continuous paths. We will also assume, unless stated otherwise, that C is the identity matrix. We then say that a Brownian motion B with continuous paths, with C = I the identity matrix, and with B0 = x for some x ∈ ℝⁿ, is a standard Brownian motion. Note that for an ℝⁿ-valued standard Brownian motion B, writing Bt = (Bt¹, …, Btⁿ), 0 ≤ t < ∞, each Bⁱ is a one dimensional Brownian motion with continuous paths, and the Bⁱ's are independent.

We have already observed that a Brownian motion B with E{|B0|} < ∞ is a martingale. Another important elementary observation is the following.

Theorem 27. Let B = (Bt)_{0≤t<∞} be a one dimensional standard Brownian motion with B0 = 0. Then Mt = Bt² − t is a martingale.

Proof. E{Mt} = E{Bt² − t} = 0. Also

E{Bt² | Fs} = E{(Bt − Bs)² | Fs} + 2E{Bt Bs | Fs} − Bs²,

and

E{Bt Bs | Fs} = Bs E{Bt | Fs} = Bs²,

since B is a martingale with Bs, Bt ∈ L². Therefore

E{Bt² − t | Fs} = (t − s) + 2Bs² − Bs² − t = Bs² − s,

due to the independence of the increments from the past.

Theorem 28. Let πn be a sequence of partitions of [a, a + t]. Suppose πn ⊂ πm if m > n (that is, the sequence is a refining sequence). Suppose moreover that lim_{n→∞} mesh(πn) = 0. Let πnB = Σ_{ti∈πn} (B_{ti+1} − B_{ti})². Then lim_{n→∞} πnB = t a.s., for a standard Brownian motion B.


Proof. We first show convergence in mean square. We have

πnB − t = Σ_{ti∈πn} [(B_{ti+1} − B_{ti})² − (ti+1 − ti)] = Σ_i Yi,

where the Yi are independent random variables with zero means. Therefore

E{(πnB − t)²} = Σ_i E{Yi²}.

Next observe that (B_{ti+1} − B_{ti})²/(ti+1 − ti) has the distribution of Z², where Z is Gaussian with mean 0 and variance 1. Therefore

E{(πnB − t)²} = Σ_i (ti+1 − ti)² E{(Z² − 1)²} ≤ 2t · mesh(πn),

which tends to 0 as n tends to ∞. This establishes L² convergence (and hence convergence in probability as well).

To obtain the a.s. convergence we use the Backwards Martingale Convergence Theorem (Theorem 14). Define

Nn = π_{−n}B

for n = −1, −2, −3, …. Then it is straightforward (though notationally messy) to show that

E{Nn | Nn−1, Nn−2, …} = Nn−1.

Therefore (Nn) is a martingale relative to Gn = σ{Nk; k ≤ n}, n = −1, −2, …. By Theorem 14 we deduce that lim_{n→−∞} Nn = lim_{n→∞} πnB exists a.s., and since πnB converges to t in L², we must have lim_{n→∞} πnB = t a.s. as well.

Comments. As the proof shows, the argument is simple (and half as long) if we conclude only L² convergence (and hence convergence in probability) instead of a.s. convergence. Also, we can avoid the use of the Backwards Martingale Convergence Theorem (Theorem 14) in the second half of the proof if we add the hypothesis that Σ_n mesh(πn) < ∞. The result then follows, after having proved the L² convergence, by using the Borel-Cantelli Lemma and Chebyshev's inequality. Furthermore, to conclude only L² convergence we do not need the hypothesis that the sequence of partitions be refining.

Theorem 28 can be used to prove that the paths of Brownian motion are of unbounded variation on compacts. It is this fact that is central to the difficulties in defining an integral with respect to Brownian motion (and martingales in general).
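The dichotomy can be seen numerically: along refining dyadic partitions of [0, 1], the quadratic sums Σ(ΔB)² stabilize near t = 1 while the absolute sums Σ|ΔB| keep growing. The following Python sketch is not from the text; the random seed and refinement depth are arbitrary.

```python
import math
import random

rng = random.Random(7)
levels = 14
n = 2 ** levels          # the finest dyadic partition of [0, 1] has n intervals
# one Brownian path, stored as its n independent N(0, 1/n) increments
inc = [rng.gauss(0.0, math.sqrt(1.0 / n)) for _ in range(n)]

def sums_at_level(k):
    """Quadratic and total variation sums along the level-k dyadic partition."""
    step = 2 ** (levels - k)     # each coarse increment sums `step` fine ones
    coarse = [sum(inc[i:i + step]) for i in range(0, n, step)]
    qv = sum(d * d for d in coarse)
    tv = sum(abs(d) for d in coarse)
    return qv, tv

qv_coarse, tv_coarse = sums_at_level(4)
qv_fine, tv_fine = sums_at_level(levels)

assert abs(qv_fine - 1.0) < 0.1   # pi_n B is close to t = 1, as in Theorem 28
assert tv_fine > 2 * tv_coarse    # the variation sums keep growing (Theorem 29)
```

Aggregating one finest-level path into coarser partitions mimics the refining sequence of Theorem 28: every coarse increment is a sum of fine ones, so all levels describe the same path.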


Theorem 29. For almost all ω, the sample paths t ↦ Bt(ω) of a standard Brownian motion B are of unbounded variation on any interval.

Proof. Let A = [a, b] be an interval. The variation of the paths of B over A is defined to be

V_A = sup_P Σ_{ti∈P} |B_{ti+1} − B_{ti}|,

where P ranges over all finite partitions of [a, b]. Suppose P(V_A < ∞) > 0. Let πn be a sequence of refining partitions of [a, b] with lim_n mesh(πn) = 0. Then by Theorem 28, on {V_A < ∞},

b − a = lim_n πnB ≤ lim_n ( sup_{ti∈πn} |B_{ti+1} − B_{ti}| ) V_A = 0

by the continuity of the paths, which is absurd; we conclude V_A = ∞ a.s. Since the null set can depend on the interval [a, b], we consider only intervals with rational endpoints a, b with a < b. Such a collection is countable, and since any interval (a, b) = ∪_{n=1}^∞ [an, bn] with an, bn rational, we can omit the dependence of the null set on the interval.

We conclude this section by observing that not only are the increments of standard Brownian motion independent, they are also stationary. Thus Brownian motion is a Lévy process (as is the Poisson process), and the theorems of Sect. 4 apply to it. In particular, by Theorem 31 of Sect. 4, we can conclude that the completed natural filtration of standard Brownian motion is right continuous.

4 Lévy Processes

The Lévy processes, which include the Poisson process and Brownian motion as special cases, were the first class of stochastic processes to be studied in the modern spirit (by the French mathematician Paul Lévy). They still provide prototypic examples for Markov processes as well as for semimartingales. Most of the results of this section hold for ℝⁿ-valued processes; for notational simplicity, however, we will consider only ℝ-valued processes. (Here ℝⁿ denotes n-dimensional Euclidean space and ℝ₊ = [0, ∞) denotes the non-negative real numbers.) Once again we recall that we are assuming given a filtered probability space (Ω, F, 𝔽, P) satisfying the usual hypotheses.

Definition. An adapted process X = (Xt)t≥0 with X0 = 0 a.s. is a Lévy process if
(i) X has increments independent of the past; that is, Xt − Xs is independent of Fs, 0 ≤ s < t < ∞;
(ii) X has stationary increments; that is, Xt − Xs has the same distribution as X_{t−s}, 0 ≤ s < t < ∞;
(iii) Xt is continuous in probability.

Note that it is not necessary to involve the filtration 𝔽 in the definition of a Lévy process. Here is a (less general) alternative definition; to distinguish the two, we will call it an intrinsic Lévy process.

Definition. A process X = (Xt)t≥0 with X0 = 0 a.s. is an intrinsic Lévy process if
(i) X has independent increments;
(ii) X has stationary increments;
(iii) Xt is continuous in probability.

Of course, an intrinsic Lévy process is a Lévy process for its minimal (completed) filtration.

If we take the Fourier transform of each Xt we get a function f(t, u) = ft(u) given by

ft(u) = E{e^{iuXt}},

where f0(u) = 1, ft+s(u) = ft(u)fs(u), and ft(u) ≠ 0 for every (t, u). Using the (right) continuity in probability we conclude ft(u) = exp{−tψ(u)}, for some continuous function ψ(u) with ψ(0) = 0. (Bochner's Theorem can be used to show the converse: if ψ is continuous, ψ(0) = 0, and if for all t ≥ 0 the function ft(u) = e^{−tψ(u)} satisfies Σ_{i,j} ft(ui − uj) αi ᾱj ≥ 0 for all finite (u1, …, un; α1, …, αn), then there exists a Lévy process corresponding to f.)

In particular it follows that if X is a Lévy process then for each t > 0, Xt has an infinitely divisible distribution. Inversely, it can be shown that for each infinitely divisible distribution μ there exists a Lévy process X such that μ is the distribution of X1.

Theorem 30. Let X be a Lévy process. There exists a unique modification Y of X which is càdlàg and which is also a Lévy process.


Proof. One first shows, using the martingales Mt^u = e^{iuXt}/ft(u), that the functions t ↦ Mt^u(ω) and t ↦ e^{iuXt(ω)}, with t ∈ ℚ₊, are a.s. the restrictions to ℚ₊ of càdlàg functions. Let

Λ = {(ω, u) ∈ Ω × ℝ : t ↦ e^{iuXt(ω)}, t ∈ ℚ₊, is not the restriction of a càdlàg function}.

One can check that Λ is a measurable set. Furthermore, we have seen that ∫ 1_Λ(ω, u) P(dω) = 0 for each u ∈ ℝ. By Fubini's Theorem

∫∫ 1_Λ(ω, u) P(dω) du = 0,

hence we conclude that for a.a. ω the function t ↦ e^{iuXt(ω)}, t ∈ ℚ₊, is the restriction of a càdlàg function for almost all u ∈ ℝ. We can now conclude that the function t ↦ Xt(ω), t ∈ ℚ₊, is the restriction of a càdlàg function for every such ω, with the help of the lemma that follows the proof of this theorem.

Next set Yt(ω) = lim_{s∈ℚ₊, s↓t} Xs(ω) for all ω in the set where this limit exists for all t, and Yt = 0 elsewhere, all t. Since Ft contains all the P-null sets of F and (Ft)_{0≤t<∞} is right continuous, Yt ∈ Ft. Since X is continuous in probability, P{Yt ≠ Xt} = 0, hence Y is a modification of X. It is clear that Y is a Lévy process as well.

The next lemma was used in the proof of Theorem 30. Although it is a purely analytic lemma, we give a proof using probability theory.

Lemma. Let (xn) be a sequence of real numbers such that e^{iuxn} converges as n tends to ∞ for almost all u ∈ ℝ. Then xn converges to a finite limit.

Proof. We will verify the following Cauchy criterion: (xn) converges if for any increasing sequences (nk) and (mk), lim_{k→∞} (x_{nk} − x_{mk}) = 0. Let U be a random variable which has the uniform distribution on [0, 1]. For any real t, by hypothesis a.s. e^{itUx_{nk}} and e^{itUx_{mk}} converge to the same limit. Therefore,

lim_{k→∞} E{e^{itU(x_{nk} − x_{mk})}} = 1,

so that the characteristic functions converge, for all t ∈ ℝ. Consequently (x_{nk} − x_{mk})U converges to zero in probability, whence lim_{k→∞} (x_{nk} − x_{mk}) = 0, as claimed.

We will henceforth always assume that we are using the (unique) càdlàg version of any given Lévy process. Lévy processes provide us with examples of filtrations that satisfy the "usual hypotheses," as the next theorem shows.


Theorem 31. Let X be a Lévy process and let Gt = Ft⁰ ∨ N, where (Ft⁰)_{0≤t<∞} is the natural filtration of X and N are the P-null sets of F. Then (Gt)_{0≤t<∞} is right continuous.

Proof. The martingale argument used in the proof of Theorem 30, applied to the martingales Mt^u = e^{iuXt}/ft(u), yields

E{e^{iΣ_j uj Xsj} | Gt+} = E{e^{iΣ_j uj Xsj} | Gt}

for all (s1, …, sn) and all (u1, …, un), whence E{Z | Gt+} = E{Z | Gt} for every bounded Z ∈ ∨_{0≤s<∞} Fs⁰. This implies Gt+ = Gt except possibly for events of probability zero. However since both σ-algebras contain N, we conclude Gt+ = Gt for each t ≥ 0.

The next theorem shows that a Lévy process "renews itself" at stopping times.

Theorem 32. Let X be a Lévy process and let T be a stopping time. On the set {T < ∞} the process Y = (Yt)_{0≤t<∞} defined by Yt = X_{T+t} − X_T is a Lévy process adapted to the filtration Ht = F_{T+t}; moreover Y is independent of F_T, and Y has the same distribution as X.

Proof. First assume T is bounded. Let A ∈ F_T and let (u1, …, un; t0, …, tn) be given with uj in a countable dense set (for example the rationals ℚ) and tj ∈ ℝ₊, tj increasing with j. Recall that Mt^{uj} = e^{iuj Xt}/ft(uj) is a martingale, where ft(uj) = E{e^{iuj Xt}}. Then

E{1_A e^{iΣ_j uj (Y_{tj} − Y_{tj−1})}} = P(A) Π_j f_{tj − tj−1}(uj),

by applying the Optional Sampling Theorem (Theorem 16) n times. Note that this shows the independence of Yt = X_{T+t} − X_T from F_T, as well as showing that Y has independent and stationary increments and that the distribution of Y is the same as that of X.

If T is not bounded, we let Tn = min(T, n) = T ∧ n. The formula is valid for An = A ∩ {T ≤ n} when A ∈ F_T, since then An ∈ F_{T∧n}. Taking limits and using the Dominated Convergence Theorem we see that our formula holds for unbounded T as well, for events A ∩ {T < ∞}, A ∈ F_T. This gives the result.

Since a standard Brownian motion is a Lévy process, Theorem 32 gives us a fortiori the strong Markov property for Brownian motion. This allows us to establish a pretty result for Brownian motion, known as the reflection principle. Let B = (Bt)t≥0 denote a standard Brownian motion, B0 = 0 a.s., and let St = sup_{0≤s≤t} Bs, the maximum process of Brownian motion. Since B is continuous, St = sup_{0≤s≤t, s∈ℚ} Bs, where ℚ denotes the rationals; hence St is an adapted process with non-decreasing paths.

Theorem 33 (Reflection Principle for Brownian Motion). Let B = (Bt)t≥0 be standard Brownian motion (B0 = 0 a.s.) and St = sup_{0≤s≤t} Bs, the Brownian maximum process. For y ≥ 0, z > 0,

P(St ≥ z; Bt < z − y) = P(Bt > z + y).

Proof. Let T = inf{t > 0 : Bt = z}. Then T is a stopping time by Theorem 4, and P(T < ∞) = 1. We next define a new process X by

Xt = Bt 1_{t≤T} + (2z − Bt) 1_{t>T};

that is, X agrees with B up to time T and is thereafter the reflection of B about the level z. By Theorem 32, X is also a standard Brownian motion, and {St ≥ z} = {T ≤ t} = {sup_{s≤t} Xs ≥ z} by the construction of X. Therefore

P(St ≥ z; Bt > z + y) = P(St ≥ z; Xt > z + y). (*)

The left side of (*) equals P(Bt > z + y), where the equality is a consequence of the containment {St ≥ z} ⊃ {Bt > z + y}. Also the right side of (*) equals P(St ≥ z; Bt < z − y), since on {T ≤ t} we have Xt > z + y if and only if Bt < z − y. Combining these yields

P(St ≥ z; Bt < z − y) = P(Bt > z + y),

which is what was to be proved.

We also have a reflection principle for Lévy processes. See Exercises 30 and 31.

Corollary. Let B = (Bt)t≥0 be standard Brownian motion (B0 = 0 a.s.) and St = sup_{0≤s≤t} Bs. For z > 0,

P(St ≥ z) = 2P(Bt ≥ z).

Proof. Take y = 0 in Theorem 33. Then

P(St ≥ z; Bt < z) = P(Bt > z).

Adding P(Bt > z) to both sides and noting that {Bt > z} = {Bt > z} ∩ {St ≥ z} yields the result, since P(Bt = z) = 0.
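The corollary is easy to probe by simulation, approximating the maximum process on a discrete grid. The Python sketch below is illustrative only: the grid size, sample size, and tolerance are arbitrary, and the discrete maximum slightly undershoots St, so only approximate equality is tested.

```python
import math
import random

rng = random.Random(123)
steps, reps = 1000, 6000
sd = math.sqrt(1.0 / steps)   # increment standard deviation over [0, 1]
z = 0.5

hits_max = 0   # occurrences of {S_1 >= z}
hits_end = 0   # occurrences of {B_1 >= z}
for _ in range(reps):
    b, smax = 0.0, 0.0
    for _ in range(steps):
        b += rng.gauss(0.0, sd)
        if b > smax:
            smax = b
    hits_max += smax >= z
    hits_end += b >= z

p_max = hits_max / reps
p_end = hits_end / reps

# Corollary: P(S_t >= z) = 2 P(B_t >= z), up to discretization and Monte Carlo error
assert abs(p_max - 2 * p_end) < 0.06
assert p_max > 1.5 * p_end
```

The second assertion is the qualitative content of the reflection principle: the maximum exceeds the level roughly twice as often as the endpoint does.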


A Lévy process is càdlàg, and hence the only type of discontinuity it can have is a jump discontinuity. Letting Xt− = lim_{s↑t} Xs, the left limit at t, we define

ΔXt = Xt − Xt−,

the jump at t. If sup_t |ΔXt| ≤ C < ∞ a.s., where C is a non-random constant, we say that X has bounded jumps.

Our next result states that a Lévy process with bounded jumps has finite moments of all orders. This fact was used in Sect. 3 (Step 2 of the proof of Theorem 23) to show that E{N1} < ∞ for a Poisson process N.

Theorem 34. Let X be a Lévy process with bounded jumps. Then E{|Xt|^n} < ∞ for all n = 1, 2, 3, ….

Proof. Let C be a (non-random) bound for the jumps of X. Define the stopping times

T0 = 0, Tn+1 = inf{t > Tn : |Xt − X_{Tn}| ≥ C}.

Since the paths are right continuous, the stopping times (Tn)n≥1 form a strictly increasing sequence. Moreover |ΔX_T| ≤ C by hypothesis for any stopping time T. Therefore sup_t |Xt^{Tn}| ≤ 2nC by recursion. Theorem 32 implies that Tn − Tn−1 is independent of F_{Tn−1} and also that the distribution of Tn − Tn−1 is the same as that of T1.

The above implies that

P(Tn ≤ t) ≤ α^n

for some α, 0 < α < 1. But also

P(|Xt| > 2nC) ≤ P(Tn ≤ t) ≤ α^n,

which implies that Xt has an exponential moment and hence moments of all orders.

We next turn our attention to an analysis of the jumps of a Lévy process. Let Λ be a Borel set in ℝ bounded away from 0 (that is, 0 ∉ Λ̄, where Λ̄ is the closure of Λ). For a Lévy process X we define the random variables

T_Λ^1 = inf{t > 0 : ΔXt ∈ Λ}, T_Λ^{n+1} = inf{t > T_Λ^n : ΔXt ∈ Λ}.

Since X has càdlàg paths and 0 ∉ Λ̄, the reader can readily check that {T_Λ^n ≤ t} ∈ Ft+ = Ft, and therefore each T_Λ^n is a stopping time. Moreover 0 ∉ Λ̄ and the càdlàg paths further imply that T_Λ^1 > 0 a.s. and that lim_{n→∞} T_Λ^n = ∞ a.s. We define

Nt^Λ = Σ_{n≥1} 1_{t≥T_Λ^n} = Σ_{0<s≤t} 1_Λ(ΔXs),

and observe that N^Λ is a counting process without an explosion. It is straightforward to check that, for 0 ≤ s < t < ∞, Nt^Λ − Ns^Λ is measurable with respect to σ{Xu − Xs; s ≤ u ≤ t}, and therefore Nt^Λ − Ns^Λ is independent of Fs; that is, N^Λ has independent increments. Note further that Nt^Λ − Ns^Λ is the number of jumps that Zu = X_{s+u} − Xs has in Λ, 0 ≤ u ≤ t − s. By the stationarity of the distributions of X, we conclude that Nt^Λ − Ns^Λ has the same distribution as N_{t−s}^Λ. Therefore N^Λ is a counting process with stationary and independent increments. We conclude that N^Λ is a Poisson process. Let ν(Λ) = E{N1^Λ} be the parameter of the Poisson process N^Λ (ν(Λ) < ∞ by the proof of Theorem 34).

Theorem 35. The set function Λ ↦ Nt^Λ(ω) defines a σ-finite measure on ℝ \ {0} for each fixed (t, ω). The set function ν(Λ) = E{N1^Λ} also defines a σ-finite measure on ℝ \ {0}.

Proof. The set function Λ ↦ Nt^Λ(ω) is simply a counting measure: μ(Λ) = #{0 < s ≤ t : ΔXs(ω) ∈ Λ}. It is then clear that ν is also a measure.

Definition. The measure ν defined by

ν(Λ) = E{N1^Λ} = E{ Σ_{0<s≤1} 1_Λ(ΔXs) }

is called the Lévy measure of the Lévy process X.
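For a compound Poisson process — a simple Lévy process with jumps — the Lévy measure is explicit: if jumps arrive with intensity λ and have law μ, then ν(Λ) = λμ(Λ), so N1^Λ is Poisson with mean ν(Λ). The following Python sketch checks this numerically; it is not from the text, and the rate, the jump law Exp(1), the set Λ = [1, ∞), and the tolerances are arbitrary choices.

```python
import math
import random

rng = random.Random(5)
lam = 3.0          # overall jump intensity of the compound Poisson process
reps = 20000

counts = []        # samples of N_1^Lambda, with Lambda = [1, infinity)
for _ in range(reps):
    jumps_in_set = 0
    t = rng.expovariate(lam)                 # time of the first jump
    while t <= 1.0:
        if rng.expovariate(1.0) >= 1.0:      # jump size Delta X ~ Exp(1) lands in Lambda?
            jumps_in_set += 1
        t += rng.expovariate(lam)            # wait for the next jump
    counts.append(jumps_in_set)

mean = sum(counts) / reps
var = sum((c - mean) ** 2 for c in counts) / reps
nu = lam * math.exp(-1.0)   # nu(Lambda) = lam * P(Exp(1) >= 1) = 3/e

assert abs(mean - nu) < 0.05   # E{N_1^Lambda} = nu(Lambda)
assert abs(var - nu) < 0.1     # Poisson counts: variance equals mean
```

The equality of the empirical mean and variance is the Poisson signature of N^Λ established above.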

We wish to investigate further the role the Lévy measure plays in governing the jumps of X. To this end we establish a preliminary result. We let Nt(ω, dx) denote the random measure of Theorem 35. Since Nt(ω, dx) is a counting measure, the next result is obvious.

Theorem 36. Let Λ be a Borel set of ℝ with 0 ∉ Λ̄, and let f be Borel and finite on Λ. Then

∫_Λ f(x) Nt(ω, dx) = Σ_{0<s≤t} f(ΔXs) 1_Λ(ΔXs).

Just as we showed that N^Λ has independent and stationary increments, we have the following consequence.


References

1. H. Ahn and P. Protter. A remark on stochastic integration. In Séminaire de Probabilités XXVIII, volume 1583 of Lecture Notes in Mathematics, pages 312-315. Springer-Verlag, Berlin, 1994.
2. L. Arnold. Stochastic Differential Equations: Theory and Applications. Wiley, New York, 1973.
3. J. Azéma. Quelques applications de la théorie générale des processus. Invent. Math., 18:293-336, 1972.
4. J. Azéma. Sur les fermés aléatoires. In Séminaire de Probabilités XIX, volume 1123 of Lecture Notes in Mathematics, pages 397-495. Springer-Verlag, Berlin, 1985.
5. J. Azéma and M. Yor. Etude d'une martingale remarquable. In Séminaire de Probabilités XXIII, volume 1372 of Lecture Notes in Mathematics, pages 88-130. Springer-Verlag, Berlin, 1989.
6. X. Bardina and M. Jolis. An extension of Itô's formula for elliptic diffusion processes. Stochastic Processes and their Appl., 69:83-109, 1997.
7. M. T. Barlow. Study of a filtration expanded to include an honest time. Z. Wahrscheinlichkeitstheorie verw. Gebiete, 44:307-323, 1978.
8. M. T. Barlow. On the left end points of Brownian excursions. In Séminaire de Probabilités XIII, volume 721 of Lecture Notes in Mathematics, page 646. Springer-Verlag, Berlin, 1979.
9. M. T. Barlow, S. D. Jacka, and M. Yor. Inequalities for a pair of processes stopped at a random time. Proc. London Math. Soc., 52:142-172, 1986.
10. R. F. Bass. Probabilistic Techniques in Analysis. Springer-Verlag, New York, 1995.
11. R. F. Bass. The Doob-Meyer decomposition revisited. Canad. Math. Bull., 39:138-150, 1996.
12. J. Bertoin. Lévy Processes. Cambridge University Press, Cambridge, 1996.
13. K. Bichteler. Stochastic integrators. Bull. Amer. Math. Soc., 1:761-765, 1979.
14. K. Bichteler. Stochastic integration and L^p-theory of semimartingales. Ann. Probab., 9:49-89, 1981.
15. K. Bichteler. Stochastic Integration With Jumps. Cambridge University Press, Cambridge, 2002.
16. K. Bichteler, J.-B. Gravereaux, and J. Jacod. Malliavin Calculus for Processes With Jumps. Gordon and Breach, 1987.
TỪ KHÓA LIÊN QUAN