In 2003 I began teaching a course entitled Lévy processes on the Utrecht masters programme in stochastics and financial mathematics. Quite naturally, I wanted to expose to my students my own interests in Lévy processes; that is, the role that certain subtle behaviour concerning their fluctuations plays in explaining different types of phenomena appearing in a number of classical models of applied probability (as a general rule, that does not necessarily include mathematical finance). Indeed, recent developments in the theory of Lévy processes, in particular concerning path fluctuation, have offered the clarity required to revisit classical applied probability models and improve on well established and fundamental results; results which were initially responsible for the popularity of the models themselves.
Whilst giving the course I wrote some lecture notes which have now matured into this text. Given the audience of students, who were either engaged in their 'afstudeerfase' (equivalent to masters for the U.S. or those European countries which have engaged in that system) or just starting a Ph.D., these lecture notes were originally written with the restriction that the mathematics used would not surpass the level that they should in principle have reached. That roughly means the following: experience to the level of third or fourth year university courses delivered by a mathematics department on

- foundational real and complex analysis,
- elementary functional analysis (specifically basic facts about Lp spaces),
- measure theory, integration theory and measure theoretic probability theory,
- elements of the classical theory of Markov processes, stopping times and the Strong Markov Property, Poisson processes and renewal processes,
- an understanding of Brownian motion as a Markov process and elementary martingale theory in continuous time.
For the most part this affected the way in which the material was handled compared to the classical texts and research papers from which almost all of the results and arguments in this text originate. Likewise, the exercises are pitched at a level appropriate to this audience. Indeed, several of the exercises have been included in response to some of the questions that have been asked by students themselves concerning curiosities of the arguments given in class. Arguably some of the exercises are at times quite long. Such exercises reflect some of the other ways in which I have used preliminary versions of this text. A small number of students in Utrecht also used the text as an individual reading programme contributing to their 'kleine scriptie' (extended mathematical essay) or 'onderzoekopdracht' (research option). The extended exercises were designed to assist with this self-study programme. In addition, some exercises were used as examination questions.

There can be no doubt, particularly to the more experienced reader, that the current text has been heavily influenced by the outstanding books of Bertoin (1996) and Sato (1999), especially the former, which also takes a predominantly pathwise approach to its content. It should be reiterated, however, that unlike the latter two books, this text is not aimed at functioning as a research monograph nor a reference manual for the researcher.
Andreas E. Kyprianou
Edinburgh
2005
1 Lévy processes and applications
1.1 Lévy processes and infinite divisibility
1.2 Some examples of Lévy processes
1.3 Lévy processes in classical applied probability models
Exercises

2 The Lévy-Itô decomposition and path structure
2.1 The Lévy-Itô decomposition
2.2 Poisson point processes
2.3 Functionals of Poisson point processes
2.4 Square integrable martingales
2.5 Proof of the Lévy-Itô decomposition
2.6 Lévy processes distinguished by their path type
2.7 Interpretations of the Lévy-Itô decomposition
Exercises

3 More distributional and path related properties
3.1 The Strong Markov Property
3.2 Duality
3.3 Exponential moments and martingales
Exercises

4 General storage models and paths of bounded variation
4.1 Idle times
4.2 Change of variable and compensation formulae
4.3 The Kella-Whitt martingale
4.4 Stationary distribution of the workload
4.5 Small-time behaviour and the Pollaczek-Khintchine formula
Exercises

5 Subordinators at first passage and renewal measures
5.1 Killed subordinators and renewal measures
5.2 Overshoots and undershoots
5.3 Creeping
5.4 Regular variation and Tauberian theorems
5.5 Dynkin-Lamperti asymptotics
Exercises

6 The Wiener-Hopf factorization
6.1 Local time at the maximum
6.2 The ladder process
6.3 Excursions
6.4 The Wiener-Hopf factorization
6.5 Examples of the Wiener-Hopf factorization
Exercises

7 Lévy processes at first passage and insurance risk
7.1 Drifting and oscillating
7.2 Cramér's estimate of ruin
7.3 A quintuple law at first passage
7.4 The jump measure of the ascending ladder height process
7.5 Creeping
7.6 Regular variation and infinite divisibility
7.7 Asymptotic ruinous behaviour with regular variation
Exercises

8 Exit problems for spectrally negative processes
8.1 Basic properties reviewed
8.2 The one- and two-sided exit problems
8.3 The scale functions W(q) and Z(q)
8.4 Potential measures
8.5 Identities for reflected processes
Exercises

9 Applications to optimal stopping problems
9.1 Sufficient conditions for optimality
9.2 The McKean optimal stopping problem
9.3 Smooth fit versus continuous fit
9.4 The Novikov-Shiryaev optimal stopping problem
9.5 The Shepp-Shiryaev optimal stopping problem
9.6 Stochastic games
Exercises

References
In this chapter we define a Lévy process and attempt to give some indication of how rich a class of processes they form. To illustrate the variety of processes captured within the definition of a Lévy process, we shall explore briefly the relationship of Lévy processes with infinitely divisible distributions. We also discuss some classical applied probability models which are built on the strength of well understood path properties of elementary Lévy processes. We hint at how generalizations of these models may be approached using more sophisticated Lévy processes. At a number of points later on in this text we shall handle these generalizations in more detail. The models we have chosen to present are suitable for the course of this text as a way of exemplifying fluctuation theory, but are by no means the only applications.
1.1 Lévy processes and infinite divisibility
Let us begin by recalling the definition of two familiar processes, a Brownian motion and a Poisson process.

A real valued process B = {B_t : t ≥ 0} defined on a probability space (Ω, F, P) is said to be a Brownian motion if the following hold.

(i) The paths of B are P-almost surely continuous.
(ii) P(B_0 = 0) = 1.
(iii) For each t > 0, B_t is equal in distribution to a normal random variable with variance t.
(iv) For each 0 ≤ s ≤ t, B_t − B_s is independent of {B_u : u ≤ s}.
(v) For each 0 ≤ s ≤ t, B_t − B_s is equal in distribution to B_{t−s}.
A process N = {N_t : t ≥ 0}, valued on the non-negative integers and defined on a probability space (Ω, F, P), is said to be a Poisson process with intensity λ > 0 if the following hold.

(i) The paths of N are P-almost surely right continuous with left limits.
(ii) P(N_0 = 0) = 1.
(iii) For each t > 0, N_t is equal in distribution to a Poisson random variable with parameter λt.
(iv) For each 0 ≤ s ≤ t, N_t − N_s is independent of {N_u : u ≤ s}.
(v) For each 0 ≤ s ≤ t, N_t − N_s is equal in distribution to N_{t−s}.
On first encounter, these processes would seem to be considerably different from one another. Firstly, Brownian motion has continuous paths whereas a Poisson process does not. Secondly, a Poisson process is a non-decreasing process and thus has paths of bounded variation over finite time horizons, whereas a Brownian motion does not have monotone paths and in fact its paths are of unbounded variation over finite time horizons.

However, when we line up their definitions next to one another, we see that they have a lot in common. Both processes have right continuous paths with left limits, are initiated from the origin and both have stationary and independent increments; that is, properties (i), (ii), (iv) and (v). We may use these common properties to define a general class of stochastic processes, which are called Lévy processes.
Definition 1.1 (Lévy Process) A process X = {X_t : t ≥ 0} defined on a probability space (Ω, F, P) is said to be a Lévy process if it possesses the following properties.

(i) The paths of X are right continuous with left limits P-almost surely.
(ii) P(X_0 = 0) = 1.
(iii) For each 0 ≤ s ≤ t, X_t − X_s is independent of {X_u : u ≤ s}.
(iv) For each 0 ≤ s ≤ t, X_t − X_s is equal in distribution to X_{t−s}.
Unless otherwise stated, from now on, when talking of a Lévy process, we shall always take the measure P (with associated expectation operator E) to be implicitly understood as its law.
The name 'Lévy process' honours the work of the French mathematician Paul Lévy who, although not alone in his contribution, played an instrumental role in bringing together an understanding and characterization of processes with stationary independent increments. In earlier literature, Lévy processes can be found under a number of different names. In the 1940s, Lévy himself referred to them as a subclass of processus additifs (additive processes), that is, processes with independent increments. For the most part, however, research literature through the 1960s and 1970s refers to Lévy processes simply as processes with stationary independent increments. One sees a change in language through the 1980s, and by the 1990s the use of the term Lévy process had become standard.
From Definition 1.1 alone it is difficult to see just how rich a class of processes the class of Lévy processes forms. De Finetti (1929) introduced the
notion of an infinitely divisible distribution and showed that it has an intimate relationship with Lévy processes. This relationship gives a reasonably good impression of how varied the class of Lévy processes really is. To this end, let us now devote a little time to discussing infinitely divisible distributions.

Definition 1.2 We say that a real valued random variable Θ has an infinitely divisible distribution if for each n = 1, 2, ... there exists a sequence of i.i.d. random variables Θ_{1,n}, ..., Θ_{n,n} such that

Θ =_d Θ_{1,n} + ... + Θ_{n,n}

where =_d is equality in distribution. Alternatively, we could have expressed this relation in terms of probability laws. That is to say, the law µ of a real valued random variable is infinitely divisible if for each n = 1, 2, ... there exists another law µ_n of a real valued random variable such that µ = µ_n^{*n}, the n-fold convolution of µ_n with itself. A third equivalent definition can be reworded in terms of characteristic exponents. Suppose that Θ has characteristic exponent Ψ(u) := −log E(e^{iuΘ}) for all u ∈ R. Then Θ has an infinitely divisible distribution if for all n ≥ 1 there exists a characteristic exponent of a probability distribution, say Ψ_n, such that Ψ(u) = nΨ_n(u) for all u ∈ R.

The full extent to which we may characterize infinitely divisible distributions is done via their characteristic exponent Ψ and an expression known as the Lévy-Khintchine formula.

Theorem 1.3 (Lévy-Khintchine formula) A probability law µ of a real valued random variable is infinitely divisible with characteristic exponent Ψ,

∫_R e^{iθx} µ(dx) = e^{−Ψ(θ)} for θ ∈ R,

if and only if there exists a triple (a, σ, Π), where a ∈ R, σ ≥ 0 and Π is a measure supported on R\{0} satisfying ∫_{R\{0}} (1 ∧ x²) Π(dx) < ∞, such that

Ψ(θ) = iaθ + (1/2)σ²θ² + ∫_{R\{0}} (1 − e^{iθx} + iθx 1_{(|x|<1)}) Π(dx)

for every θ ∈ R.

Definition 1.4 The measure Π is called the Lévy (characteristic) measure.

The proof of the Lévy-Khintchine characterization of infinitely divisible random variables is quite complicated and we choose to exclude it in favour of moving as quickly as possible to fluctuation theory. The interested reader is referred to Lukacs (1970) or Sato (1999), to name but two of many possible references.
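The third definition in Definition 1.2, namely Ψ(u) = nΨ_n(u), can be checked directly in a simple closed-form case. The sketch below (the helper name psi_normal is ours, not the text's) verifies it for the Gaussian distribution, which factors into n independent Gaussian summands of variance 1/n.

```python
import numpy as np

def psi_normal(u, mean=0.0, var=1.0):
    # Characteristic exponent Psi(u) = -log E[exp(iu Theta)] for Theta ~ N(mean, var).
    return -1j * mean * u + 0.5 * var * u ** 2

u = np.linspace(-5.0, 5.0, 101)
for n in (2, 5, 10):
    # N(0, 1) is the n-fold sum of i.i.d. N(0, 1/n), so Psi(u) = n * Psi_n(u).
    assert np.allclose(psi_normal(u), n * psi_normal(u, var=1.0 / n))
```

The same cancellation of the factor n works for any infinitely divisible law whose exponent is known in closed form.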
A special case of the Lévy-Khintchine formula was established by Kolmogorov (1932) for infinitely divisible distributions with second moments. However, it was Lévy (1934, 1935) who gave a complete characterization of infinitely divisible distributions, and in doing so he also characterized the general class of processes with stationary independent increments. Later, Khintchine (1937) and Itô (1942) gave further simplification and deeper insight into Lévy's original proof.

Let us now make firm the relationship between infinitely divisible distributions and processes with stationary independent increments.

From the definition of a Lévy process we see that for any t > 0, X_t is a random variable belonging to the class of infinitely divisible distributions. This follows from the fact that for any n = 1, 2, ...,

X_t = X_{t/n} + (X_{2t/n} − X_{t/n}) + ... + (X_t − X_{(n−1)t/n})   (1.1)

together with the fact that X has stationary independent increments. Suppose now that we define for all θ ∈ R, t ≥ 0,

Ψ_t(θ) = −log E(e^{iθX_t});

then using (1.1) twice we have for any two positive integers m, n that

mΨ_1(θ) = Ψ_m(θ) = nΨ_{m/n}(θ)

and hence for any rational t > 0,

Ψ_t(θ) = tΨ_1(θ).   (1.2)

If t is an irrational number, then we can choose a decreasing sequence of rationals {t_n : n ≥ 1} such that t_n ↓ t as n tends to infinity. Almost sure right continuity of X implies right continuity of exp{−Ψ_t(θ)} (by dominated convergence) and hence (1.2) holds for all t ≥ 0.

In conclusion, any Lévy process has the property that

E(e^{iθX_t}) = e^{−tΨ(θ)}

where Ψ(θ) := Ψ_1(θ) is the characteristic exponent of X_1, which has an infinitely divisible distribution.
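The identity E(e^{iθX_t}) = e^{−tΨ(θ)} can be illustrated by Monte Carlo for the simplest Lévy process with continuous paths, standard Brownian motion, whose exponent is Ψ(θ) = θ²/2. The sample size, seed and tolerance below are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
theta, t, n = 1.0, 2.0, 200_000

# X_t ~ N(0, t) for standard Brownian motion, with Psi(theta) = theta**2 / 2.
samples = rng.normal(0.0, np.sqrt(t), size=n)
empirical = np.mean(np.exp(1j * theta * samples))
exact = np.exp(-t * theta ** 2 / 2)
assert abs(empirical - exact) < 0.01  # Monte Carlo error is O(1/sqrt(n))
```

Replacing the normal sampler by any other Lévy increment distribution (Poisson, gamma, ...) and the exponent accordingly gives the same check for the other examples of this chapter.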
Definition 1.5 In the sequel we shall also refer to Ψ(θ) as the characteristic exponent of the Lévy process.
It is now clear that each Lévy process can be associated with an infinitely divisible distribution. What is not clear is whether, given an infinitely divisible distribution, one may construct a Lévy process X such that X_1 has that distribution. This latter issue is resolved by the following theorem, which gives the Lévy-Khintchine formula for Lévy processes.
Theorem 1.6 (Lévy-Khintchine formula for Lévy processes) Suppose that a ∈ R, σ ≥ 0 and Π is a measure on R\{0} such that ∫_{R\{0}} (1 ∧ |x|²) Π(dx) < ∞. From this triple define for each θ ∈ R

Ψ(θ) = iaθ + (1/2)σ²θ² + ∫_{R\{0}} (1 − e^{iθx} + iθx 1_{(|x|<1)}) Π(dx).

Then there exists a Lévy process having characteristic exponent Ψ.

The proof of this theorem is rather complicated, however very rewarding, as it also reveals much more about the general structure of Lévy processes. Later, in Chapter 2, we shall prove a stronger version of this theorem, which also explains the path structure of the Lévy process associated with each Ψ(θ) in terms of the triple (a, σ, Π).
1.2 Some examples of Lévy processes
To conclude our introduction to Lévy processes and infinitely divisible distributions, let us proceed to some concrete examples, some of which will also be of use later to verify certain results from the forthcoming fluctuation theory we shall present.
1.2.1 Poisson processes
For each λ > 0 consider a probability distribution µ_λ which is supported on k = 0, 1, 2, ... such that µ_λ({k}) = e^{−λ}λ^k/k!; that is to say, the Poisson distribution. An easy calculation reveals that

∑_{k≥0} e^{iθk} µ_λ({k}) = e^{−λ(1−e^{iθ})} = [e^{−(λ/n)(1−e^{iθ})}]^n,

the right hand side being the characteristic function of the sum of n independent Poisson random variables, each with parameter λ/n. This shows that the Poisson distribution is infinitely divisible.

Recall that a Poisson process, N, is a Lévy process with distribution at time t > 0 which is Poisson with parameter λt. It is clear from the above calculations then that

E(e^{iθN_t}) = e^{−λt(1−e^{iθ})}

and hence its characteristic exponent is Ψ(θ) = λ(1 − e^{iθ}) for θ ∈ R.
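As a quick numerical sanity check (a sketch with arbitrary parameter choices), one can compare the empirical characteristic function of Poisson samples against e^{−λt(1−e^{iθ})}:

```python
import numpy as np

rng = np.random.default_rng(seed=2)
lam, t, theta, n = 3.0, 1.5, 0.7, 200_000

# N_t ~ Poisson(lam * t); compare the empirical characteristic function
# against exp(-lam * t * (1 - e^{i theta})).
samples = rng.poisson(lam * t, size=n)
empirical = np.mean(np.exp(1j * theta * samples))
exact = np.exp(-lam * t * (1 - np.exp(1j * theta)))
assert abs(empirical - exact) < 0.01
```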
1.2.2 Compound Poisson processes
Suppose now that N is a Poisson random variable with parameter λ > 0 and that {ξ_i : i ≥ 1} is an i.i.d. sequence of random variables with common law F having no atom at zero. By first conditioning on N, we have for θ ∈ R

E(e^{iθ∑_{i=1}^{N} ξ_i}) = ∑_{n≥0} E(e^{iθ∑_{i=1}^{n} ξ_i}) e^{−λ} λ^n/n!
 = ∑_{n≥0} (∫_R e^{iθx} F(dx))^n e^{−λ} λ^n/n!
 = e^{−λ∫_R (1 − e^{iθx}) F(dx)},

where we use the convention that ∑_{i=1}^{0} ξ_i = 0. We see then that distributions of the form ∑_{i=1}^{N} ξ_i are infinitely divisible with triple a = −λ∫_{0<|x|<1} x F(dx), σ = 0 and Π(dx) = λF(dx). When F has an atom of unit mass at 1, then we simply have a Poisson distribution.

Suppose now that N = {N_t : t ≥ 0} is a Poisson process with intensity λ, and consider a compound Poisson process {X_t : t ≥ 0} defined by

X_t := ∑_{i=1}^{N_t} ξ_i, t ≥ 0.

Repeating the computation above with N_t in place of the variable N, we discover that the Lévy-Khintchine formula for a compound Poisson process takes the form Ψ(θ) = λ∫_R (1 − e^{iθx}) F(dx). Note then that the Lévy measure of a compound Poisson process is always finite, with total mass equal to the rate λ of the underlying process N.

Compound Poisson processes provide a direct link between Lévy processes and random walks, that is, discrete time processes of the form S = {S_n : n ≥ 0} where S_0 = 0 and S_n = ∑_{i=1}^{n} ξ_i for n ≥ 1.
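The compound Poisson exponent Ψ(θ) = λ∫(1 − e^{iθx})F(dx) can likewise be checked by simulation. The sketch below takes F to be standard normal (our choice, not the text's), for which the integral evaluates to λ(1 − e^{−θ²/2}); the remaining parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(seed=3)
lam, t, theta, n = 2.0, 1.0, 0.8, 50_000

# Simulate X_t = sum_{i=1}^{N_t} xi_i with N_t ~ Poisson(lam * t), xi_i i.i.d. N(0, 1).
counts = rng.poisson(lam * t, size=n)
X_t = np.array([rng.normal(size=k).sum() for k in counts])
empirical = np.mean(np.exp(1j * theta * X_t))

# For F = N(0, 1): integral of (1 - e^{i theta x}) F(dx) equals 1 - exp(-theta^2 / 2).
exact = np.exp(-t * lam * (1 - np.exp(-theta ** 2 / 2)))
assert abs(empirical - exact) < 0.02
```

The simulation also illustrates the random walk link: each path of X_t is the random walk S_{N_t} evaluated at an independent Poisson time.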
1.2.3 Scaled Brownian motion with drift
Take the probability law

µ_{s,γ}(dx) := (1/√(2πs²)) e^{−(x−γ)²/(2s²)} dx

supported on R, where γ ∈ R and s > 0; the well known Gaussian distribution with mean γ and variance s². It is well known that

∫_R e^{iθx} µ_{s,γ}(dx) = e^{−(s²θ²/2 − iθγ)} = [e^{−((s²/n)θ²/2 − iθγ/n)}]^n,

showing again that it is an infinitely divisible distribution, this time with a = −γ, σ = s and Π = 0. Infinite divisibility in this case may also be seen to follow from the fact that linear sums of independent Gaussian random variables are again Gaussian.

We immediately recognize the characteristic exponent Ψ(θ) = s²θ²/2 − iθγ as also that of a scaled Brownian motion with linear drift,

X_t := sB_t + γt, t ≥ 0,

where B = {B_t : t ≥ 0} is a standard Brownian motion. It is a trivial exercise to verify that X has stationary independent increments with continuous paths as a consequence of the fact that B does.
1.2.4 Gamma processes

For α, β > 0 recall the gamma distribution with parameters α, β, namely the law µ_{α,β}(dx) = (α^β/Γ(β)) x^{β−1} e^{−αx} dx supported on (0, ∞), whose characteristic function is ∫_0^∞ e^{iθx} µ_{α,β}(dx) = (1 − iθ/α)^{−β}. Infinite divisibility follows from the identity (1 − iθ/α)^{−β} = [(1 − iθ/α)^{−β/n}]^n. The connection with the Lévy-Khintchine formula rests on the following classical identity.

Lemma 1.7 (Frullani integral) For all α, β > 0 and z ∈ C such that ℜz ≤ 0 we have

1/(1 − z/α)^β = e^{−∫_0^∞ (1 − e^{zx}) βx^{−1} e^{−αx} dx}.
To see how this lemma helps, note that the Lévy-Khintchine formula for a gamma distribution takes the form

Ψ(θ) = β log(1 − iθ/α) = ∫_0^∞ (1 − e^{iθx}) βx^{−1} e^{−αx} dx,

the second equality following from the Frullani integral with z = iθ; in particular σ = 0 and Π(dx) = βx^{−1}e^{−αx} dx on (0, ∞). According to Theorem 1.6 there exists a Lévy process whose Lévy-Khintchine formula is given by Ψ, the so-called gamma process.

Suppose now that X = {X_t : t ≥ 0} is a gamma process. Stationary independent increments tell us that for all 0 ≤ s < t < ∞, X_t = X_s + X̃_{t−s} where X̃_{t−s} is an independent copy of X_{t−s}. The fact that the latter is strictly positive with probability one (on account of it being gamma distributed) implies that X_t > X_s almost surely. Hence a gamma process is an example of a Lévy process with almost surely non-decreasing paths (in fact its paths are strictly increasing). Another example of a Lévy process with non-decreasing paths is a compound Poisson process whose jump distribution F is supported on (0, ∞). Note, however, that a gamma process is not a compound Poisson process on two counts. Firstly, its Lévy measure has infinite total mass, unlike the Lévy measure of a compound Poisson process, which is necessarily finite (and equal to the arrival rate of jumps). Secondly, whilst a compound Poisson process with positive jumps does have paths which are almost surely non-decreasing, it does not have paths which are almost surely strictly increasing.

Lévy processes whose paths are almost surely non-decreasing (or simply non-decreasing for short) are called subordinators. We shall return to a formal definition of this subclass of processes in Chapter 2.
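Lemma 1.7 is easy to confirm numerically for real z < 0; a sketch using scipy's quadrature, with arbitrary parameter values:

```python
import numpy as np
from scipy.integrate import quad

alpha, beta, z = 2.0, 1.5, -1.0  # any alpha, beta > 0 and real z < 0

# Frullani integral: (1 - e^{zx}) * beta / x * e^{-alpha x} integrated over (0, inf).
integral, _ = quad(lambda x: (1 - np.exp(z * x)) * beta / x * np.exp(-alpha * x),
                   0, np.inf)
lhs = 1.0 / (1 - z / alpha) ** beta
assert abs(lhs - np.exp(-integral)) < 1e-8
```

The integrand behaves like β e^{−αx} near zero, so the quadrature has no genuine singularity to contend with.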
1.2.5 Inverse Gaussian processes
Suppose as usual that B = {B_t : t ≥ 0} is a standard Brownian motion. Define the first passage time

τ_s = inf{t > 0 : B_t + bt > s},

that is, the first time a Brownian motion with linear drift b > 0 crosses above level s. Recall that τ_s is a stopping time¹ with respect to the filtration {F_t : t ≥ 0}, where F_t is generated by {B_u : u ≤ t}. Otherwise said, since Brownian motion has continuous paths, for all t ≥ 0,

{τ_s ≤ t} = ⋃_{u ∈ [0,t]∩Q} {B_u + bu > s}

and hence the latter belongs to the sigma algebra F_t.

¹ By definition, the random time τ is a stopping time with respect to the filtration {G_t : t ≥ 0} if for all t ≥ 0, {τ ≤ t} ∈ G_t.
Recalling again that Brownian motion has continuous paths, we know that B_{τ_s} + bτ_s = s almost surely. From the Strong Markov Property², the process {B_{τ_s+u} + b(τ_s + u) − s : u ≥ 0} is equal in law to {B_u + bu : u ≥ 0} and hence for all 0 ≤ s < t,

τ_t = τ_s + τ̃_{t−s},

where τ̃_{t−s} is an independent copy of τ_{t−s}. This shows that the process τ := {τ_t : t ≥ 0} has stationary independent increments. Continuity of the paths of {B_t + bt : t ≥ 0} ensures that τ has right continuous paths. Further, it is clear that τ has almost surely non-decreasing paths, which guarantees its paths have left limits, as well as being yet another example of a subordinator. According to its definition as a sequence of first passage times, τ is also the almost sure right inverse of the path of the graph of {B_t + bt : t ≥ 0}. From this, τ earns its title as the inverse Gaussian process.
According to the discussion following Theorem 1.3, it is now immediate that for each fixed s > 0 the random variable τ_s is infinitely divisible. Its characteristic exponent takes the form

Ψ(θ) = s(√(−2iθ + b²) − b)

for all θ ∈ R and corresponds to a triple a = −2sb^{−1}∫_0^b (2π)^{−1/2} e^{−y²/2} dy, σ = 0 and

Π(dx) = s (1/√(2πx³)) e^{−b²x/2} dx

supported on (0, ∞). The law of τ_s can also be computed explicitly as

µ_s(dx) = (s/√(2πx³)) e^{sb} e^{−(s²x^{−1} + b²x)/2} dx

for x > 0.
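Both the explicit law of τ_s and the exponent Ψ(θ) = s(√(−2iθ + b²) − b) can be cross-checked through the Laplace transform E(e^{−qτ_s}) = e^{−s(√(2q + b²) − b)} (obtained by replacing iθ with −q). A numerical sketch with arbitrary s, b, q:

```python
import numpy as np
from scipy.integrate import quad

s, b, q = 1.0, 0.5, 2.0

def density(t):
    # Inverse Gaussian density of tau_s = inf{u > 0 : B_u + b*u > s}.
    return s / np.sqrt(2 * np.pi * t ** 3) * np.exp(-(s - b * t) ** 2 / (2 * t))

total, _ = quad(density, 0, np.inf)
laplace, _ = quad(lambda t: np.exp(-q * t) * density(t), 0, np.inf)

assert abs(total - 1.0) < 1e-6  # tau_s is finite a.s. since b > 0
assert abs(laplace - np.exp(-s * (np.sqrt(b ** 2 + 2 * q) - b))) < 1e-6
```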
² The Strong Markov Property will be dealt with in more detail for a general Lévy process in Chapter 3.

1.2.6 Stable processes

Stable processes are the class of Lévy processes whose characteristic exponents correspond to those of stable distributions. A random variable Y is said to have a stable distribution if for all n ≥ 1 it observes the distributional equality

Y_1 + ... + Y_n =_d a_n Y + b_n,   (1.5)

where Y_1, ..., Y_n are independent copies of Y, a_n > 0 and b_n ∈ R. By subtracting b_n/n from each of the terms on the left hand side of (1.5), one sees in particular that this definition implies that any stable random variable is infinitely divisible. It turns out that necessarily a_n = n^{1/α} for α ∈ (0, 2]; see Feller (1971), Section VI.1. In that case we refer to the parameter α as the index. A smaller class of distributions are the strictly stable distributions. A random variable Y is said to have a strictly stable distribution if it observes (1.5) but with b_n = 0. In that case, we necessarily have

Y_1 + ... + Y_n =_d n^{1/α} Y.   (1.6)

The case α = 2 corresponds to zero mean Gaussian random variables and is excluded in the remainder of the discussion, as it has essentially been dealt with in the last but one example.
Stable random variables observing the relation (1.5) for α ∈ (0, 1) ∪ (1, 2) have characteristic exponents of the form

Ψ(θ) = c|θ|^α (1 − iβ tan(πα/2) sgn θ) + iθη,   (1.7)

where β ∈ [−1, 1], η ∈ R and c > 0. Stable random variables observing the relation (1.5) for α = 1 have characteristic exponents of the form

Ψ(θ) = c|θ| (1 + iβ(2/π) sgn θ log|θ|) + iθη,   (1.8)

where β ∈ [−1, 1], η ∈ R and c > 0. Here we work with the definition of the sign function sgn θ = 1_{(θ>0)} − 1_{(θ<0)}. To make the connection with the Lévy-Khintchine formula, one needs σ = 0 and

Π(dx) = c₁ x^{−1−α} dx for x ∈ (0, ∞) and Π(dx) = c₂ |x|^{−1−α} dx for x ∈ (−∞, 0),   (1.9)

where c = c₁ + c₂, c₁, c₂ ≥ 0 and β = (c₁ − c₂)/(c₁ + c₂) if α ∈ (0, 1) ∪ (1, 2), and c₁ = c₂ if α = 1. The choice of a ∈ R is then implicit. Exercise 1.4 shows how to make the connection between Π and Ψ with the right choice of a (which depends on α). Unlike the previous examples, the distributions that lie behind these characteristic exponents are heavy tailed, in the sense that the tails of their distributions decay slowly to zero, so that they only have moments strictly less than α. The value of the parameter β gives an indication of asymmetry in the Lévy measure, and likewise of the distributional asymmetry (although this latter fact is not immediately obvious). The densities of stable processes are known explicitly in the form of convergent power series. See Sato (1999) and Samorodnitsky and Taqqu (1994) for further details of all the facts given in this paragraph. With the exception of the defining property (1.6), we shall generally not need detailed information on distributional properties of stable processes in order to proceed with their fluctuation theory. This explains the reluctance to give further details here.
Two examples of the aforementioned power series that tidy up to more compact expressions are Cauchy distributions, corresponding to α = 1 and β = 0, and stable-½ distributions, corresponding to α = 1/2 and β = 1. In the former case, Ψ(θ) = c|θ| for θ ∈ R and its law is given by

µ(dx) = (c/π) (1/((x − η)² + c²)) dx   (1.10)

for x ∈ R. In the latter case, Ψ(θ) = c|θ|^{1/2}(1 − i sgn θ) for θ ∈ R and its law is given by

µ(dx) = (c/√(2πx³)) e^{−c²/(2x)} dx

for x > 0. We shall write S(c, α, β, η) for the stable distribution with the given set of parameters. Further, from the definition of its characteristic exponent it is clear that at each fixed time t the α-stable process will have distribution S(ct, α, β, η).
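That X_t has distribution S(ct, α, β, η) reflects the scaling identity Ψ(t^{1/α}θ) = tΨ(θ) of the strictly stable exponent (1.7) with η = 0, which is the analytic content of the statement that X_t is distributed as t^{1/α}X_1. A quick numerical check of the identity (the parameter choices are arbitrary):

```python
import numpy as np

def psi(theta, alpha, beta=0.5, c=1.0):
    # Strictly stable characteristic exponent (1.7) for alpha in (0,1) u (1,2), eta = 0.
    return c * np.abs(theta) ** alpha * (
        1 - 1j * beta * np.tan(np.pi * alpha / 2) * np.sign(theta))

theta = np.linspace(-4.0, 4.0, 81)
for alpha in (0.5, 1.5, 1.8):
    for t in (0.5, 2.0, 7.0):
        # Scaling: t * Psi(theta) = Psi(t^{1/alpha} * theta).
        assert np.allclose(t * psi(theta, alpha),
                           psi(t ** (1 / alpha) * theta, alpha))
```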
In this text we shall henceforth make an abuse of notation and refer to an α-stable process to mean a Lévy process based on a strictly stable distribution. Necessarily this means that the associated characteristic exponent takes the form

Ψ(θ) = c|θ|^α (1 − iβ tan(πα/2) sgn θ) for α ∈ (0, 1) ∪ (1, 2), and
Ψ(θ) = c|θ| + iθη for α = 1,

where the parameter ranges are as above. The reason for the restriction to strictly stable distributions is essentially that we shall want to make use of the following fact. If {X_t : t ≥ 0} is an α-stable process, then from its characteristic exponent (or equivalently the scaling properties of strictly stable random variables) we see that X_t has the same distribution as t^{1/α}X_1 for each t > 0.

1.2.7 Other examples
There are many more known examples of infinitely divisible distributions (and hence Lévy processes). Of the many known proofs of infinite divisibility for specific distributions, most are non-trivial, often requiring intimate knowledge of special functions. A brief list of such distributions might include generalized inverse Gaussian (see Good (1953) and Jørgensen (1982)), truncated stable (see Tweedie (1984), Hougaard (1986), Koponen (1995), Boyarchenko and Levendorskii (2002) and Carr et al. (2003)), generalized hyperbolic (see Halgreen (1979)), Meixner (see Schoutens and Teugels (1998)), Pareto (see Steutel (1970) and Thorin (1977)), F-distributions (see Ismail and Kelker (1979)), Gumbel (see Johnson and Kotz (1970) and Steutel (1973)), Weibull (see Johnson and Kotz (1970) and Steutel (1970)), lognormal (see Thorin (1977a)) and Student t-distributions (see Grosswald (1976) and Ismail (1977)).
Despite being able to identify a large number of infinitely divisible distributions, and hence associated Lévy processes, it is not clear at this point what the paths of Lévy processes look like. The task of giving a mathematically precise account of this lies ahead in Chapter 2. In the mean time, let us make the following informal remarks concerning paths of Lévy processes.

Exercise 1.1 shows that a linear combination of a finite number of independent Lévy processes is again a Lévy process. It turns out that one may consider any Lévy process as an independent sum of a Brownian motion with drift and a countable number of independent compound Poisson processes with different jump rates, jump distributions and drifts. The superposition occurs in such a way that the resulting path remains almost surely finite at all times and, for each ε > 0, the process experiences at most a countably infinite number of jumps of magnitude ε or less with probability one, and an almost surely finite number of jumps of magnitude greater than ε, over all fixed finite time intervals. If in the latter description there are always an almost surely finite number of jumps over each fixed time interval, then it is necessary and sufficient that one has the linear independent combination of a Brownian motion with drift and a compound Poisson process. Depending on the underlying structure of the jumps and the presence of a Brownian motion in the described linear combination, a Lévy process will either have paths of bounded variation on all finite time intervals or paths of unbounded variation on all finite time intervals.
Below we include four computer simulations to give a rough sense of how the paths of Lévy processes look. With the exception of Figure 1.4, we have also included a plot of the magnitude of the jumps as they occur. Figures 1.1 and 1.2 are examples of Lévy processes which have paths of unbounded variation. For comparison, we have included simulations of the paths of a compound Poisson process and a Brownian motion in Figures 1.3 and 1.4, respectively. The reader should be warned, however, that computer simulations can ultimately only depict finite activity in any given path. All pictures were kindly produced by Prof. W. Schoutens for the purpose of this text. One may consult his book, Schoutens (2003), for an indication of how these simulations have been made.
Fig. 1.1 The upper diagram depicts a normal inverse Gaussian process. This is a Lévy process whose characteristic exponent takes the form Ψ(θ) = δ(√(α² − (β + iθ)²) − √(α² − β²)) − iµθ, where α > 0, |β| < α, δ > 0 and µ ∈ R. The lower diagram is a plot of the jumps of the path given in the upper diagram. Theoretically, a normal inverse Gaussian process experiences a countably infinite number of jumps over each finite time horizon with probability one.
Fig. 1.2 The upper diagram depicts a Meixner process. This is a Lévy process whose characteristic exponent takes the form Ψ(θ) = −log[(cos(β/2)/cosh((αθ − iβ)/2))^{2δ}] − iµθ, where α > 0, |β| < π, δ > 0 and µ ∈ R. The lower diagram is a plot of the jumps of the path given in the upper diagram. The Meixner process is another example of a Lévy process which, in theory, experiences a countably infinite number of jumps over each finite time horizon with probability one.
Fig. 1.3 The upper diagram depicts a compound Poisson process, which therefore necessarily has a finite number of jumps over each finite time horizon with probability one. The lower diagram is a plot of the jumps of the path given in the upper diagram.
Fig 1.4 A plot of a path of Brownian motion.
1.3 Lévy processes in classical applied probability models
In this section we shall introduce some classical applied probability models which are structured around basic examples of Lévy processes. This section provides a particular motivation for the study of fluctuation theory which follows in subsequent chapters. (There are of course other reasons for wanting to study fluctuation theory of Lévy processes.) With the right understanding of particular features of the models given below in terms of the path properties of the underlying Lévy processes, much richer generalizations of the aforementioned models may be studied, for which familiar and new phenomena may be observed. At different points later on in this text we shall return to these models and reconsider these phenomena in light of the theory that has been presented along the way. In particular, all of the results either stated or alluded to below shall be proved in greater generality in later chapters.

1.3.1 Cramér-Lundberg risk process
Consider the following model of the revenue of an insurance company as a process in time, proposed by Lundberg (1903). The insurance company collects premiums at a fixed rate c > 0 from its customers. At times of a Poisson process with rate λ > 0, a customer will make a claim, causing the revenue to jump downwards. The sizes of claims are independent and identically distributed. If we call X_t the revenue of the company at time t, then the latter description amounts to

X_t = x + ct − ∑_{i=1}^{N_t} ξ_i, t ≥ 0,

where x > 0 is the initial revenue, N = {N_t : t ≥ 0} is a Poisson process with rate λ, and {ξ_i : i ≥ 1} is an independent sequence of i.i.d. positive random variables with common distribution F and mean µ < ∞.

Financial ruin in this model (or just ruin for short) will occur if the revenue of the insurance company is less than or equal to zero. Since this will happen with probability one if P(lim inf_{t↑∞} X_t = −∞) = 1, an additional assumption imposed on the model is that λµ < c, the so-called net profit condition, under which the process X drifts to infinity. Define the time of ruin

τ_0^− := inf{t > 0 : X_t < 0},

and write P_x for the law of X when the initial revenue is x; the probability of ruin is then P_x(τ_0^− < ∞), which is strictly less than one when the process X drifts to infinity.

Theorem 1.8 (Pollaczek-Khintchine formula) Suppose that λµ/c < 1. For all x ≥ 0,

1 − P_x(τ_0^− < ∞) = (1 − ρ) ∑_{k≥0} ρ^k η^{*k}(x),

where ρ ∈ (0, 1), η is a distribution function concentrated on (0, ∞), and η^{*k} denotes its k-fold convolution (η^{*0} being the distribution function of the unit mass at zero).
The following classic result links the probability of ruin to the conditional distribution of the position of the process at ruin.

Theorem 1.9 In the Cramér-Lundberg model (with λµ/c < 1), ρ = λµ/c and

η(x) = (1/µ) ∫_0^x F(y, ∞) dy,   (1.12)

where F is the distribution of ξ_1.

This result can be derived by a classical path analysis of random walks. This analysis gives some taste of the general theory of fluctuations of Lévy processes that we shall spend quite some time with in this book. The proof of Theorem 1.9 can be found in Exercise 1.8.
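For exponentially distributed claims the Pollaczek-Khintchine series can be summed against a known closed form: η is then itself exponential, so η^{*k} is a gamma distribution, and the ruin probability is classically ρe^{−x(1−ρ)/µ}. A sketch with arbitrary parameters:

```python
import numpy as np
from scipy.stats import gamma

lam, c, mu = 1.0, 2.0, 1.5   # claim arrival rate, premium rate, mean claim size
rho = lam * mu / c           # = 0.75, so the net profit condition rho < 1 holds
x = 2.0                      # initial reserve

# Pollaczek-Khintchine: 1 - P_x(ruin) = (1 - rho) * sum_k rho^k * eta^{*k}(x).
# For Exp(1/mu) claims, eta(x) = 1 - e^{-x/mu}, so eta^{*k} is the Gamma(k, mu) cdf;
# the k = 0 term contributes 1 for x >= 0.
survival = (1 - rho) * (1 + sum(rho ** k * gamma.cdf(x, a=k, scale=mu)
                                for k in range(1, 200)))
ruin = 1 - survival

# Classical closed form for exponential claims.
assert abs(ruin - rho * np.exp(-x * (1 - rho) / mu)) < 1e-10
```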
The Pollaczeck-Khintchine formula, together with some additional assumptions on F, gives rise to an interesting asymptotic behaviour of the probability of ruin. Specifically, we have the following result.
Theorem 1.10 Suppose that λμ/c < 1 and there exists a 0 < ν < ∞ such that E(e^{−νX_1}) = 1. Then

P_x(τ_0^− < ∞) ≤ e^{−νx}

for all x > 0. If, further, the distribution F is non-lattice, then

lim_{x↑∞} e^{νx} P_x(τ_0^− < ∞) = (c − λμ) / (λ ∫_{(0,∞)} y e^{νy} F(dy) − c).
In the above theorem, the parameter ν is known as the Lundberg exponent. See Cramér (1994a,b) for a review of the appearance of these results.
In more recent times, authors have extended the idea of modelling with compound Poisson processes with drift and moved to a more general class of Lévy processes for which the measure Π is supported exclusively on (−∞, 0), and hence processes which have no positive jumps. See for example Huzak et al. (2004a,b), Chan (2004) and Klüppelberg et al. (2004). It turns out that working with this class of Lévy processes preserves the idea that the revenue of the insurance company is the aggregate superposition of lots of independent claims arriving sequentially through time, offset against a deterministic increasing process corresponding to the accumulation of premiums, even when there are an almost surely infinite number of jumps downwards (claims) in any fixed time interval. We shall provide a more detailed interpretation of the latter class in Chapter 2. In Chapter 7, amongst other things, we shall also re-examine the Pollaczeck-Khintchine formula and the asymptotic probability of ruin given in Theorem 1.10 in the light of these generalised risk models.
1.3.2 The M/G/1 queue
Let us recall the definition of the M/G/1 queue. Customers arrive at a service desk according to a Poisson process and join a queue. Customers have service times which are independent and identically distributed. Once served, they leave the queue.
The workload, W_t, at each time t ≥ 0 is defined to be the time it will take a customer who joins the back of the queue at that moment to reach the service desk; that is to say, the amount of processing time remaining in the queue at time t. Suppose that at an arbitrary moment, which we shall call time zero, the server is not idle and the workload is equal to w > 0. On the event that t is before the first time the queue becomes empty, we have that

W_t = w + Σ_{i=1}^{N_t} ξ_i − t,

where now N = {N_t : t ≥ 0} is a Poisson process recording the arrivals of customers and {ξ_i : i ≥ 1} are their independent and identically distributed service times, with common distribution F and mean μ. In other words, until the queue first empties, the workload decreases at unit rate and each new arrival incurs a jump in W which has distribution F. The process proceeds as the compound Poisson process described above until the queue next empties, and so on.
The workload is clearly not a Lévy process, as it is impossible for {W_t : t ≥ 0} to decrease in value from the state zero, whereas it can decrease in value from any other state x > 0. However, it turns out that it is quite easy to link the workload to a familiar functional of a Lévy process which is also a Markov process. Specifically, suppose we define X_t to be precisely the same Lévy process given in the Cramér-Lundberg risk model with c = 1 and x = 0; then
W_t = (w ∨ X̄_t) − X_t, t ≥ 0,

where the process X̄ := {X̄_t : t ≥ 0} is the running supremum of X, hence X̄_t = sup_{u≤t} X_u. Whilst it is easy to show that the pair (X, X̄) is a Markov process, with a little extra work it can be shown that W is a Strong Markov Process (this is dealt with later in Exercise 3.2). Clearly then, under P, the process W behaves like −X until the random time
τ_w^+ := inf{t > 0 : X_t > w}.
The latter is in fact a stopping time, since {τ_w^+ ≤ t} = {X̄_t ≥ w} and the latter event belongs to the filtration generated by the process X. At the time τ_w^+, the process W = {W_t : t ≥ 0} first becomes zero and, on account of the Strong Markov Property and the lack of memory property, it remains so for a period of time which is exponentially distributed with parameter λ, since during this period w ∨ X̄_t = X_t = X̄_t. At the end of this period, X makes another negative jump distributed according to F, and hence W makes a positive jump with the same distribution, and so on, thus matching the description in the previous paragraph; see Figure 1.5.
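The reflection identity above can be checked numerically against the direct queueing description of the workload. In the sketch below (our own naming; exponential service times are an illustrative assumption only), both descriptions are evaluated at arrival epochs, where each can be computed exactly:

```python
import random

def mg1_workload_paths(w, lam=1.0, mu=0.8, n_arrivals=500, seed=7):
    """At each arrival epoch, compute the M/G/1 workload in two ways:
    (a) the direct queueing recursion (drain at unit rate, jump by the
        service time of each arrival), and
    (b) the reflection formula W_t = (w v Xbar_t) - X_t, where X_t is t
        minus the total service time of arrivals up to t (c = 1, x = 0).
    Exponential service times (mean mu) are an illustrative assumption."""
    rng = random.Random(seed)
    x, xbar = 0.0, 0.0
    w_rec = w
    pairs = []
    for _ in range(n_arrivals):
        gap = rng.expovariate(lam)       # inter-arrival time
        xi = rng.expovariate(1.0 / mu)   # service time
        x += gap                         # X creeps up at unit rate ...
        xbar = max(xbar, x)              # ... so its sup is attained pre-jump
        x -= xi                          # downward jump at the arrival
        w_rec = max(w_rec - gap, 0.0) + xi
        w_ref = max(w, xbar) - x
        pairs.append((w_rec, w_ref))
    return pairs

pairs = mg1_workload_paths(w=1.5)
max_gap = max(abs(a - b) for a, b in pairs)
```

The two columns agree to floating-point accuracy, which is exactly the content of the identity W_t = (w ∨ X̄_t) − X_t evaluated along a single path.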
Note that this description still makes sense when w = 0, in which case, for an initial period of time which is exponentially distributed, W remains equal to zero until X first jumps (corresponding to the first arrival in the queue).

There are a number of fundamental points of interest concerning both local and global behavioural properties of the M/G/1 queue. Take, for example, the time it takes before the queue first empties; in other words, τ_w^+. It is clear from a simple analysis of the paths of X and W that the latter is finite with probability one if the underlying process X drifts to infinity with probability one. Using similar reasoning to the previous example, with the help of the Strong Law of Large Numbers, it is easy to deduce that this happens when
λμ < 1. Another common situation of interest in this model corresponds to the case that the server is only capable of dealing with a maximum workload of z units of time. The first time the workload exceeds the buffer level z,

σ_z := inf{t > 0 : W_t > z},

therefore becomes of interest. In particular, one is interested in the probability of {σ_z < τ_w^+}, which corresponds to the event that the workload exceeds the buffer level before the server can complete a busy period.
The following two theorems give some classical results concerning the idle time of the M/G/1 queue and the stationary distribution of the workload.
Fig. 1.5 Sample paths of X and W.
Roughly speaking, they say that when there is heavy traffic (λμ > 1), eventually the queue never becomes empty, the workload grows to infinity and the total time that the queue remains empty is finite with a particular distribution. Further, when there is light traffic (λμ < 1), the queue repeatedly becomes empty and the total idle time grows to infinity, whilst the workload process converges in distribution. At the critical value λμ = 1, the workload grows to arbitrarily large values but nonetheless the queue repeatedly becomes empty and the total idle time grows to infinity. Ultimately, all these properties are a reinterpretation of the long-term behaviour of a special class of reflected Lévy processes.
Theorem 1.11 Suppose that W is the workload of an M/G/1 queue with arrival rate λ and service distribution F having mean μ. Define the total idle time

I = ∫_0^∞ 1_{(W_t = 0)} dt.

(i) Suppose that λμ > 1, and let θ* be the largest root of the equation ψ(θ) = 0, where

ψ(θ) = θ − λ ∫_{(0,∞)} (1 − e^{−θx}) F(dx), θ ≥ 0.

Then I is almost surely finite and, for x ≥ 0,

P(I ∈ dx | W_0 = w) = (1 − e^{−θ*w}) δ_0(dx) + θ* e^{−θ*(w+x)} dx.

(ii) If λμ ≤ 1, then I is infinite with probability one.
Note that the function ψ given above is nothing more than the Laplace exponent of the underlying Lévy process.
Theorem 1.12 Let W be the same as in Theorem 1.11.
(i) Suppose that λμ < 1. Then, for all w ≥ 0, the virtual waiting time has a stationary distribution: for x ≥ 0,

lim_{t↑∞} P(W_t ≤ x) = (1 − ρ) Σ_{n≥0} ρ^n η*^n(x),

where

η(x) = (1/μ) ∫_0^x F(y, ∞) dy and ρ = λμ.

(ii) If λμ ≥ 1, then lim sup_{t↑∞} W_t = ∞ with probability one.
Some of the conclusions in the above two theorems can already be obtained with basic knowledge of compound Poisson processes. Theorem 1.11 is proved in Exercise 1.9 and gives a scent of some more of the fluctuation theory that shall be touched upon later in this text. The remarkable similarity between Theorem 1.12 part (i) and the Pollaczeck-Khintchine formula is of course no coincidence. The principles responsible for the latter two results are embedded within the general fluctuation theory of Lévy processes. Indeed, we shall revisit Theorems 1.11 and 1.12, but for more general versions of the workload process of the M/G/1 queue known as general storage models. Such generalisations involve working with a general class of Lévy processes with no positive jumps (that is, Π(0, ∞) = 0) and defining, as before, W_t = (w ∨ X̄_t) − X_t. When there are an infinite number of jumps in each finite time interval, the latter process may be thought of as modelling a processor which deals with an arbitrarily large number of small jobs and occasional large jobs. The precise interpretation of such a generalised M/G/1 workload process, and issues concerning the distribution of the busy period, the stationary distribution of the workload, the time to buffer overflow and other related quantities, will be dealt with later in Chapters 2, 4 and 8.
Exercises

(i) Show that Γ_p is infinitely divisible.
(ii) Show that S_{Γ_p} is infinitely divisible.
1.3 (Proof of Lemma 1.7) In this exercise we derive the Frullani identity.
(i) Show, for any function f such that f′ exists and is continuous and f(0) and f(∞) are finite, that

∫_0^∞ (f(ax) − f(bx))/x dx = (f(0) − f(∞)) log(b/a),

where b > a > 0.
(ii) By choosing f(x) = e^{−x}, b = α − z, where z < 0 and a = α > 0, show that

1/(1 − z/α)^β = exp{ −∫_0^∞ (1 − e^{zx}) (β/x) e^{−αx} dx },

and hence, by analytic extension, show that the above identity is still valid for all z ∈ C such that ℜz < 0. Taking limits as ℜz ↑ 0, show that the identity is still valid for ℜz = 0.
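The identity in (ii) is easy to check numerically for real z < 0. The following sketch (our own code, using a crude trapezoid rule rather than anything rigorous) compares the two sides for one choice of the parameters:

```python
import math

def frullani_rhs(z, alpha, beta, upper=60.0, n=600000):
    """exp(-int_0^inf (1 - e^{zx}) (beta/x) e^{-alpha x} dx), evaluated by
    the trapezoid rule; crude but adequate for a sanity check."""
    def f(x):
        if x == 0.0:
            return -z * beta  # limit of (1 - e^{zx})/x as x -> 0 is -z
        return (1.0 - math.exp(z * x)) * (beta / x) * math.exp(-alpha * x)
    h = upper / n
    total = 0.5 * (f(0.0) + f(upper))
    for i in range(1, n):
        total += f(i * h)
    return math.exp(-total * h)

z, alpha, beta = -1.0, 2.0, 1.5
lhs = (1.0 - z / alpha) ** (-beta)  # 1/(1 - z/alpha)^beta
rhs = frullani_rhs(z, alpha, beta)
```

Here the integrand is bounded near the origin precisely because of the limit computed in the comment, which is the same cancellation that makes the Frullani integral converge.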
1.4 Establishing formulae (1.7) and (1.8) from the Lévy measure given in (1.9) is the result of a series of technical manipulations of special integrals. In this exercise we work through them. In the following text we shall use the gamma function Γ(z), defined by

Γ(z) = ∫_0^∞ t^{z−1} e^{−t} dt

for z > 0. Note that the gamma function can also be analytically extended so that it is also defined on R\{0, −1, −2, ...} (see Lebedev (1972)). Whilst the specific definition of the gamma function for negative numbers will not play an important role in this exercise, the following two facts which can be derived from it will. For z ∈ R\{0, −1, −2, ...} the gamma function observes the recursion Γ(1 + z) = zΓ(z), and the value z = 1/2 allows a computable value of the gamma function, namely Γ(1/2) = √π.
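Both facts are easy to confirm numerically with the standard library's gamma function; the following quick check (our own, purely illustrative) also exhibits a value at a negative non-integer forced by the recursion:

```python
import math

# The recursion Gamma(1+z) = z*Gamma(z), the value Gamma(1/2) = sqrt(pi),
# and a consequence at a negative non-integer: Gamma(-1/2) = -2*sqrt(pi).
z = 0.3
recursion_gap = abs(math.gamma(1.0 + z) - z * math.gamma(z))
half = math.gamma(0.5)
neg_half = math.gamma(-0.5)
```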
(i) Suppose that 0 < α < 1. Prove that, for u ≥ 0,

∫_0^∞ (e^{−ur} − 1) r^{−α−1} dr = Γ(−α) u^α.   (1.15)

(ii) Show that the identity in (i) extends to purely imaginary u, as well as the complex conjugate of both sides being equal. Deduce (1.7) by considering the resulting integral.
(iii) Now suppose that 1 < α < 2. Integrate (1.15) by parts to reach

∫_0^∞ (e^{ir} − 1 − ir) r^{−α−1} dr = Γ(−α) e^{−iπα/2}.

Consider the above integral for z = ξθ, where ξ = ±1 and θ ∈ R, and deduce the identity (1.7) in a similar manner to the proof in (i) and (ii).

1.5 Prove for any θ ∈ R that
exp{iθX_t + tΨ(θ)}, t ≥ 0,

is a martingale, where {X_t : t ≥ 0} is a Lévy process with characteristic exponent Ψ.
1.6 In this exercise we shall work out in detail the features of the Inverse Gaussian process discussed earlier in this chapter. Recall that τ = {τ_s : s ≥ 0} is a non-decreasing Lévy process defined by τ_s = inf{t ≥ 0 : B_t + bt > s}, s ≥ 0, where B = {B_t : t ≥ 0} is a standard Brownian motion and b > 0.
(i) Argue along the lines of Exercise 1.5 to show that, for each λ > 0,

E(e^{−λτ_s}) = e^{−s(√(b² + 2λ) − b)}.

(ii) Defining the measure Π(dx) = (2πx³)^{−1/2} e^{−xb²/2} dx on x > 0, check using (1.15) from Exercise 1.4 that

∫_0^∞ (1 − e^{−λx}) Π(dx) = √(b² + 2λ) − b.
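The first-passage definition of τ_s lends itself to a quick simulation check of the Laplace transform in part (i). The sketch below is our own code; the Euler grid makes the crossing detection slightly biased upwards, so only rough agreement should be expected:

```python
import random
import math

def first_passage_times(s=1.0, b=1.0, dt=0.002, n_paths=3000, seed=11):
    """Euler-grid simulation of tau_s = inf{t : B_t + b t > s}.  Crossing
    detection on a grid misses excursions between grid points, so tau_s
    is mildly overestimated; this is only a rough sanity check."""
    rng = random.Random(seed)
    sd = math.sqrt(dt)
    taus = []
    for _ in range(n_paths):
        level, t = 0.0, 0.0
        while level <= s and t < 40.0:   # passage is a.s. finite since b > 0
            level += b * dt + sd * rng.gauss(0.0, 1.0)
            t += dt
        taus.append(t)
    return taus

lam = 0.5
taus = first_passage_times()
mc_laplace = sum(math.exp(-lam * t) for t in taus) / len(taus)
exact = math.exp(-(math.sqrt(1.0 + 2.0 * lam) - 1.0))  # s = b = 1
```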
1.8 (Proof of Theorem 1.9) As we shall see in this exercise, the proof of Theorem 1.9 follows from the proof of a more general result, given by the conclusion of parts (i)–(v) below, for random walks.
(i) Suppose that S = {S_n : n ≥ 0} is a random walk with S_0 = 0 and jump distribution μ. By considering the variables S_k* := S_n − S_{n−k} for k = 0, 1, ..., n, and noting that the joint distributions of (S_0, ..., S_n) and (S_0*, ..., S_n*) are identical, show that for all y > 0 and n ≥ 1,

P(S_n ∈ dy and S_n > S_j for j = 0, ..., n − 1) = P(S_n ∈ dy and S_j > 0 for j = 1, ..., n).
[Hint: it may be helpful to draw a diagram of the path of the first n steps of S and to rotate it through 180 degrees.]
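The duality identity in part (i) can be checked exhaustively for a small walk. The following sketch (entirely our own construction; steps of ±1 with equal probability are an arbitrary illustrative choice) enumerates all paths of a fixed length and compares the two probabilities as exact rationals:

```python
from itertools import product
from fractions import Fraction

def duality_check(n=6, y=2, steps=(-1, 1)):
    """Exhaustively verify, for a +/-1 walk of length n, that
    P(S_n = y, S_n > S_j for all j < n) = P(S_n = y, S_j > 0 for all j >= 1)."""
    lhs = rhs = 0
    for path in product(steps, repeat=n):
        partial = [0]
        for step in path:
            partial.append(partial[-1] + step)
        if partial[-1] == y:
            if all(partial[-1] > s for s in partial[:-1]):
                lhs += 1                       # S_n is a strict new maximum
            if all(s > 0 for s in partial[1:]):
                rhs += 1                       # the whole path stays positive
    total = len(steps) ** n
    return Fraction(lhs, total), Fraction(rhs, total)

p_new_max, p_positive = duality_check()
```

Counting paths rather than sampling them means the two probabilities agree exactly, which is the content of the rotation argument in the hint.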
(ii) Next prove that, for x ≤ 0 and n ≥ 1,

P(S_1 > 0, ..., S_n > 0, S_{n+1} ∈ dx) = ∫_{(0,∞)} P(S_1 > 0, ..., S_n > 0, S_n ∈ dy) μ(dx − y).   (1.16)

(iii) Define
T_0^− = inf{n > 0 : S_n ≤ 0} and T_0^+ = inf{n > 0 : S_n > 0}.

By summing (1.16) over n, show that, for x ≤ 0,

P(S_{T_0^−} ∈ dx) = ∫_{(0,∞)} V(dy) μ(dx − y),

where V(dy) = δ_0(dy) + Σ_{n≥1} P(S_1 > 0, ..., S_n > 0, S_n ∈ dy).
(iv) Now specialise to the random walk associated with the Cramér-Lundberg model, whose increments are equal in distribution to c e_β − ξ_1, where e_β is an exponentially distributed random variable with parameter β = λ/c, independent of ξ_1, and E_β is expectation with respect to the random variable e_β.
(v) Since upward jumps are exponentially distributed in this random walk, use the lack of memory property to reason that

V(dy) = δ_0(dy) + β dy.

Hence deduce from part (iii) the statement of Theorem 1.9.
1.9 (Proof of Theorem 1.11) Suppose that X is the Lévy process underlying the workload of the M/G/1 queue in Sect. 1.3.2, that is, the Cramér-Lundberg process with c = 1 and x = 0.
(i) Show, by analytic extension from the Lévy-Khintchine formula or otherwise, that E(e^{θX_t}) = e^{ψ(θ)t} for all θ ≥ 0, where ψ is the Laplace exponent appearing in Theorem 1.11, and describe the roots of ψ(θ) = 0 according to the sign of 1 − λμ.
(ii) Show that {e^{θX_t − ψ(θ)t} : t ≥ 0} is a martingale, and hence so is {e^{θ* X_{t ∧ τ_x^+}} : t ≥ 0}, where τ_x^+ = inf{t > 0 : X_t > x}, x > 0, and θ* is the largest root described in the previous part of the question. Show further that

P(X̄_∞ > x) = e^{−θ* x}

for all x > 0.
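The exponential tail in part (ii) can be probed by simulation. In the sketch below (our own construction; exponential jump sizes are an illustrative assumption, for which θ* = λ − 1/μ can be computed by hand from ψ(θ) = 0), the supremum of X is tracked at the instants just before jumps, where it is necessarily attained because X creeps upwards between jumps:

```python
import random
import math

def sup_tail_estimate(lam=2.0, mu=1.0, x=1.0, n_paths=20000,
                      n_jumps=200, seed=3):
    """Estimate P(sup_t X_t > x) for X_t = t - (compound Poisson), with
    lam*mu > 1 so that X drifts to -infinity and the supremum is finite.
    X creeps upwards between jumps, so its running supremum is attained
    just before jumps; we track the embedded walk at those instants.
    Exponential jump sizes (mean mu) are an illustrative assumption."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_paths):
        pos, best = 0.0, 0.0
        for _ in range(n_jumps):
            pos += rng.expovariate(lam)        # linear ascent until next jump
            best = max(best, pos)
            if best > x or pos < -60.0:        # decided, or negligible future
                break
            pos -= rng.expovariate(1.0 / mu)   # downward jump
        if best > x:
            exceed += 1
    return exceed / n_paths

est = sup_tail_estimate()
# With exponential jumps, psi(theta) = 0 gives theta* = lam - 1/mu = 1.
exact = math.exp(-1.0)
```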
(iii) Show that, for all t ≥ 0,

∫_0^t 1_{(W_s = 0)} ds = (X̄_t − w) ∨ 0,

and hence deduce the conclusion of Theorem 1.11.
2 The Lévy-Itô Decomposition and Path Structure

The main aim of this chapter is to establish a rigorous understanding of the structure of the paths of Lévy processes. The way we shall do this is to prove the assertion in Theorem 1.6 that, given any characteristic exponent Ψ belonging to an infinitely divisible distribution, there exists a Lévy process with the same characteristic exponent. This will be done by establishing the so-called Lévy-Itô decomposition, which describes the structure of a general Lévy process in terms of three independent auxiliary Lévy processes, each with a different type of path behaviour. In doing so it will be necessary to digress temporarily into the theory of Poisson point processes and associated square integrable martingales. Understanding the Lévy-Itô decomposition will allow us to distinguish a number of important general subclasses of Lévy processes according to their path type. The chapter is concluded with a discussion of the interpretation of the Lévy-Itô decomposition in the context of some of the applied probability models mentioned in the previous chapter.
2.1 The Lévy-Itô decomposition
According to Theorem 1.3, any characteristic exponent Ψ belonging to an infinitely divisible distribution can be written, after some simple reorganisation, in the form

Ψ(θ) = { iaθ + (1/2)σ²θ² }
     + { Π(R\(−1, 1)) ∫_{R\(−1,1)} (1 − e^{iθx}) [Π(dx)/Π(R\(−1, 1))] }
     + { ∫_{(−1,1)\{0}} (1 − e^{iθx} + iθx) Π(dx) }   (2.1)

for all θ ∈ R, where a ∈ R, σ ≥ 0 and Π is a measure concentrated on R\{0} satisfying ∫_{R\{0}} (1 ∧ x²) Π(dx) < ∞. Note in particular that the latter condition on Π
implies that it is a measure which is finite on intervals which are bounded and contained in the interior of R\{0}, as well as having the property that Π(R\(−1, 1)) ∈ [0, ∞). In the case that Π(R\(−1, 1)) = 0, one should think of the second bracket in (2.1) as absent. Call the three brackets in (2.1) Ψ^(1),
Ψ^(2) and Ψ^(3). The essence of the proof boils down to showing that Ψ^(1), Ψ^(2) and Ψ^(3) all correspond to the characteristic exponents of three different types of Lévy processes. Therefore Ψ may be considered as the characteristic exponent of the independent sum of these three Lévy processes, which is again a Lévy process (cf. Exercise 1.1). Indeed, as we have already seen in Chapter 1, Ψ^(1) and Ψ^(2) correspond, respectively, to a scaled Brownian motion with drift, X^(1) = {X_t^(1) : t ≥ 0}, where
X_t^(1) = σB_t − at, t ≥ 0,   (2.2)

and a compound Poisson process, say X^(2) = {X_t^(2) : t ≥ 0}, where

X_t^(2) = Σ_{i=1}^{N_t} ξ_i, t ≥ 0,   (2.3)

{N_t : t ≥ 0} is a Poisson process with rate Π(R\(−1, 1)) and {ξ_i : i ≥ 1} are independent and identically distributed with common distribution Π(dx)/Π(R\(−1, 1)) concentrated on {x : |x| ≥ 1}.

The proof of existence of a Lévy process with characteristic exponent given by (2.1) thus boils down to showing the existence of a Lévy process, X^(3), whose characteristic exponent is given by Ψ^(3).
The identification of a Lévy process, X, as the independent sum of the processes X^(1), X^(2) and X^(3) is attributed to Lévy (1954) and Itô (1942) (see also Itô (2004)) and is thus known as the Lévy-Itô decomposition. Formally speaking, and in a little more detail, we quote the Lévy-Itô decomposition in the form of a theorem.
Theorem 2.1 (Lévy-Itô decomposition) Given any a ∈ R, σ ≥ 0 and measure Π supported in R\{0} satisfying

∫_{R\{0}} (1 ∧ x²) Π(dx) < ∞,
there exists a probability space on which three independent Lévy processes exist, X^(1), X^(2) and X^(3), where X^(1) is a scaled Brownian motion with drift given by (2.2), X^(2) is a compound Poisson process given by (2.3) and X^(3) is a square integrable martingale with an almost surely countable number of jumps on each finite time interval, which are of magnitude less than unity, and with characteristic exponent given by Ψ^(3). By taking X = X^(1) + X^(2) + X^(3) we see that the conclusion of Theorem 1.6 holds: there exists a probability space on which a Lévy process is defined with characteristic exponent Ψ given by (2.1).
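As a numerical illustration of the decomposition (a sketch under our own conventions: the finite measure Π(dx) = e^{−|x|}dx is chosen so that each piece can be sampled directly; with Π finite, X^(3) is itself a compensated compound Poisson process, and here its compensator vanishes by symmetry of Π), the following samples X_1 = X_1^(1) + X_1^(2) + X_1^(3) and compares the empirical mean and variance with E X_1 = −a and Var X_1 = σ² + ∫ x² Π(dx):

```python
import random
import math

def sample_poisson(rng, rate):
    # Knuth's product-of-uniforms method; fine for small rates.
    limit, k, p = math.exp(-rate), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def sample_X1(rng, a=-0.3, sigma=0.5):
    """One sample of X_1 as the sum of the three independent pieces, for
    the illustrative finite Levy measure Pi(dx) = e^{-|x|} dx."""
    part1 = sigma * rng.gauss(0.0, 1.0) - a          # Brownian part with drift
    big_rate = 2.0 * math.exp(-1.0)                  # Pi(R \ (-1, 1))
    part2 = 0.0
    for _ in range(sample_poisson(rng, big_rate)):   # jumps of magnitude >= 1
        part2 += rng.choice((-1.0, 1.0)) * (1.0 + rng.expovariate(1.0))
    small_rate = 2.0 * (1.0 - math.exp(-1.0))        # Pi((-1, 1) \ {0})
    part3 = 0.0
    for _ in range(sample_poisson(rng, small_rate)): # jumps of magnitude < 1
        u = -math.log(1.0 - rng.random() * (1.0 - math.exp(-1.0)))
        part3 += rng.choice((-1.0, 1.0)) * u
    return part1 + part2 + part3

rng = random.Random(2024)
samples = [sample_X1(rng) for _ in range(40000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# Theory: E X_1 = -a = 0.3 and Var X_1 = sigma^2 + int x^2 Pi(dx) = 0.25 + 4.
```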
2.2 Poisson point processes
Poisson point processes turn out to be the right mathematical mechanism to describe the jump structure embedded in any Lévy process. Before engaging in an abstract study of Poisson point processes, however, we first give a rough idea of how they are related to the jump structure of Lévy processes by considering the less complicated case of a compound Poisson process.
Suppose then that X = {X_t : t ≥ 0} is a compound Poisson process with a drift, taking the form

X_t = dt + Σ_{i=1}^{N_t} ξ_i, t ≥ 0,

where d ∈ R, N = {N_t : t ≥ 0} is a Poisson process with rate λ > 0 and {ξ_i : i ≥ 1} are independent and identically distributed with common distribution F. The jumps of X may be described by the random set Υ_CPP := {(T_i, ξ_i) : i ≥ 1}, where {T_i : i ≥ 1} are the times of arrival of the Poisson process N. See Figure 2.1. Suppose now that we pick any set A ∈ B[0, ∞) × B(R\{0}) and define

N(A) = #{i ≥ 1 : (T_i, ξ_i) ∈ A}.

Fig. 2.1 The initial period of a sample path of a compound Poisson process with drift, and the points of the associated process Υ_CPP.
The points of Υ_CPP are naturally ordered in time. In particular, for any A ∈ B[0, t] × B(R\{0}), the random variable N(A), conditional on the event {N_t = n}, is a binomial random variable with probability of success ∫_A t^{−1} ds × F(dx). The following lemma gives a generalisation of this latter statement for the k-tuple (N(A_1), ..., N(A_k)), where A_1, ..., A_k are mutually disjoint and chosen from B[0, t] × B(R\{0}).

Lemma 2.2 Let A_0 = {[0, t] × R}\{A_1 ∪ ... ∪ A_k} and, for Σ_{i=1}^k n_i ≤ n, write n_0 = n − Σ_{i=1}^k n_i. Then

P(N(A_1) = n_1, ..., N(A_k) = n_k) = Π_{i=1}^k e^{−η(A_i)} η(A_i)^{n_i}/n_i!,

where η(A) = ∫_A λ ds F(dx); that is to say, N(A_1), ..., N(A_k) are independent Poisson random variables with respective parameters η(A_1), ..., η(A_k).
This lemma shows that the process Υ_CPP fulfills the following definition of a Poisson point process.
Definition 2.3 (Poisson point processes) In what follows we shall assume that (S, S, η) is an arbitrary σ-finite measure space such that singletons of S are measurable (that is to say, {x} ∈ S for each x ∈ S). Suppose that the probability space (Ω, F, P) is such that Υ : Ω → S^∞, where S^∞ is the space of countable subsets of S, has the property that, for each A ∈ S, the variables N(A) : Ω → {0, 1, 2, ...}, defined by

N(A)[ω] := #(Υ(ω) ∩ A)

(the number of points of Υ in A), are all F-measurable. That is to say, for each n = 0, 1, 2, ..., we have that

{ω : N(A)[ω] = n} ∈ F.

We shall suppress the dependency of Υ and N on ω for convenience. A Poisson point process on (S, S, η) defined on the probability space (Ω, F, P) is a random countable subset Υ of S such that
(i) the measure η is non-atomic,
(ii) for mutually disjoint A_1, ..., A_n in S, the variables N(A_1), ..., N(A_n) are independent,
(iii) for each A ∈ S, N(A) is Poisson distributed with parameter η(A) (here we allow 0 ≤ η(A) ≤ ∞).

For short, we refer to this process as a Poisson point process Υ on S with intensity measure η.
Regarding condition (i) in the above definition, suppose that there exists an x ∈ S such that η({x}) > 0. Then, from the third assumption, it would follow that

P(N({x}) ≥ 2) = 1 − e^{−η({x})} − η({x}) e^{−η({x})} > 0,

thus allowing for multiple points at singletons. For technical reasons (concerning the existence of Poisson point processes) this is an undesirable property and hence is ruled out. In the third condition we note that if η(A) = 0, then it is meant that N(A) = 0 with probability one, and if η(A) = ∞, then N(A) is infinite with probability one.
In the case of the process Υ_CPP, we see that S = [0, ∞) × {R\{0}} and dη = λdt × dF, which is clearly non-atomic. Note also that, by construction of the compound Poisson process on some probability space (Ω, F, P), all of the random variables 1_{((T_i, ξ_i) ∈ A)}, for each A ∈ B[0, ∞) × B(R\{0}), are F-measurable, and hence so are the variables N(A). Finally, we see in this example why it is desirable that the measure dη = λdt × dF is atomless, as having two points at the same time would not fit well with the description of Υ_CPP being the process of jumps of a compound Poisson process with drift.

Clearly we need to prove that a Poisson point process as defined above exists in general. This is done in the next theorem, the proof of which goes very much along the lines of the justification that the points of the process Υ_CPP form a Poisson point process.
Theorem 2.4 There exists a Poisson point process Υ on S with intensity measure η.
Proof. First suppose that S is such that η(S) < ∞. There exists a standard construction of an infinite product space, say (Ω, F, P), on which the independent random variables

N and {υ_1, υ_2, ...}

are collectively defined, such that N has a Poisson distribution with parameter η(S) and each of the variables υ_i has distribution η(dx)/η(S) on S. Define

Υ = {υ_1, υ_2, ..., υ_N}

and, for each A ∈ S, write

N(A) = Σ_{i=1}^{N} 1_{(υ_i ∈ A)},

so that N = N(S). Note that, since η has no atoms, the set Υ consists of almost surely distinct points. (Strictly speaking this needs a formal proof, but we take it for granted here and otherwise refer to Kingman (1993), p. 14.) As for each A ∈ S and i ≥ 1 the random variables 1_{(υ_i ∈ A)} are F-measurable, so are the random variables N(A).
When presented with mutually disjoint sets of S, say A_1, ..., A_k, a calculation identical to the one given in the proof of Lemma 2.2 shows again that N(A_1), ..., N(A_k) are independent and Poisson distributed with respective parameters η(A_1), ..., η(A_k), which establishes the result when η(S) < ∞.

Next suppose that η(S) = ∞. Since η is σ-finite, there exists a countable disjoint exhaustive sequence of sets B_1, B_2, ... in S such that 0 < η(B_i) < ∞ for each i ≥ 1. Define the measures η_i(·) = η(· ∩ B_i) for each i ≥ 1. The first part of this proof shows that, for each i ≥ 1, there exists some probability space (Ω_i, F_i, P_i) on which we can define a Poisson point process, say Υ_i, in (B_i, S ∩ B_i, η_i), where S ∩ B_i = {A ∩ B_i : A ∈ S} (the reader should verify easily that S ∩ B_i is indeed a sigma-algebra on B_i which contains the singletons of B_i). We shall now show that

Υ := ∪_{i≥1} Υ_i

is a Poisson point process on S with intensity η, defined on the product of the spaces (Ω_i, F_i, P_i).
The points in Υ are P-almost surely distinct, since by construction they are distinct on each B_i and the B_i are disjoint. Define, for each A ∈ S, the random variable

N(A) = Σ_{i≥1} N_i(A ∩ B_i),

where N_i is the counting function associated with Υ_i. Since the processes Υ_i are independent, for mutually disjoint A_1, ..., A_k in S the collection

{N_i(A_j ∩ B_i) : i = 1, 2, ... and j = 1, ..., k}

is also an independent sequence of variables; consequently N(A_1), ..., N(A_k) are independent Poisson random variables with respective parameters η(A_1), ..., η(A_k), as required. □
From the construction of the Poisson point process, the following corollary should be clear.

Corollary 2.5 Suppose that Υ is a Poisson point process on S with intensity measure η. Then, for each A ∈ S, Υ ∩ A is again a Poisson point process on S ∩ A with intensity measure η(· ∩ A).
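The two-step construction in the proof of Theorem 2.4 translates directly into an algorithm when η(S) < ∞. The sketch below (our own code; S = [0,1]² with η a multiple of Lebesgue measure is an arbitrary illustrative choice) samples the process and checks empirically that counts on disjoint halves are Poisson distributed and uncorrelated:

```python
import random

def sample_ppp(rng, rate=5.0):
    """The construction from the proof: on S = [0,1]^2 with intensity
    eta = rate x Lebesgue (so eta(S) = rate < infinity), draw
    N ~ Poisson(rate) and then N i.i.d. points with law eta(.)/eta(S),
    i.e. uniform on S."""
    n, t = 0, rng.expovariate(1.0)   # Poisson count via unit-rate arrivals
    while t < rate:
        n += 1
        t += rng.expovariate(1.0)
    return [(rng.random(), rng.random()) for _ in range(n)]

rng = random.Random(99)
left, right = [], []
for _ in range(20000):
    pts = sample_ppp(rng)
    left.append(sum(1 for (u, v) in pts if u < 0.5))
    right.append(sum(1 for (u, v) in pts if u >= 0.5))

m = len(left)
mean_left = sum(left) / m
var_left = sum((c - mean_left) ** 2 for c in left) / m
mean_right = sum(right) / m
cov = sum(l * r for l, r in zip(left, right)) / m - mean_left * mean_right
# Counts on the two halves should be Poisson(2.5) each and independent,
# so mean and variance should both be near 2.5 and the covariance near 0.
```

This is also a direct illustration of Corollary 2.5: restricting the sampled points to either half-square again yields a Poisson point process, with the restricted intensity.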
2.3 Functionals of Poisson point processes
Suppose, as in the previous section, that Υ is a Poisson point process on S with intensity η. Note that N : S → [0, ∞] is nothing more than a random counting measure on (S, S), as it satisfies the usual axioms of a measure; namely, N(∅) = 0 and, for disjoint sets A_1, A_2, ... in S,

N(∪_{i≥1} A_i) = Σ_{i≥1} N(A_i).

In this section, we aim to study functionals of the form