
DOCUMENT INFORMATION

Basic information

Title: Independent and Stationary Sequences of Random Variables
Authors: I. A. Ibragimov, Yu. V. Linnik
Editor: Professor J. F. C. Kingman
Institution: University of Leningrad
Field: Probability and Statistics
Type: Monograph
Year of publication: 1971
City: Leningrad
Pages: 438
File size: 13.94 MB


INDEPENDENT AND STATIONARY SEQUENCES OF RANDOM VARIABLES

I. A. IBRAGIMOV AND YU. V. LINNIK

University of Leningrad, Leningrad


Edited by

PROFESSOR J. F. C. KINGMAN

University of Oxford, Oxford, U.K.

WOLTERS-NOORDHOFF PUBLISHING, GRONINGEN, THE NETHERLANDS

© 1971 WOLTERS-NOORDHOFF PUBLISHING, GRONINGEN

No part of this book may be reproduced in any form, by print, photoprint, microfilm or any other means, without written permission from the publisher.

Library of Congress Catalog Card No. 79-119886

ISBN 90 01 41885 6

PRINTED IN THE NETHERLANDS


EDITOR'S NOTE

The notation used is substantially that of the original, with a few exceptions, of which the most notable is the use of E rather than M for mathematical expectation; V is used for variance rather than the original D, since the latter might be mistaken for standard deviation. A special symbol is used to signal the end of the proof of a theorem or lemma. In some places the argument has been recast so as to read more smoothly in English, I hope without violence to the authors' intentions. Readers will be familiar with the O, o notation, but will perhaps not recognise the symbol B, which is used in some chapters to denote a generic bounded quantity.

Oxford, October 1969

J. F. C. K.

CONTENTS

Editor's note
Preface

Chapter 1  Probability distributions on the real line: infinitely divisible laws  17
1  Probability spaces, conditional probabilities and expectations  17
2  Distributions and distribution functions  19
4  Moments and characteristic functions  24
5  Continuity of the correspondence between distributions and characteristic functions
6  A special theorem about characteristic functions  32

Chapter 2  Stable distributions; analytical properties and domains of attraction  37
2  Canonical representation of stable laws  39
3  Analytic structure of the densities of stable distributions  47
4  Asymptotic formulae for the densities $p(x; \alpha, \beta)$  54

Chapter 3  Refinements of the limit theorems for normal convergence
4  Necessary and sufficient conditions  104
5  The maximum deviation of $F_n$ from $\Phi$  111
6  Dependence of the remainder term on n and x  117

Chapter 4  Local limit theorems  120
2  Local limit theorems for lattice distributions  121
5  A refinement of the local limit theorems for the case of normal

Chapter 5  Limit theorems in $L_p$ spaces  139
2  Domains of attraction of stable laws in the $L_p$ metric  141
3  Estimates of $\|F_n - \Phi\|_p$ in the case of normal convergence  146

Chapter 6  Limit theorems for large deviations  154

Chapter 7  Richter's local theorems and Bernstein's inequality  160
2  A local limit theorem for probability densities  161
3  Calculation of the integral near a saddle point  166
4  A local limit theorem for lattice variables  167
2  The introduction of auxiliary random variables  172

Chapter 9  Monomial zones of local normal attraction  177
4  Approximation of the characteristic function by a finite Taylor
5  Derivation of the basic integral  184
3  Derivation of the fundamental integral  192
4  Application of the method of steepest descents  194
5  Completion of the proof of Theorem 10.1.1  197

Chapter 11  Narrow zones of normal attraction  198
1  Classification of narrow zones by the function h  198
4  The necessity of (11.2.2) for Class I  200
5  The sufficiency of (11.2.2) for Class I  201
6  Investigation of the fundamental integral  203
7  More investigation of the fundamental integral  204
10  Completion of the proof of Theorem 11.2.1  211
11  The corresponding integral theorem  212
12  Calculation of the auxiliary limit distribution  214
13  More about the auxiliary limit distribution  215
14  Completion of the proof of Theorem 11.2.2  217
16  The transition to Theorems 11.2.3-5  220

Chapter 12  Wide monomial zones of integral normal attraction  226
2  An upper bound for the probability of a large deviation  227
3  Introduction of auxiliary variables  229
5  Derivation of the fundamental formula  232
6  The fundamental integral formula  234
7  Study of the auxiliary integral  235
8  Expansion of R as a Taylor series  236
10  Completion of the proof of sufficiency  240
2  An upper bound for the probability of a large deviation  245
3  Investigation of the basic formula  251
4  Investigation of the fundamental integral  260
5  Investigation of the auxiliary integrals  263

Chapter 15  Approximation of distributions of sums of independent components by infinitely divisible distributions
1  Definition and general properties  284
2  Stationary processes and the associated measure-preserving
3  Hilbert spaces associated with a stationary process  288
4  Autocovariance and spectral functions of stationary processes  291
5  The spectral representation of stationary processes  292
6  The structure of $L_2$ and linear transformations of stationary
3  Conditions of weak dependence for Gaussian sequences  310
3  The variance of the integral $\int X(t)\,dt$  330
4  The central limit theorem for strongly mixing sequences  333
5  Sufficient conditions for the central limit theorem  340
6  The central limit theorem for functionals of mixing sequences  352
7  The central limit theorem in continuous time
3  The distribution of values of sums of the form $\sum f(2^k x)$  370
4  Application to the metric theory of continued fractions  374
5  Example of a sequence not satisfying the central limit theorem  384

Appendix 2  Theorems on Fourier transforms  440

PREFACE

It is difficult to indicate in a short title the contents and methods of attack of this book, and we seek therefore to do so in this preface. The problems studied here concern sums of stationary sequences of random variables, including sequences of independent and identically distributed variables. More specifically, we are concerned with the distribution function $F_n(x)$ of the sum $X_1 + X_2 + \dots + X_n$, where $X_1, X_2, \dots$ is a stationary sequence. In the independent case, asymptotic analysis of $F_n(x)$ for large $n$ is highly developed, but in the general case much less is known.

Most of the methods expounded here can be extended, for example, to problems in which the $X_n$ are not identically distributed, but the results are cumbersome and seem less final, and we therefore restrict ourselves to the stationary case. As well as the problem of summation just outlined, we include a discussion of some closely related problems of the analytical structure of stable laws.

The book presupposes a knowledge of the monograph "Limit Distributions of Sums of Independent Random Variables" by B. V. Gnedenko and A. N. Kolmogorov, whose publication in 1949 inspired much of the research we describe.

Chapters 2-5 treat problems about sums of independent, identically distributed random variables not connected with the theory of large deviations, which occupies Chapters 6-14. In Chapter 15 the problem of approximating $F_n(x)$ by infinitely divisible distributions is studied. Chapters 16-19 are devoted to limit theorems for weakly dependent stationary sequences. In Chapter 20 some unsolved problems are formulated.

Chapter 1

PROBABILITY DISTRIBUTIONS ON THE REAL LINE: INFINITELY DIVISIBLE LAWS

This chapter is of an introductory nature, its purpose being to indicate some concepts and results from the theory of probability which are used in later chapters. Most of these are contained in Chapters 1-9 of Gnedenko [47], and will therefore be cited without proof.

The first section is somewhat isolated, and contains a series of results from the foundations of the theory of probability. A detailed account may be found in [76], or in Chapter I of [31]. Some of these will not be needed in the first part of the book, in which attention is confined to independent random variables.

§ 1 Probability spaces, conditional probabilities and expectations

A probability space is a triple $(\Omega, \mathfrak{R}, P)$, where $\Omega$ is a set of elements $\omega$, $\mathfrak{R}$ a $\sigma$-algebra of subsets of $\Omega$ (called events), and $P$ a measure on $\mathfrak{R}$ with $P(\Omega) = 1$. For $E \in \mathfrak{R}$, $P(E)$ is called the probability of the event $E$. A random variable $X$ is a real-valued measurable function on $(\Omega, \mathfrak{R})$, and the measure $F$ defined on the Borel sets of the real line $R$ by $F(A) = P(X \in A)$ is called the distribution of $X$.

Several random variables $X_1, X_2, \dots, X_n$ may be combined in a random vector $X = (X_1, X_2, \dots, X_n)$, and the measure $F(A) = P(X \in A)$ defined on the Borel sets of $R^n$ is the distribution of $X$, or the joint distribution of the variables $X_1, X_2, \dots, X_n$.

More generally, if $T$ is any set of real numbers, a family of random variables $X(t)$, $t \in T$, defined on $(\Omega, \mathfrak{R}, P)$ is called a random process. Conditions for the existence of random processes with prescribed joint distributions are given by Kolmogorov's theorem [76].

A probability space is a special case of a measurable space, and the integral

$$\int_{\Omega} X(\omega)\,P(d\omega) = \int X\,dP$$

is called the expectation of $X$, and is denoted by the symbol $E(X)$.

If $X$ is a random vector with values in $R^n$ and distribution $F$, and $\varphi$ is a Borel measurable function from $R^n$ to $R$, then $\varphi(X)$ is a random variable, and

$$E\,\varphi(X) = \int_{R^n} \varphi(x)\,F(dx).$$
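The last identity is what makes expectations computable from the distribution alone. The sketch below is not from the book; the exponential law and the function $\varphi(x) = x^2$ are arbitrary illustrative choices. It compares a Monte Carlo estimate of $E\,\varphi(X)$ with a numerical evaluation of $\int \varphi(x)\,F(dx)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example: X ~ Exponential(1) and phi(x) = x**2, so E[phi(X)] = 2 exactly.
def phi(x):
    return x ** 2

# Left-hand side: Monte Carlo average of phi over realisations of X.
samples = rng.exponential(scale=1.0, size=200_000)
lhs = phi(samples).mean()

# Right-hand side: numerical integral of phi(x) against the density exp(-x) of F.
x = np.linspace(0.0, 40.0, 200_001)
dx = x[1] - x[0]
rhs = np.sum(phi(x) * np.exp(-x)) * dx

print(f"Monte Carlo estimate of E[phi(X)]: {lhs:.4f}")
print(f"Integral of phi against F:         {rhs:.4f}   (exact value: 2)")
```

Both numbers agree with the exact value 2 up to simulation and discretisation error.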

Let $\mathfrak{R}_1$ be a $\sigma$-algebra with $\mathfrak{R}_1 \subset \mathfrak{R}$, and let $X$ be a random variable with $E|X| < \infty$. The conditional expectation of $X$ relative to $\mathfrak{R}_1$ is the random variable, denoted by $E(X \mid \mathfrak{R}_1)$, which is measurable with respect to $\mathfrak{R}_1$ and satisfies

$$\int_B E(X \mid \mathfrak{R}_1)\,dP = \int_B X\,dP \qquad (B \in \mathfrak{R}_1).$$

In particular, let $A$ be an event and let $X$ be its indicator, equal to 1 for $\omega \in A$ and to 0 for $\omega \notin A$. Then $E(X \mid \mathfrak{R}_1)$ is called the conditional probability of $A$ relative to $\mathfrak{R}_1$, and is denoted by $P(A \mid \mathfrak{R}_1)$. The random variable $P(A \mid \mathfrak{R}_1)$ is measurable with respect to $\mathfrak{R}_1$, and satisfies

$$\int_B P(A \mid \mathfrak{R}_1)\,dP = P(AB) \qquad (B \in \mathfrak{R}_1).$$

We shall state various properties of conditional expectations which will be needed later (cf. [31], Chapter I). If $Y$ and $Z$ are random variables with $E|Y| < \infty$ and $E|Z| < \infty$, and if $Z$ is measurable with respect to $\mathfrak{R}_1$, then with probability one,

$$E(ZY \mid \mathfrak{R}_1) = Z\,E(Y \mid \mathfrak{R}_1). \qquad (1.1.3)$$

If $\sigma$-algebras $\mathfrak{R}_1$, $\mathfrak{R}_2$ satisfy $\mathfrak{R}_1 \subset \mathfrak{R}_2 \subset \mathfrak{R}$, then with probability one,

$$E\{E(X \mid \mathfrak{R}_2) \mid \mathfrak{R}_1\} = E\{X \mid \mathfrak{R}_1\}.$$
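Both properties can be checked directly on a finite probability space, where conditioning on a $\sigma$-algebra generated by a partition reduces to averaging over the blocks of the partition. The toy space, the partitions and the numerical values below are arbitrary choices made only for this illustration:

```python
import numpy as np

# A toy probability space Omega = {0,...,5} with probabilities p (illustrative values).
p = np.array([0.10, 0.15, 0.25, 0.20, 0.10, 0.20])
Y = np.array([1.0, 2.0, 0.5, 3.0, -1.0, 4.0])
X = np.array([2.0, -1.0, 1.5, 0.0, 2.5, 1.0])

# R1 is generated by the partition {0,1}, {2,3}, {4,5}; R2 refines it, so R1 c R2 c R.
R1 = [[0, 1], [2, 3], [4, 5]]
R2 = [[0], [1], [2, 3], [4], [5]]

def cond_exp(values, partition, probs):
    """E(values | sigma(partition)) as a function on Omega:
    constant on each block, equal to the probability-weighted block average."""
    out = np.empty_like(values, dtype=float)
    for block in partition:
        block = np.array(block)
        out[block] = np.sum(values[block] * probs[block]) / np.sum(probs[block])
    return out

# Z is measurable with respect to R1: constant on each block of R1.
Z = np.array([3.0, 3.0, -2.0, -2.0, 0.5, 0.5])

lhs = cond_exp(Z * Y, R1, p)
rhs = Z * cond_exp(Y, R1, p)
print(np.allclose(lhs, rhs))                            # property (1.1.3)

tower = cond_exp(cond_exp(X, R2, p), R1, p)
print(np.allclose(tower, cond_exp(X, R1, p)))           # E{E(X|R2)|R1} = E(X|R1)
```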

§ 2 Distributions and distribution functions

If $X$ is a random variable, its probability distribution is the measure

$$F(A) = P(X \in A)$$

on the Borel subsets of the real line. It is well known that $F$ is uniquely determined by the corresponding distribution function $F$ defined by

$$F(x) = F((-\infty, x)) = P(X < x).$$

In what follows, no distinction will be made between $F$ and $F(x)$, and we shall speak, for instance, of a random variable $X$ having distribution $F(x)$.

A probability distribution $F$ is called continuous if the measure $F$ is absolutely continuous with respect to Lebesgue measure, i.e. if it has a density with respect to Lebesgue measure. A probability distribution $F$ is said to be discrete if it is concentrated on some countable set $\{x_k\}$. If $p_k = P(X = x_k)$, then

$$F(A) = \sum_{x_k \in A} p_k, \qquad F(x) = \sum_{x_k < x} p_k.$$

Every distribution function admits a decomposition

$$F(x) = a_1 F_1(x) + a_2 F_2(x) + a_3 F_3(x)$$

into continuous, singular and discrete components. Every distribution function $F$ is non-decreasing, left-continuous, and has

$$\lim_{x \to -\infty} F(x) = 0, \qquad \lim_{x \to \infty} F(x) = 1.$$

Conversely, every function satisfying these conditions is a distribution function, since we may take $\Omega = R$, $\mathfrak{R}$ the $\sigma$-algebra of Borel sets, $P$ the Lebesgue-Stieltjes measure determined by $P\{[a, b)\} = F(b) - F(a)$, and

$$X(\omega) = \omega.$$
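The converse construction realises an arbitrary distribution function on a concrete probability space. A closely related and equally standard device, not the one used in the text, realises $F$ on $\Omega = (0, 1)$ with Lebesgue measure by means of the quantile transform $X(\omega) = F^{-1}(\omega)$; a minimal sketch:

```python
import numpy as np

def quantile_sample(F_inverse, n, rng):
    """Realise a random variable with distribution function F on the
    probability space Omega = (0,1) with Lebesgue measure, X(omega) = F^{-1}(omega)."""
    omega = rng.uniform(0.0, 1.0, size=n)
    return F_inverse(omega)

# Example: F(x) = 1 - exp(-x) for x >= 0, so F^{-1}(u) = -log(1 - u).
rng = np.random.default_rng(1)
x = quantile_sample(lambda u: -np.log1p(-u), 100_000, rng)

# Empirical check: P(X < 1) should be close to F(1) = 1 - exp(-1) ~ 0.632.
print((x < 1.0).mean(), 1.0 - np.exp(-1.0))
```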

§ 3 Convergence of distributions

(1) Convergence in variation. Define the distance $\rho_1(F, G)$ between two distributions $F$ and $G$ by

$$\rho_1(F, G) = \sup_A |F(A) - G(A)|, \qquad (1.3.1)$$

where the supremum is taken over all Borel sets $A$. A sequence of distributions $F_n$ converges in variation to a distribution $F$ if $\rho_1(F_n, F) \to 0$. It is clear that this mode of convergence can be expressed in terms of distribution functions: $\rho_1(F, G)$ is one-half the total variation of $F(x) - G(x)$. For continuous distributions with densities $p$ and $q$,

$$\rho_1(F, G) = \tfrac{1}{2}\int_{-\infty}^{\infty} |p(x) - q(x)|\,dx,$$

while for discrete distributions

$$\rho_1(F, G) = \tfrac{1}{2}\sum_x |F(x+0) - F(x) - G(x+0) + G(x)|,$$

the summand being zero except at a countable number of values of $x$.

(2) Strong convergence. Suppose that in (1.3.1) we take the supremum, not over all Borel sets $A$, but only over intervals $\Delta$. This gives a new distance

$$\rho_2(F, G) = \sup_{\Delta} |F(\Delta) - G(\Delta)|.$$

Equivalently, one may use the distance $\bar\rho_2(F, G) = \sup_x |F(x) - G(x)|$, since

$$\bar\rho_2(F, G) \le \rho_2(F, G) \le 2\,\bar\rho_2(F, G).$$

Convergence in either of these metrics is called strong convergence.
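For two discrete distributions both distances can be computed exactly. The sketch below (the two binomial laws are an arbitrary choice of mine) evaluates $\rho_1$ as half the total variation and $\rho_2$ as a supremum over intervals of the support:

```python
import numpy as np
from math import comb

# Two discrete laws on {0,...,10}: Binomial(10, 0.50) and Binomial(10, 0.55).
def binom_pmf(n, q):
    return np.array([comb(n, k) * q**k * (1 - q)**(n - k) for k in range(n + 1)])

p = binom_pmf(10, 0.50)
q = binom_pmf(10, 0.55)

# rho_1 = sup over Borel sets of |F(A) - G(A)| = one half of sum_k |p_k - q_k|.
rho1 = 0.5 * np.abs(p - q).sum()

# rho_2 = sup over intervals [a, b] of |F([a,b]) - G([a,b])|, via cumulative sums of p - q.
diff = np.cumsum(p - q)
rho2 = max(abs(diff[b] - (diff[a - 1] if a > 0 else 0.0))
           for a in range(11) for b in range(a, 11))

print(f"rho_1 = {rho1:.4f}, rho_2 = {rho2:.4f}  (rho_2 <= rho_1 always)")
```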

(3) Weak convergence. A sequence of distributions $F_n$ is said to converge weakly to a distribution $F$ if $F_n(x) \to F(x)$ at every point of continuity of $F$.*

* See [48], page 38. Every distribution $F$ generates a linear functional $(F, f) = \int f(x)\,dF(x)$ in the space $C$ of continuous functions with limits at $\pm\infty$. Weak convergence of distributions is equivalent to weak convergence of the corresponding functionals, i.e. $F_n \to F$ if and only if $(F_n, f) \to (F, f)$ for all $f \in C$.

Weak convergence can also be characterised by smoothing. Consider the sum $X + \xi$, where $X$ has the distribution $F$, and $\xi$, independent of $X$, has a normal distribution with mean zero and variance $\sigma^2$; denote by $F^{(\sigma)}$ the distribution of this sum and by $\varphi_\sigma$ the density of the distribution of $\xi$. Then for any distribution $G$, $G^{(\sigma)}$ is a continuous distribution with density

$$g^{(\sigma)}(x) = \int_{-\infty}^{\infty} \varphi_\sigma(x - u)\,dG(u).$$

Define a type of convergence by saying that $F_n \Rightarrow F$ if, for all $\sigma > 0$, $F_n^{(\sigma)}(x) \to F^{(\sigma)}(x)$ for every $x$. Splitting the difference $F_n^{(\sigma)}(x) - F^{(\sigma)}(x)$ at points $\pm A$, where $A$ may be assumed to be a point of continuity of $F$, and letting first $n \to \infty$ and then $A \to \infty$ (formulae (1.3.5)-(1.3.7)), one sees that weak convergence of $F_n$ to $F$ implies $F_n \Rightarrow F$. Conversely, suppose that $F_n \Rightarrow F$, and suppose if possible that $F_n$ does not converge weakly to $F$. Then there exists $x_0$, a point of continuity of $F$, and $\delta > 0$ such that $|F_n(x_0) - F(x_0)| > \delta$ along a subsequence; comparing $F_n^{(\sigma)}(x_0)$ with $F^{(\sigma)}(x_0)$ for small $\sigma$ then leads to a contradiction, so that the two modes of convergence coincide.
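A small numerical sketch of both descriptions; the binomial-to-Poisson example and all numerical choices are mine, not the book's. The distribution functions converge at a continuity point of the limit, and the smoothed distribution functions, obtained by adding independent $N(0, \sigma^2)$ noise, also come together for a fixed $\sigma > 0$:

```python
import numpy as np
from math import comb, exp

# Illustrative example: Binomial(n, lam/n) converges weakly to Poisson(lam).
lam = 3.0

def binom_cdf(x, n, q):                      # F(x) = P(X < x), as in the text
    return sum(comb(n, j) * q**j * (1 - q)**(n - j) for j in range(n + 1) if j < x)

def poisson_cdf(x, lam, kmax=60):
    total, term = 0.0, exp(-lam)             # term = P(X = j), updated recursively
    for j in range(kmax):
        if j < x:
            total += term
        term *= lam / (j + 1)
    return total

# Convergence at x = 2.5, a continuity point of the (discrete) limit law.
for n in (10, 100, 1000):
    print(n, round(binom_cdf(2.5, n, lam / n), 6), round(poisson_cdf(2.5, lam), 6))

# The smoothed distributions F^(sigma) (add independent N(0, sigma^2) noise)
# are continuous and also become close, for any fixed sigma > 0.
rng, sigma, m = np.random.default_rng(2), 0.5, 200_000
noisy_bin = rng.binomial(1000, lam / 1000, m) + sigma * rng.standard_normal(m)
noisy_poi = rng.poisson(lam, m) + sigma * rng.standard_normal(m)
print((noisy_bin < 3.0).mean(), (noisy_poi < 3.0).mean())
```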

§ 4 Moments and characteristic functions

The moments $\alpha_v$ and absolute moments $\beta_v$ of a random variable $X$ with distribution $F$ are defined respectively by

$$\alpha_v = E X^v = \int_{-\infty}^{\infty} x^v\,dF(x), \qquad \beta_v = E|X|^v = \int_{-\infty}^{\infty} |x|^v\,dF(x).$$

These satisfy $|\alpha_r| \le \beta_r$ $(r \ge 1)$ and $\beta_s^{1/s} \le \beta_r^{1/r}$ $(r > s > 0)$.

The characteristic function $f(t)$ of $X$ is defined by

$$f(t) = E\,e^{itX} = \int_{-\infty}^{\infty} e^{itx}\,dF(x). \qquad (1.4.1)$$

If $\beta_k < \infty$, then the derivatives $d^s f/dt^s$ exist at $t = 0$ for $s = 0, 1, 2, \dots, k$. As $t \to 0$,

$$f(t) = \sum_{s=0}^{k} \frac{\alpha_s}{s!}\,(it)^s + o(t^k).$$

It is a most important fact that addition of independent random variables corresponds to multiplication of characteristic functions. If the independent variables $X_i$ have respective characteristic functions $f_i(t)$, then the characteristic function of $X_1 + X_2 + \dots + X_n$ is

$$f(t) = f_1(t)\,f_2(t) \cdots f_n(t).$$

From (1.4.1) the characteristic function is uniquely determined by the distribution function. The converse is also true, and is expressed by the classical inversion formula.
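The multiplication rule is easy to verify numerically. In the sketch below (an arbitrary choice of two laws, not taken from the book) the empirical characteristic function of $X_1 + X_2$ is compared with the product of the two individual characteristic functions, whose exact forms $1/(1 - it)$ and $\sin t / t$ are known:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
x1 = rng.exponential(1.0, n)        # X1 ~ Exp(1),         cf: 1 / (1 - i t)
x2 = rng.uniform(-1.0, 1.0, n)      # X2 ~ Uniform(-1, 1), cf: sin(t) / t

def ecf(sample, t):
    """Empirical characteristic function (1/n) * sum_j exp(i t X_j)."""
    return np.exp(1j * t * sample).mean()

for t in (0.5, 1.0, 2.0):
    product = ecf(x1, t) * ecf(x2, t)
    combined = ecf(x1 + x2, t)
    exact = (1.0 / (1.0 - 1j * t)) * (np.sin(t) / t)
    print(t, abs(combined - product), abs(combined - exact))   # both differences are small
```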

If the distribution $F$ has a density $p$, then $f$ is just the Fourier transform of $p$, and by the Riemann-Lebesgue theorem,

$$\lim_{|t| \to \infty} |f(t)| = 0.$$

If $F$ is a lattice distribution, concentrated on the points $a + kh$ $(k = 0, \pm 1, \pm 2, \dots)$ with step $h$, then

$$f(t) = \sum_k p_k\,e^{it(a + kh)},$$

and consequently $|f(t)|$ is periodic with period $2\pi/h$.

Theorem 1.4.1. In order that a random variable $X$ have a lattice distribution, it is necessary and sufficient that $|f(t_0)| = 1$ for some $t_0 \ne 0$.

Proof. If $X$ has a lattice distribution with step $h$, then $|f(2\pi/h)| = 1$, which establishes the necessity.

Theorem 1.4.2. If the step of the lattice distribution is $h$, then $|f(2\pi/h)| = 1$ and $|f(t)| < 1$ for $0 < |t| < 2\pi/h$.

Proof. Suppose that $0 < |t_0| < 2\pi/h$ and $|f(t_0)| = 1$. Then the distribution is concentrated on an arithmetic progression with step $2\pi/|t_0| > h$, which contradicts the definition of $h$.
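A quick numerical illustration of Theorems 1.4.1 and 1.4.2 (the particular lattice and weights are arbitrary choices of mine): for a distribution on $\{a + kh\}$ the characteristic function has modulus 1 at $t = 2\pi/h$ and modulus strictly below 1 inside $(0, 2\pi/h)$:

```python
import numpy as np

h = 0.5
points = 1.3 + h * np.array([0, 1, 2, 5])        # lattice {a + kh} with a = 1.3, step h
probs = np.array([0.1, 0.4, 0.3, 0.2])

def cf(t):
    """Characteristic function of the discrete lattice distribution."""
    return np.sum(probs * np.exp(1j * t * points))

print(abs(cf(2 * np.pi / h)))                    # equals 1 (up to rounding)
ts = np.linspace(1e-3, 2 * np.pi / h - 1e-3, 2000)
print(max(abs(cf(t)) for t in ts))               # strictly below 1
```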

§ 5 Continuity of the correspondence between distributions and characteristic functions

The correspondence between probability distributions on the real line and their characteristic functions is not only one-to-one, but also continuous in the following sense.

Theorem 1.5.1. A sequence $(F_n)$ of distributions converges weakly to a distribution $F$ if and only if the corresponding sequence $(f_n)$ of characteristic functions converges uniformly in every bounded interval to the characteristic function $f$ of $F$.

For the proof of this theorem, see for example [47].
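The theorem is what justifies proving limit theorems through characteristic functions. As an illustration (my own, with arbitrary numerical choices), the empirical characteristic functions of standardised sums of uniform variables approach $e^{-t^2/2}$ on a bounded interval, in line with the central limit theorem:

```python
import numpy as np

# S_n = (X_1 + ... + X_n - n/2) / sqrt(n/12) for X_i ~ Uniform(0, 1);
# its characteristic function should approach exp(-t^2/2) on bounded t-intervals.
rng = np.random.default_rng(4)
t = np.linspace(-5.0, 5.0, 21)

for n in (1, 5, 50):
    u = rng.uniform(size=(100_000, n))
    s = (u.sum(axis=1) - n / 2) / np.sqrt(n / 12)
    ecf = np.exp(1j * np.outer(t, s)).mean(axis=1)      # empirical cf of S_n on the grid
    print(n, np.max(np.abs(ecf - np.exp(-t**2 / 2))))   # sup over the grid shrinks with n
```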

In the sequel we shall need various refinements of this theorem, permitting us to estimate the proximity of distributions, in the sense of different metrics, from the proximity of their characteristic functions. It is convenient to state these somewhat more generally for functions $G$ of bounded variation. The Fourier-Stieltjes transform

$$g(t) = \int_{-\infty}^{\infty} e^{itx}\,dG(x)$$

of such a function will again be called its characteristic function.

Theorem 1.5.2. Let $A$, $T$, $\varepsilon$ be positive constants, $F$ a non-decreasing function, $G$ a function of bounded variation, and $f$ and $g$ their characteristic functions. If

(1) $F(-\infty) = G(-\infty)$, $F(\infty) = G(\infty)$,
(2) $G'(x)$ exists for all $x$ and $|G'(x)| \le A$,
(3) $\int_{-T}^{T} \left| \frac{f(t) - g(t)}{t} \right| dt = \varepsilon$,

then for each $k > 1$ there exists a number $c(k)$, depending only on $k$, with the property that, for all $x$,

$$|F(x) - G(x)| \le \frac{k}{2\pi}\,\varepsilon + c(k)\,\frac{A}{T}.$$

Moreover, $c(2) \le 24/\pi$.

The proof, due to Esseen [33], may be found in [48] (with the unnecessary restriction that $\int_{-\infty}^{\infty} |F(x) - G(x)|\,dx < \infty$), or in [105] (which contains the estimate for $c(2)$).
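To see the inequality in action, the sketch below (all numerical choices are mine) takes $F$ to be the distribution function of a standardised sum of uniform variables and $G$ the standard normal law, evaluates the right-hand side with $k = 2$ and $c(2) = 24/\pi$, and compares it with a simulated value of $\sup_x |F(x) - G(x)|$:

```python
import numpy as np
from math import erf

# F: distribution function of S_n = (X_1 + ... + X_n)/sqrt(n), X_i ~ Uniform(-sqrt(3), sqrt(3))
# (mean 0, variance 1);  G: standard normal law, so A = sup|G'| = 1/sqrt(2*pi).
n, T = 20, 8.0
A = 1.0 / np.sqrt(2.0 * np.pi)

# Right-hand side of the inequality with k = 2 and c(2) = 24/pi.
t = np.linspace(-T, T, 40_001)
t = t[t != 0.0]
u = np.sqrt(3.0) * t / np.sqrt(n)
f = (np.sin(u) / u) ** n                    # characteristic function of S_n
g = np.exp(-t**2 / 2.0)                     # characteristic function of G
eps = np.sum(np.abs((f - g) / t)) * (t[1] - t[0])
bound = eps / np.pi + 24.0 * A / (np.pi * T)

# Left-hand side, estimated by simulation: sup_x |F(x) - G(x)|.
rng = np.random.default_rng(5)
s = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), (100_000, n)).sum(axis=1) / np.sqrt(n)
xs = np.linspace(-4.0, 4.0, 161)
emp = (s[None, :] < xs[:, None]).mean(axis=1)
phi = np.array([0.5 * (1.0 + erf(v / np.sqrt(2.0))) for v in xs])
print(f"observed sup|F - G| ~ {np.max(np.abs(emp - phi)):.4f}")
print(f"Esseen bound        {bound:.4f}   (the A/T term dominates for moderate T)")
```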

Theorem 1.5.3. Let $F$ be a non-decreasing purely discontinuous function (i.e. of the form $F = aF_1 + b$, where $F_1$ is a discrete distribution function), $G$ a function of bounded variation, and $f$, $g$ their characteristic functions. Suppose that

(1) $F(-\infty) = G(-\infty)$, $F(\infty) = G(\infty)$,
(2) the discontinuities of $F$ and $G$ are confined to a set $\{\dots, x_{-1}, x_0, x_1, \dots\}$ with $x_{v+1} - x_v \ge l$ for all $v$,
(3) for all $x$ outside this set, $G'(x)$ exists and $|G'(x)| \le A$,
(4) $\int_{-T}^{T} \left| \frac{f(t) - g(t)}{t} \right| dt = \varepsilon$.

Then $|F(x) - G(x)|$ satisfies an estimate of the same type as in Theorem 1.5.2, in which the separation $l$ of the discontinuities also enters.

For the proof, see [19] (page 214).

Theorem 1.5.4. Let $T$, $\delta$, $\varepsilon$ be constants, $F$ and $G$ functions of bounded variation, $f$ and $g$ their characteristic functions. If

(1) $F(-\infty) = G(-\infty)$, $F(\infty) = G(\infty)$,

and if $f - g$ satisfies integral smallness conditions on $[-T, T]$ with parameters $\delta$ and $\varepsilon$, then

$$\int_{-\infty}^{\infty} |F(x) - G(x)|\,dx$$

admits an estimate in terms of $\delta$, $\varepsilon$ and $T$ involving an absolute constant $c$. (It is possible to show that $c \le 4\pi$.)

Proof. Denote by $V$ the class of complex functions $a(t)$ which are Fourier-Stieltjes transforms of functions $A(x)$ of bounded variation. The norm

$$\|a\| = V(A),$$

the total variation of $A$, is well-defined, and

$$\|a_1 + a_2\| \le \|a_1\| + \|a_2\|, \qquad \|a_1 a_2\| \le \|a_1\|\,\|a_2\|.$$

Lemma 1.5.1. Suppose that $a(t)$ is absolutely continuous, and that both $a(t)$ and $a'(t)$ belong to $L_2(-\infty, \infty)$. Then $a \in V$, and $\|a\|$ satisfies the bound (1.5.3) in terms of the $L_2$ norms of $a$ and $a'$; the proof combines the Parseval relation with the Cauchy-Schwarz inequality, applied to the factorisation $(1 + ix)^{-1} \cdot (1 + ix)\alpha(x)$ of the inverse Fourier transform $\alpha$ of $a$ (formulae (1.5.5)-(1.5.7)).

Proof of Theorem 1.5.4. Integration by parts expresses $\int |F(x) - G(x)|\,dx$ through the quotient $(f(t) - g(t))/(-it)$. One introduces the auxiliary multiplier

$$u(t) = \frac{4t}{iT^2} \quad (|t| \le \tfrac{1}{2}T), \qquad u(t) = \frac{1}{it} \quad (|t| \ge \tfrac{1}{2}T),$$

sets $h = (f - g)u$, and estimates $h$ by means of Lemma 1.5.1 through the integrals $\int_{-T}^{T} |h(t)|^2\,dt$ and $\int_{-T}^{T} |h'(t)|^2\,dt$, together with a cut-off function $k$ and the bounds

$$\|h(1 - k)\| \le \|f - g\|\,\|u\|\,\|1 - k\| \le \{\|f\| + \|g\|\}\,\|u\|\,\{1 + \|k\|\} \le 4(\operatorname{Var} F + \operatorname{Var} G)\,\|u\|, \qquad \|u\| \le c/T,$$

where $c$ is an absolute constant. Combining (1.5.11), (1.5.12) and (1.5.13) proves the theorem. (It is shown in [165] that the smallest possible value for $\|u\|$ is $\pi/T$.)

§ 6 A special theorem about characteristic functions

The following theorem will be needed later.

Theorem 1.6.1. Let $f(t)$ be any characteristic function, and $v(t) = \exp(iat - \tfrac{1}{2}\sigma^2 t^2)$ the characteristic function of the normal distribution with mean $a$ and variance $\sigma^2 > 0$. Let $(t_k)$ be a sequence of points with $t_k \ne 0$, $\lim t_k = 0$. If, for all $k$, $f(t_k) = v(t_k)$, then $f(t) = v(t)$ for all $t$.

Proof. Denote by $F$ the distribution function corresponding to $f$. The proof proceeds by induction, establishing for $r = 1, 2, \dots$ the relations

$$\int_{-\infty}^{\infty} x^{2r}\,dF(x) < \infty, \qquad f^{(2r)}(0) = v^{(2r)}(0). \qquad (1.6.1)$$

To establish (1.6.1) when $r = 1$, note that

$$1 - \operatorname{Re} f(t_k) = 2\int_{-\infty}^{\infty} \sin^2(\tfrac{1}{2}t_k x)\,dF(x) = 1 - \operatorname{Re} v(t_k) = O(t_k^2).$$

For the inductive step one uses points $\tau_k \to 0$ at which

$$f^{(2r-1)}(\tau_k) = v^{(2r-1)}(\tau_k);$$

it then follows that

$$\int_{-\infty}^{\infty} x^{2r}\,dF(x) < \infty$$

and that $f^{(2r)}$ exists, with $f^{(2r)}(0) = v^{(2r)}(0)$. Thus $F$ has the same moments as the normal distribution with mean $a$ and variance $\sigma^2$, and since that distribution is uniquely determined by its moments, $f = v$.


§ 7 Infinitely divisible distributions

A distribution $F$ is said to be infinitely divisible if, for each $n$, there exists a distribution $F_n$ with

$$F = F_n^{*n}.$$

Thus a random variable $X$ with an infinitely divisible distribution can be expressed, for every $n$, in the form

$$X = X_{1n} + X_{2n} + \dots + X_{nn},$$

where the $X_{jn}$ $(j = 1, 2, \dots, n)$ are independent and identically distributed.

Theorem 1.7.1. In order that the function $f(t)$ be the characteristic function of an infinitely divisible distribution it is necessary and sufficient that

$$\log f(t) = i\gamma t - \tfrac{1}{2}\sigma^2 t^2 + \int_{-\infty}^{0}\left(e^{itu} - 1 - \frac{itu}{1+u^2}\right) dM(u) + \int_{0}^{\infty}\left(e^{itu} - 1 - \frac{itu}{1+u^2}\right) dN(u), \qquad (1.7.1)$$

where $\sigma \ge 0$, $-\infty < \gamma < \infty$, and $M$ and $N$ are non-decreasing functions with $M(-\infty) = N(\infty) = 0$ and

$$\int_{-\varepsilon}^{0} u^2\,dM(u) + \int_{0}^{\varepsilon} u^2\,dN(u) < \infty$$

for all $\varepsilon > 0$. The representation (1.7.1) is unique.

The proof may be found in [48] (page 83) or in [47] (Chapter 9). Equation (1.7.1) is called Lévy's formula. Simple examples of infinitely divisible distributions are the normal and the Poisson distributions, but we shall need also a generalised form of the latter.

The distribution $F$ is called a compound Poisson distribution if it can be represented in the form

$$F(x) = e^{-p}\sum_{k=0}^{\infty} \frac{p^k}{k!}\,G^{*k}(x),$$

where $G$ is a distribution function, and $p > 0$. The characteristic functions of $F$ and $G$ are related by the equation

$$f(t) = e^{-p}\sum_{k=0}^{\infty} \frac{p^k}{k!}\,g(t)^k = \exp\{p(g(t) - 1)\} = \exp\left\{\int_{-\infty}^{\infty} (e^{itu} - 1)\,d\{pG(u)\}\right\},$$

where the last expression is clearly a special case of (1.7.1).
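The relation $f(t) = \exp\{p(g(t) - 1)\}$ can be checked by direct simulation. In the sketch below the rate $p$ and the choice $G = \mathrm{Uniform}(0, 1)$, for which $g(t) = (e^{it} - 1)/(it)$, are arbitrary assumptions made for the illustration:

```python
import numpy as np

rng = np.random.default_rng(6)
p, m = 2.5, 50_000

# Simulate X = Y_1 + ... + Y_K with K ~ Poisson(p) and Y_j i.i.d. Uniform(0, 1).
counts = rng.poisson(p, m)
totals = np.array([rng.uniform(0.0, 1.0, k).sum() for k in counts])

for t in (0.7, 1.5, 3.0):
    empirical = np.exp(1j * t * totals).mean()
    g = (np.exp(1j * t) - 1.0) / (1j * t)        # characteristic function of Uniform(0, 1)
    theoretical = np.exp(p * (g - 1.0))
    print(t, abs(empirical - theoretical))       # small, of the order of the Monte Carlo error
```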

Interest in the class of infinitely divisible laws is motivated by Khinchin's theorem (Theorem 1.7.2), which shows that only infinitely divisible distributions can arise as limits of distributions of sums of independent random variables. Consider, for each $n$, a collection of independent random variables

$$X_{n1}, X_{n2}, \dots, X_{nk_n}.$$

The $X_{nk}$ are said to be uniformly asymptotically negligible if

$$\lim_{n \to \infty} \sup_k P(|X_{nk}| > \varepsilon) = 0$$

for all $\varepsilon > 0$.

Theorem 1.7.2. In order that the distribution $F$ should be, for an appropriate choice of constants $A_n$, the weak limit of the distributions of the sums

$$\sum_{k=1}^{k_n} X_{nk} - A_n$$

of independent, uniformly asymptotically negligible random variables, it is necessary and sufficient that $F$ be infinitely divisible.

Chapter 2

STABLE DISTRIBUTIONS; ANALYTICAL PROPERTIES AND DOMAINS OF ATTRACTION

§ 1 Stable distributions

Definition. A distribution function $F$ is called stable if, for any $a_1, a_2 > 0$ and any $b_1, b_2$, there exist constants $a > 0$ and $b$ such that

$$F(a_1 x + b_1) * F(a_2 x + b_2) = F(ax + b).$$
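The simplest non-normal example is the Cauchy law: convolving two rescaled standard Cauchy distributions again gives a rescaled Cauchy distribution, and in particular $(X_1 + \dots + X_n)/n$ is again standard Cauchy for independent standard Cauchy variables. A quick simulation check (my own illustration, not from the book):

```python
import numpy as np

# For i.i.d. standard Cauchy variables, Z_n = (X_1 + ... + X_n)/n is again standard Cauchy,
# i.e. the Cauchy law is stable with alpha = 1 and norming constants B_n = n.
rng = np.random.default_rng(7)
m, n = 200_000, 50
z = rng.standard_cauchy((m, n)).sum(axis=1) / n

# Compare a few quantiles of Z_n with those of the standard Cauchy law,
# whose quantile function is tan(pi * (q - 1/2)).
for q in (0.25, 0.50, 0.75, 0.90):
    print(q, round(np.quantile(z, q), 3), round(np.tan(np.pi * (q - 0.5)), 3))
```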

Stable distributions are important because they are the only possible limit distributions for the normalised sums

$$Z_n = \frac{X_1 + X_2 + \dots + X_n - A_n}{B_n} \qquad (2.1.3)$$

of stationarily dependent random variables. In this section we establish this result for independent random variables; the general case is dealt with in Theorem 18.1.1.

Theorem 2.1.1. In order that a distribution function $F$ be the weak limit of the distributions of $Z_n$ for some sequence $(X_i)$ of independent identically distributed random variables, it is necessary and sufficient that $F$ be stable. If this is so, then unless $F$ is degenerate, the constants $B_n$ in (2.1.3) must take the form $B_n = n^{1/\alpha} h(n)$, where $0 < \alpha \le 2$ and $h(n)$ is a slowly varying function in the sense of Karamata.

Proof. Let $f$ be the common characteristic function of the $X_i$, and let $\varphi$ be the characteristic function corresponding to the distribution $F$. Since a degenerate distribution is trivially stable, we exclude this case, and prove that necessarily

$$\lim_{n \to \infty} B_n = \infty, \qquad \lim_{n \to \infty} B_{n+1}/B_n = 1. \qquad (2.1.4)$$

One first shows that

$$\lim_{n \to \infty} \varphi(B_n t / B_{n+1}) = \varphi(t). \qquad (2.1.5)$$

If $B_{n+1}/B_n \not\to 1$, we can find a subsequence of either $(B_{n+1}/B_n)$ or $(B_n/B_{n+1})$ converging to some $B < 1$. Going to the limit in (2.1.5) we arrive at the equation $\varphi(t) = \varphi(Bt)$, from which

$$|\varphi(t)| = |\varphi(B^n t)| \to |\varphi(0)| = 1 \qquad (n \to \infty),$$

which is again impossible unless $F$ is degenerate. Thus (2.1.4) is proved.

Now let $0 < a_1 < a_2$ and $b_1$, $b_2$ be constants. Because of (2.1.4) we can choose a sequence $(m(n))$ such that, as $n \to \infty$,

$$\frac{B_{m(n)}}{B_n} \to \frac{a_2}{a_1}.$$

§ 2 Canonical representation of stable laws

Theorem 2.2.1. In order that a distribution $F$ be stable, it is necessary and sufficient that $F$ be infinitely divisible, with Lévy representation (1.7.1) in which either $M(u) = N(u) = 0$ for all $u$ (so that $F$ is normal), or $\sigma = 0$ and

$$M(u) = \frac{c_2}{|u|^{\alpha}} \quad (u < 0), \qquad N(u) = -\frac{c_1}{u^{\alpha}} \quad (u > 0),$$

where $0 < \alpha < 2$ and $c_1, c_2 \ge 0$.

Proof. Writing the stability relation for $F$ in terms of characteristic functions and substituting the representation (1.7.1), one obtains an identity in which the sum of two Lévy representations, with $M$ and $N$ taken at the scaled arguments $a_1 u$ and $a_2 u$ and with the Gaussian and linear terms carrying the corresponding factors (so that terms of the form $i\gamma a_2 t - \tfrac{1}{2}\sigma^2 a_2^2 t^2$ and $\int \big(e^{itu} - 1 - \frac{itu}{1+u^2}\big)\,dN(a_2 u)$ appear), must coincide with a single representation of the same form together with an additional term $ibt$.

Comparison of the corresponding terms leads, after a change of variables, to a functional equation of the form

$$m\{x + \lambda(s)\} = s\,m(x). \qquad (2.2.11)$$

Moreover, it follows from this equation that

$$\lim_{s \to 0} \lambda(s) = \infty, \qquad \lim_{s \to \infty} \lambda(s) = -\infty.$$

Since $m$ is not identically zero, we may assume that $m(0) \ne 0$ (otherwise shift the origin), and write $m_1(x) = m(x)/m(0)$. Let $x_1$, $x_2$ be arbitrary, and choose $s_1$, $s_2$ so that

$$s_1 m(0) = m(x_1), \qquad s_2 m(0) = m(x_2), \qquad s_2 m(x_1) = m(x_1 + x_2),$$

so that

$$m_1(x_1 + x_2) = m_1(x_1)\,m_1(x_2). \qquad (2.2.12)$$

Since $m_1$ is non-negative, non-increasing and not identically zero, (2.2.12) shows that $m_1 > 0$, and then $m_2 = \log m_1$ is monotonic and satisfies

$$m_2(x_1 + x_2) = m_2(x_1) + m_2(x_2),$$

so that $M$ and $N$ have the power form described in the theorem, with exponents $\alpha$ and $\beta$ say. Taking $a_1 = a_2 = 1$ in the stability relation gives

$$a^{\alpha} = a^{\beta} = 2, \qquad (2.2.16)$$

whence $\alpha = \beta$. Moreover, (2.2.5) becomes in this case

$$\sigma^2(a^2 - 2) = 0;$$

since $a = 2^{1/\alpha} > \sqrt{2}$ when $0 < \alpha < 2$, this is incompatible with (2.2.16) unless $\sigma^2 = 0$, so that either $\sigma^2 = 0$ or $M(u) = N(u) = 0$ for all $u$.

The integrals on the right-hand side of (2.2.1) can be evaluated explicitly, enabling the theorem to be reformulated in the following way.

Proof. We examine (2.2.1) in three cases.

(1) $0 < \alpha < 1$. The function

$$\frac{e^{iu} - 1}{u^{1+\alpha}}$$

is analytic in the complex plane cut along the positive half of the real axis. Integrating it round a contour consisting of the line segment $(r, R)$ $(0 < r < R)$, the circular arc (with centre 0) from $R$ to $iR$, the line segment $(iR, ir)$, and the circular arc from $ir$ to $r$, we obtain (on letting $R \to \infty$ and $r \to 0$)

$$\int_0^{\infty} (e^{iu} - 1)\,\frac{du}{u^{1+\alpha}} = e^{-\frac{1}{2}\pi i\alpha} L(\alpha),$$

where

$$L(\alpha) = \int_0^{\infty} (e^{-u} - 1)\,\frac{du}{u^{1+\alpha}} < 0,$$

and similarly

$$\int_{-\infty}^{0} (e^{iu} - 1)\,\frac{du}{|u|^{1+\alpha}} = e^{\frac{1}{2}\pi i\alpha} L(\alpha).$$

Therefore, for $t > 0$,

$$\log f(t) = i\gamma' t + \alpha L(\alpha)\,t^{\alpha}\big\{(c_1 + c_2)\cos(\tfrac{1}{2}\pi\alpha) + i(c_2 - c_1)\sin(\tfrac{1}{2}\pi\alpha)\big\} = i\gamma' t - c\,t^{\alpha}\big(1 - i\beta\tan(\tfrac{1}{2}\pi\alpha)\big), \qquad (2.2.17)$$

where

$$c = -\alpha L(\alpha)(c_1 + c_2)\cos(\tfrac{1}{2}\pi\alpha) \ge 0, \qquad \beta = \frac{c_1 - c_2}{c_1 + c_2}, \qquad |\beta| \le 1.$$

For $t < 0$,

$$\log f(t) = \overline{\log f(-t)} = i\gamma' t - c\,|t|^{\alpha}\big(1 - i\beta\tan(-\tfrac{1}{2}\pi\alpha)\big),$$

so that (2.2.17) holds for all $t$.

(2) $1 < \alpha < 2$. For this case we can throw (2.2.1) into the form (for $t > 0$)

$$\log f(t) = i\gamma' t + c_2\alpha \int_{-\infty}^{0} (e^{itu} - 1 - itu)\,\frac{du}{|u|^{1+\alpha}} + c_1\alpha \int_0^{\infty} (e^{itu} - 1 - itu)\,\frac{du}{u^{1+\alpha}}.$$

Integrating the function $(e^{-u} - 1 + u)/u^{1+\alpha}$ round the same contour as above, we obtain

$$\int_0^{\infty} (e^{iu} - 1 - iu)\,\frac{du}{u^{1+\alpha}} = e^{-\frac{1}{2}\pi i\alpha} M(\alpha), \qquad \int_{-\infty}^{0} (e^{iu} - 1 - iu)\,\frac{du}{|u|^{1+\alpha}} = e^{\frac{1}{2}\pi i\alpha} M(\alpha),$$

where

$$M(\alpha) = \int_0^{\infty} (e^{-u} - 1 + u)\,\frac{du}{u^{1+\alpha}}.$$
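As a check on the canonical form just derived, the sketch below (the parameter values are arbitrary; this is not a computation from the book) evaluates $\log f(t) = i\gamma t - c|t|^{\alpha}(1 - i\beta\,\operatorname{sgn}(t)\tan(\tfrac{1}{2}\pi\alpha))$ for $\alpha = 1.5$, confirms that $|f(t)| = e^{-c|t|^{\alpha}}$, and recovers the corresponding density by numerical Fourier inversion:

```python
import numpy as np

# Hypothetical parameter values chosen only for illustration.
alpha, beta, c, gamma = 1.5, 0.5, 1.0, 0.0

def log_f(t):
    """Logarithm of the canonical stable characteristic function (2.2.17)."""
    t = np.asarray(t, dtype=float)
    return (1j * gamma * t
            - c * np.abs(t) ** alpha * (1.0 - 1j * beta * np.sign(t) * np.tan(np.pi * alpha / 2.0)))

t = np.linspace(-40.0, 40.0, 16_001)
f = np.exp(log_f(t))
print(np.allclose(np.abs(f), np.exp(-c * np.abs(t) ** alpha)))   # |f(t)| = exp(-c |t|^alpha)

# Density by Fourier inversion: p(x) = (1/2pi) * integral of exp(-itx) f(t) dt.
dt = t[1] - t[0]
x = np.linspace(-8.0, 8.0, 161)
dens = np.real(np.exp(-1j * np.outer(x, t)) @ f) * dt / (2.0 * np.pi)
print(round(dens.min(), 6))                    # essentially non-negative, as a density should be
print(round(dens.sum() * (x[1] - x[0]), 4))    # mass on [-8, 8], close to 1 apart from the tails
```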
