Chapter 17
CONDITIONS OF WEAK DEPENDENCE FOR STATIONARY PROCESSES
The past history of the process $X_t$ is described by the $\sigma$-algebras $\mathfrak{M}_{-\infty}^{s}$, the future by the $\sigma$-algebras $\mathfrak{M}_{s}^{\infty}$. It may be that these $\sigma$-algebras are independent, in the sense that, for all $A \in \mathfrak{M}_{-\infty}^{s}$, $B \in \mathfrak{M}_{s}^{\infty}$,
$$P(AB) - P(A)P(B) = 0.$$
In the general case, the magnitude of the left-hand side measures the dependence between past and future, and it may be useful to assume this to be small in some sense. In this chapter we examine some of the possible ways of limiting the dependence.
§ 1. Regularity
Definition 17.1.1. A stationary process $X_t$ is said to be regular if the $\sigma$-algebra
$$\mathfrak{M}_{-\infty} = \bigcap_t \mathfrak{M}_{-\infty}^{t}$$
is trivial, in the sense that it contains only events of probability zero or one.
The famous zero-one law for independent random variables (see, for example, [59], [31]) implies that, for instance, a sequence of independent, identically distributed random variables is regular.
In the Hilbert space terminology of the last chapter, regularity simply means that the subspace
$$H_{-\infty} = \bigcap_t H_{-\infty}^{t}$$
(which consists of the random variables measurable with respect to $\mathfrak{M}_{-\infty}$) contains only the constant functions.
Theorem 17.1.1. In order that a stationary process $X_t$ be regular, it is necessary and sufficient that, for all $B \in \mathfrak{M}$,
$$\lim_{t \to -\infty}\ \sup_{A \in \mathfrak{M}_{-\infty}^{t}} |P(AB) - P(A)P(B)| = 0. \quad (17.1.1)$$
Proof. To prove the necessity of the condition, write $\chi_A$ for the indicator function of an event $A$:
$$\chi_A(\omega) = \begin{cases} 1 & (\omega \in A), \\ 0 & (\omega \notin A), \end{cases}$$
and set $\xi = \chi_A - P(A)$, $\eta = \chi_B - P(B)$, so that $P(AB) - P(A)P(B) = E(\xi\eta)$. Since $\xi$ is measurable with respect to $\mathfrak{M}_{-\infty}^{t}$, equation (1.1.3) gives
$$|E(\xi\eta)| = \bigl|E\{\xi\,E(\eta \mid \mathfrak{M}_{-\infty}^{t})\}\bigr| \le (E\xi^2)^{1/2}\bigl(E\{E(\eta \mid \mathfrak{M}_{-\infty}^{t})^2\}\bigr)^{1/2} \le \bigl(E\{E(\eta \mid \mathfrak{M}_{-\infty}^{t})^2\}\bigr)^{1/2} \to 0$$
as $t \to -\infty$, uniformly in $A$, in virtue of the theorem of Appendix 3.
To prove the sufficiency, suppose to the contrary that (17.1.1) is satisfied, but that $X_t$ is not regular. Then there is an event $A \in \mathfrak{M}_{-\infty}$ with $0 < P(A) < 1$. Taking $B = A$ and noting that $A \in \mathfrak{M}_{-\infty}^{t}$ for every $t$, we have
$$\sup_{A' \in \mathfrak{M}_{-\infty}^{t}} |P(A'B) - P(A')P(B)| \ge P(A) - P(A)^2 > 0$$
for all $t$, which contradicts (17.1.1).
Corollary 17.1.1. A regular process $X_t$ is metrically transitive.
Proof. Let $A$ be an invariant event. For any $\varepsilon > 0$ we can find a finite $t$ and an event $A_\varepsilon \in \mathfrak{M}_{-t}^{t}$ such that
$$P\{(A - A_\varepsilon) \cup (A_\varepsilon - A)\} < \varepsilon.$$
From (17.1.1),
$$\lim_{s \to \infty} |P(T^{-t-s}A_\varepsilon \cap A) - P(A_\varepsilon)P(A)| = 0.$$
But
$$P(T^{-t-s}A_\varepsilon \cap A) = P\{T^{t+s}(T^{-t-s}A_\varepsilon \cap A)\} = P(A_\varepsilon \cap T^{t+s}A) = P(A_\varepsilon \cap A),$$
so that
$$P(A_\varepsilon \cap A) = P(A)P(A_\varepsilon).$$
Letting $\varepsilon \to 0$, we have $P(A) = P(A)^2$, so that $P(A) = 0$ or $1$.

From the proof of Theorem 17.1.1 it is clear that one can state it in the apparently stronger form:
For the regularity of the process $X_t$ it is necessary and sufficient that
$$\lim_{t \to -\infty}\ \sup\, |E(\xi\eta) - E(\xi)E(\eta)| = 0 \quad (17.1.2)$$
for all $\eta \in H_{\infty}$, where the supremum is taken over all $\xi \in H_{-\infty}^{t}$ with $E|\xi|^2 \le 1$.

The condition of regularity can be described geometrically in the following way. Denote by $P_t$ the projection operator onto the subspace $H_{-\infty}^{t}$. Then it is easy to see that $X_t$ is regular if and only if, for all $\eta \in H_{\infty}$,
$$\lim_{t \to -\infty} \|P_t\eta - E(\eta)\| = 0.$$
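To make the projection criterion concrete, here is a minimal numerical sketch (an added illustration, not part of the original text; the Gaussian AR(1) model and its parameters are arbitrary choices). For such a sequence the linear and nonlinear projections onto the past coincide, and the norm of the projection of $X_0$ (which has mean zero here) onto $H_{-\infty}^{t}$ decays like $|\phi|^{|t|}$.

```python
import numpy as np

# Illustrative sketch (assumption: Gaussian AR(1), X_{n+1} = phi*X_n + eps_{n+1}).
# The best linear predictor of X_0 from the past up to time t < 0 is phi**|t| * X_t,
# so ||P_t X_0|| = |phi|**|t| * ||X_0|| -> 0 as t -> -infinity.

phi, m = 0.8, 30                                   # AR coefficient, finite past window
cov = lambda h: phi ** abs(h) / (1 - phi ** 2)     # autocovariance of the stationary AR(1)

for t in [-1, -2, -5, -10, -20]:
    lags = np.arange(t, t - m, -1)                 # past instants t, t-1, ..., t-m+1
    Sigma = np.array([[cov(a - b) for b in lags] for a in lags])
    c = np.array([cov(a) for a in lags])           # Cov(X_0, X_a)
    beta = np.linalg.solve(Sigma, c)               # coefficients of the linear projection
    norm = float(np.sqrt(beta @ Sigma @ beta))     # ||P_t X_0|| from the finite window
    theory = abs(phi) ** abs(t) * np.sqrt(cov(0))
    print(f"t = {t:4d}   ||P_t X_0|| = {norm:.6f}   phi^|t|*||X_0|| = {theory:.6f}")
```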
Theorem 17.1.2. If the stationary process $X_t$ is regular, and if $Y \in H_{\infty}(X)$, then the stationary process $Y_t = U_t Y$ has an absolutely continuous spectral function.
Proof. We call a stationary process $X_t$ (in the wide sense) linearly regular if
$$\bigcap_t L_{-\infty}^{t}(X) = 0.$$
Since $L_{-\infty}^{t}(X) \subseteq H_{-\infty}^{t}(X)$, every regular process with $E|X_t|^2 < \infty$ is linearly regular. Moreover, if $X_t$ is regular, and $Y \in H_{-\infty}^{s}(X)$ ($s < \infty$), then the process $Y_t = U_t Y$ is linearly regular. In particular, the collection of variables $Y \in H_{\infty}(X)$ for which $Y_t = U_t Y$ is linearly regular is dense in $H_{\infty}(X)$. From this we shall prove that all linearly regular processes (and hence a fortiori all regular processes) have spectral densities. For simplicity, we consider only the discrete time case.
Lemma 17.1.1 (Wold decomposition). A sequence $X_j$ stationary in the wide sense is linearly regular if and only if it is representable in the form
$$X_j = \sum_{k=-\infty}^{0} a_k \zeta_{k+j}, \quad (17.1.3)$$
where $\sum |a_k|^2 < \infty$, and the $\zeta_j = U^j\zeta_0 \in L_{-\infty}^{j}(X)$ are orthonormal random variables.
Proof. If the sequence $X_j$ is represented in the form (17.1.3), then $L_{-\infty}^{j}(X) \subseteq L_{-\infty}^{j}(\zeta)$, and since the $\zeta_k$ are orthonormal, $\bigcap_j L_{-\infty}^{j}(\zeta) = 0$, so that $X_j$ is linearly regular. Conversely, let the sequence $X_j$ be linearly regular. We denote by $L_j$ the orthogonal complement of $L_{-\infty}^{j-1}(X)$ in $L_{-\infty}^{j}(X)$, so that
$$L_{-\infty}^{j} = L_{-\infty}^{j-1} \oplus L_j.$$
The dimension of $L_j$ clearly does not exceed 1. If it were zero, then by stationarity $L_{-\infty}^{j} = L_{-\infty}^{j-1} = \cdots = \bigcap_s L_{-\infty}^{s}$, which plainly contradicts the linear regularity. Hence $L_j$ has dimension equal to 1.
We now show that
$$L_{-\infty}^{j}(X) = \bigoplus_{k=-\infty}^{j} L_k. \quad (17.1.5)$$
In fact, for all $s < j$,
$$L_{-\infty}^{j}(X) = L_{-\infty}^{s}(X) \oplus L_{s+1} \oplus \cdots \oplus L_j,$$
and because
$$\bigcap_s L_{-\infty}^{s}(X) = 0,$$
the projection of any $Y \in L_{-\infty}^{j}(X)$ onto $L_{-\infty}^{s}(X)$ tends to zero as $s \to -\infty$. Because of (17.1.5), $X_j$ has a representation of the form
$$X_j = \sum_{k=-\infty}^{j} a_{kj}\,\xi_{kj},$$
where $\sum_k |a_{kj}|^2 < \infty$, $\xi_{kj} \in L_k$, and the $\xi_{kj}$ are orthonormal. Because $L_k$ is one-dimensional, $\xi_{kj}$ does not depend on $j$ (except for a factor of unit modulus, which may be absorbed into $a_{kj}$); write $\xi_{kj} = \xi_k$. Since $X_j = U^j X_0$,
$$X_j = \sum_{k=-\infty}^{0} a_{k0}\,\xi_{k+j},$$
which is of the form (17.1.3) with $\zeta_k = \xi_k$, $a_k = a_{k0}$.
Combining this lemma with the results of § 16.7, we see that the linearly regular process $X_j$ has a spectral density
$$f(\lambda) = \frac{1}{2\pi}\left|\sum_{k=-\infty}^{0} a_k e^{i\lambda k}\right|^2. \quad (17.1.6)$$
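As a numerical illustration of (17.1.6) (added here; not part of the original text, and the coefficients $a_k$ below are arbitrary), the following sketch forms $f(\lambda)$ from a finite set of coefficients and checks that its Fourier coefficients reproduce the autocovariances $\sum_k a_k a_{k+h}$ of the one-sided moving average (17.1.3).

```python
import numpy as np

# Illustration (assumed coefficients, not from the text).  For X_j = sum_{k<=0} a_k*zeta_{k+j}
# with orthonormal zeta, (17.1.6) gives f(lambda) = |sum_{k<=0} a_k exp(i*lambda*k)|^2 / (2*pi),
# and the autocovariance is R(h) = int exp(i*h*lambda) f(lambda) dlambda = sum_k a_k * a_{k+h}.

a = {0: 1.0, -1: 0.5, -2: 0.25, -3: 0.125}               # a_k for k = 0, -1, -2, -3

def f(lam):
    s = sum(ak * np.exp(1j * lam * k) for k, ak in a.items())
    return np.abs(s) ** 2 / (2 * np.pi)

def R_direct(h):
    return sum(ak * a.get(k + h, 0.0) for k, ak in a.items())

lam = np.linspace(-np.pi, np.pi, 20001)
for h in range(4):
    R_spec = np.trapz(np.exp(1j * h * lam) * f(lam), lam).real
    print(f"h = {h}:  from f(lambda): {R_spec:.6f}   direct: {R_direct(h):.6f}")
```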
Remark. The results of § 16.7 show that, conversely, a stationary sequence with a spectral density of the form (17.1.6) admits the expansion (17.1.3), and is thus linearly regular. But (17.1.6) means that $f(\lambda) = |\phi(e^{i\lambda})|^2$, where $\phi(e^{i\lambda})$ is the value at $z = e^{i\lambda}$ of a function $\phi(z)$ analytic inside the unit disc $|z| < 1$ and satisfying
$$\int_{-\pi}^{\pi} |\phi(e^{i\lambda})|^2\, d\lambda < \infty.$$
The theory of boundary values of such functions ([136], Chapter II) shows that such a representation for $f(\lambda)$ is possible if and only if
$$\int_{-\pi}^{\pi} \log f(\lambda)\, d\lambda > -\infty. \quad (17.1.7)$$
Thus (17.1.7) is a necessary and sufficient condition for linear regularity.

Returning to the proof of the theorem, let $Y \in H_{\infty}(X)$. We show that the spectral function of $Y_t = U_t Y$ is absolutely continuous. In fact, because of the regularity, for any positive $\varepsilon$ there exists $N < \infty$ such that, if
$$Y = Y^{(N)} + Z^{(N)}, \qquad Y^{(N)} \in H_{-\infty}^{N}(X), \quad Z^{(N)} \perp H_{-\infty}^{N}(X),$$
then
$$E|Z^{(N)}|^2 < \varepsilon.$$
The spectral function $F_Y(\lambda)$ of $Y_t$ is the sum of the spectral function $F_{Y^{(N)}}(\lambda)$ of $Y^{(N)}_t$ and $F_{Z^{(N)}}(\lambda)$ of $Z^{(N)}_t$. The process $Y^{(N)}_t$ is linearly regular, so that $F_{Y^{(N)}}(\lambda)$ is absolutely continuous. Thus the total variation of the singular component of $F_Y(\lambda)$ does not exceed that of $F_{Z^{(N)}}(\lambda)$, which, since
$$\int dF_{Z^{(N)}}(\lambda) = E|Z^{(N)}|^2 < \varepsilon,$$
is arbitrarily small. Thus $F_Y(\lambda)$ is absolutely continuous, and the theorem is proved.
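The criterion (17.1.7) can also be explored numerically. The sketch below is an added illustration (the two densities are arbitrary choices, not from the text): for an MA(1)-type density the logarithmic integral is finite, while for $f_2(\lambda) = \exp(-1/\lambda^2)$, which vanishes extremely fast at $\lambda = 0$, the integral diverges to $-\infty$, so a sequence with that spectral density is not linearly regular.

```python
import numpy as np

# Illustration (assumed densities, not from the text): numerical look at (17.1.7).
# f1 is a (non-normalized) MA(1)-type spectral density, bounded away from zero;
# f2(lambda) = exp(-1/lambda**2) vanishes so fast at 0 that log f2 is not integrable.

f1 = lambda lam: np.abs(1 - 0.5 * np.exp(1j * lam)) ** 2 / (2 * np.pi)
f2 = lambda lam: np.exp(-1.0 / lam ** 2)

lam = np.linspace(-np.pi, np.pi, 200001)
print("int log f1 d(lambda) =", np.trapz(np.log(f1(lam)), lam))     # finite

# For f2, integrate log f2 over [-pi, pi] with a shrinking neighbourhood of 0 removed:
for delta in [1e-1, 1e-2, 1e-3]:
    grid = np.linspace(delta, np.pi, 100001)
    val = 2 * np.trapz(np.log(f2(grid)), grid)                       # symmetric in lambda
    print(f"delta = {delta:g}:  int log f2 = {val:.1f}")             # diverges to -infinity
```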
§ 2. The strong mixing condition
If we strengthen (17.1.1) by requiring it to hold uniformly in $B$ as well as in $A$, we arrive at the following definition.

Definition 17.2.1. A stationary process $X_t$ is said to be strongly mixing (or completely regular) if
$$\sup_{A \in \mathfrak{M}_{-\infty}^{t},\ B \in \mathfrak{M}_{t+\tau}^{\infty}} |P(AB) - P(A)P(B)| = \alpha(\tau) \to 0 \quad (17.2.1)$$
as $\tau \to \infty$ through positive values.
The non-increasing function $\alpha(\tau)$ will be called the mixing coefficient. It is of course clear that a strongly mixing process is necessarily regular. A sequence of independent random variables is strongly mixing; other examples will appear in §§ 17.3, 19.1, 19.2, 19.4.
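For a concrete feel for the definition, the following sketch (an added illustration; the two-state Markov chain is an arbitrary choice, not an example from the text) evaluates $\sup|P(AB) - P(A)P(B)|$ with $A$ restricted to events determined by $X_0$ and $B$ to events determined by $X_\tau$. This restricted supremum is only a lower bound for $\alpha(\tau)$, which involves the full past and future $\sigma$-algebras, but it already shows the geometric decay.

```python
import numpy as np
from itertools import chain, combinations

# Illustration (assumed chain, not from the text): stationary two-state Markov chain
# with transition matrix P and stationary distribution pi.  We enumerate all events
# A = {X_0 in S}, B = {X_tau in T} and take sup |P(AB) - P(A)P(B)|; the true alpha(tau)
# is a supremum over the full sigma-algebras and is at least this large.

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([2 / 3, 1 / 3])                       # satisfies pi @ P == pi

def subsets(states):
    return chain.from_iterable(combinations(states, r) for r in range(len(states) + 1))

def restricted_alpha(tau):
    Pt = np.linalg.matrix_power(P, tau)
    best = 0.0
    for A in subsets([0, 1]):
        for B in subsets([0, 1]):
            pA = pi[list(A)].sum()
            pB = pi[list(B)].sum()
            pAB = sum(pi[a] * Pt[a, b] for a in A for b in B)
            best = max(best, abs(pAB - pA * pB))
    return best

for tau in [1, 2, 5, 10]:
    print(f"tau = {tau:2d}   restricted alpha(tau) = {restricted_alpha(tau):.6f}")
```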
Theorem 17.2.1. Let the stationary process $X_t$ satisfy the strong mixing condition. If $\xi$ is measurable with respect to $\mathfrak{M}_{-\infty}^{t}$ and $\eta$ with respect to $\mathfrak{M}_{t+\tau}^{\infty}$ ($\tau > 0$), and if $|\xi| \le C_1$, $|\eta| \le C_2$, then
$$|E(\xi\eta) - E(\xi)E(\eta)| \le 4C_1C_2\,\alpha(\tau). \quad (17.2.2)$$
Proof. We may clearly assume that $t = 0$. Using the properties of conditional expectations stated in § 1.1, we have
$$|E(\xi\eta) - E(\xi)E(\eta)| = \bigl|E\{\xi[E(\eta \mid \mathfrak{M}_{-\infty}^{0}) - E(\eta)]\}\bigr| \le C_1 E\bigl|E(\eta \mid \mathfrak{M}_{-\infty}^{0}) - E(\eta)\bigr| = C_1 E\{\xi_1[E(\eta \mid \mathfrak{M}_{-\infty}^{0}) - E(\eta)]\},$$
where
$$\xi_1 = \operatorname{sgn}\{E(\eta \mid \mathfrak{M}_{-\infty}^{0}) - E(\eta)\}.$$
Clearly $\xi_1$ is measurable with respect to $\mathfrak{M}_{-\infty}^{0}$, and therefore
$$|E(\xi\eta) - E(\xi)E(\eta)| \le C_1\,|E(\xi_1\eta) - E(\xi_1)E(\eta)|.$$
Similarly, we may compare $\eta$ with
$$\eta_1 = \operatorname{sgn}\{E(\xi_1 \mid \mathfrak{M}_{\tau}^{\infty}) - E(\xi_1)\}$$
to give
$$|E(\xi\eta) - E(\xi)E(\eta)| \le C_1C_2\,|E(\xi_1\eta_1) - E(\xi_1)E(\eta_1)|.$$
Introducing the events $A = \{\xi_1 = 1\} \in \mathfrak{M}_{-\infty}^{0}$, $B = \{\eta_1 = 1\} \in \mathfrak{M}_{\tau}^{\infty}$, the strong mixing condition (17.2.1) gives
$$|E(\xi_1\eta_1) - E(\xi_1)E(\eta_1)| = |P(AB) + P(\bar A\bar B) - P(A\bar B) - P(\bar AB) - P(A)P(B) - P(\bar A)P(\bar B) + P(A)P(\bar B) + P(\bar A)P(B)| \le 4\alpha(\tau),$$
whence (17.2.2) follows.
If the variables $\xi$, $\eta$ are complex, then separating the real and imaginary parts, we again arrive at (17.2.2), with 4 replaced by 16.
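The bound (17.2.2) can be checked exactly in a small example. The sketch below is an added illustration (the chain and the functions $g$, $h$ are arbitrary choices, not from the text): for a two-state Markov chain and variables $\xi = g(X_0)$, $\eta = h(X_\tau)$ depending only on these single coordinates, the conditional expectations appearing in the proof are functions of $X_0$ and $X_\tau$ alone, so the argument goes through with the mixing coefficient restricted to events of $X_0$ and $X_\tau$, which for two states has a simple closed form.

```python
import numpy as np

# Illustration (assumed chain and functions, not from the text): exact check of (17.2.2)
# for xi = g(X_0), eta = h(X_tau) on a stationary two-state Markov chain.  For two states,
# the supremum of |P(AB)-P(A)P(B)| over events of X_0 and X_tau is attained on one-point
# events, giving the restricted coefficient max_a pi[a]*|P^tau[a,0] - pi[0]|.

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([2 / 3, 1 / 3])          # stationary distribution
g = np.array([1.0, -1.0])              # xi  = g(X_0),   |xi|  <= C1 = 1
h = np.array([0.5, -1.0])              # eta = h(X_tau), |eta| <= C2 = 1

for tau in [1, 2, 5, 10]:
    Pt = np.linalg.matrix_power(P, tau)
    alpha = max(pi[a] * abs(Pt[a, 0] - pi[0]) for a in (0, 1))
    E_xy = sum(pi[a] * Pt[a, b] * g[a] * h[b] for a in (0, 1) for b in (0, 1))
    cov = abs(E_xy - (pi @ g) * (pi @ h))
    print(f"tau = {tau:2d}   |E(xi eta) - E(xi)E(eta)| = {cov:.6f}   4*C1*C2*alpha = {4 * alpha:.6f}")
```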
Theorem 17.2.2. Let the random variables $\xi$, $\eta$ be measurable with respect to $\mathfrak{M}_{-\infty}^{0}$ and $\mathfrak{M}_{\tau}^{\infty}$ respectively, and suppose that, for some $\delta > 0$,
$$E|\xi|^{2+\delta} \le c_1 < \infty, \qquad E|\eta|^{2+\delta} \le c_2 < \infty. \quad (17.2.3)$$
Then
$$|E(\xi\eta) - E(\xi)E(\eta)| \le \bigl[4 + 3\bigl(c_1^{\beta}c_2^{1-\beta} + c_1^{1-\beta}c_2^{\beta}\bigr)\bigr]\,\alpha(\tau)^{1-2\beta}, \quad (17.2.4)$$
where
$$\beta = (2+\delta)^{-1}.$$
Proof. As before, we take $t = 0$. Introduce the random variables $\xi_N$, $\bar\xi_N$ defined by
$$\xi_N = \begin{cases} \xi & (|\xi| \le N), \\ 0 & (|\xi| > N), \end{cases} \qquad \bar\xi_N = \xi - \xi_N,$$
and $\eta_N$, $\bar\eta_N$ similarly defined. Then
$$|E(\xi\eta) - E(\xi)E(\eta)| \le |E(\xi_N\eta_N) - E(\xi_N)E(\eta_N)| + |E(\bar\xi_N\eta_N)| + |E(\bar\xi_N)E(\eta_N)| + |E(\xi_N\bar\eta_N)| + |E(\xi_N)E(\bar\eta_N)| + |E(\bar\xi_N\bar\eta_N)| + |E(\bar\xi_N)E(\bar\eta_N)|, \quad (17.2.5)$$
and by Theorem 17.2.1,
$$|E(\xi_N\eta_N) - E(\xi_N)E(\eta_N)| \le 4N^2\alpha(\tau). \quad (17.2.6)$$
Because of (17.2.3),
$$E|\bar\xi_N|^{(2+\delta)/(1+\delta)} \le N^{-\delta(2+\delta)/(1+\delta)}\,E|\xi|^{2+\delta} \le N^{-\delta(2+\delta)/(1+\delta)}c_1, \qquad E\bar\xi_N^2 \le N^{-\delta}c_1,$$
and similarly for $\bar\eta_N$,
so that, by Hölder's inequality (with the exponents $(2+\delta)/(1+\delta)$ and $2+\delta$) and the Cauchy–Schwarz inequality,
$$|E(\bar\xi_N\eta_N)| \le \{E|\bar\xi_N|^{(2+\delta)/(1+\delta)}\}^{1-\beta}\{E|\eta_N|^{2+\delta}\}^{\beta} \le N^{-\delta}c_1^{1-\beta}c_2^{\beta},$$
$$|E(\xi_N\bar\eta_N)| \le N^{-\delta}c_1^{\beta}c_2^{1-\beta}, \qquad |E(\bar\xi_N\bar\eta_N)| \le N^{-\delta}c_1^{1/2}c_2^{1/2}, \quad (17.2.7)$$
the corresponding products of expectations in (17.2.5) obeying the same estimates.
Combining (17.2.5)–(17.2.7), and noting that $2(c_1c_2)^{1/2} \le c_1^{\beta}c_2^{1-\beta} + c_1^{1-\beta}c_2^{\beta}$, we have
$$|E(\xi\eta) - E(\xi)E(\eta)| \le 4N^2\alpha(\tau) + 3N^{-\delta}\bigl(c_1^{\beta}c_2^{1-\beta} + c_1^{1-\beta}c_2^{\beta}\bigr),$$
whence (17.2.4) follows on setting $N = \alpha(\tau)^{-\beta}$.

The left-hand side of (17.2.1) can be small either because $P(B \mid A)$ is near $P(B)$, or because $P(A)$ is small. This suggests that we should consider a stronger mixing condition, which requires the difference to be small compared with $P(A)$.
Definition 17.2.2. The stationary process $X_t$ is said to satisfy the uniform mixing condition if
$$\sup_{A \in \mathfrak{M}_{-\infty}^{t},\ B \in \mathfrak{M}_{t+\tau}^{\infty}} \frac{|P(AB) - P(A)P(B)|}{P(A)} = \phi(\tau) \to 0, \quad (17.2.8)$$
the supremum being taken over events $A$ with $P(A) > 0$.
It is clear that $\phi(\tau)$ is non-increasing, and that a uniformly mixing process is strongly mixing (the converse is false; see § 3).
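Continuing the two-state Markov chain illustration (added here, not from the text, and again restricted to events determined by $X_0$ and $X_\tau$ only, which gives lower bounds for the true coefficients), one can compare the restricted versions of $\alpha(\tau)$ and $\phi(\tau)$: both decay geometrically, and $\phi$ dominates $\alpha$, in line with the remark that uniform mixing implies strong mixing.

```python
import numpy as np

# Illustration (assumed chain, not from the text): restricted alpha(tau) and phi(tau)
# for a two-state stationary Markov chain, using only events {X_0 = a}, {X_tau = b}.

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([2 / 3, 1 / 3])

for tau in [1, 2, 5, 10]:
    Pt = np.linalg.matrix_power(P, tau)
    alpha = max(pi[a] * abs(Pt[a, 0] - pi[0]) for a in (0, 1))   # sup |P(AB) - P(A)P(B)|
    phi = max(abs(Pt[a, 0] - pi[0]) for a in (0, 1))             # sup |P(AB) - P(A)P(B)| / P(A)
    print(f"tau = {tau:2d}   alpha = {alpha:.6f}   phi = {phi:.6f}")
```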
The essential supremum of a random variable $\xi(\omega)$ is the unique number $C \le \infty$ with the property that $P(\xi > C) = 0$ but $P(\xi > C') > 0$ for all $C' < C$.
We remark that, if
$$\phi_1(\tau) = \sup_{B \in \mathfrak{M}_{t+\tau}^{\infty}} \operatorname{ess\,sup}\, \bigl|P(B \mid \mathfrak{M}_{-\infty}^{t}) - P(B)\bigr|, \quad (17.2.9)$$
then
$$\phi(\tau) = \phi_1(\tau). \quad (17.2.10)$$
In fact,
$$|P(AB) - P(A)P(B)| = \left|\int_A \bigl[P(B \mid \mathfrak{M}_{-\infty}^{t}) - P(B)\bigr]\, dP(\omega)\right| \le \phi_1(\tau)P(A),$$
showing that $\phi(\tau) \le \phi_1(\tau)$. Conversely, for any $\varepsilon > 0$, choose $A_\varepsilon \in \mathfrak{M}_{-\infty}^{t}$ and $B_\varepsilon \in \mathfrak{M}_{t+\tau}^{\infty}$ so that $P(A_\varepsilon) > 0$ and, for all $\omega \in A_\varepsilon$,
$$\bigl|P(B_\varepsilon \mid \mathfrak{M}_{-\infty}^{t}) - P(B_\varepsilon)\bigr| \ge \phi_1(\tau) - \varepsilon.$$
Without loss of generality we may suppose that, for all $\omega \in A_\varepsilon$,
$$P(B_\varepsilon \mid \mathfrak{M}_{-\infty}^{t}) - P(B_\varepsilon) \ge \phi_1(\tau) - \varepsilon.$$
Integrating this inequality over $A_\varepsilon$, we obtain
$$P(A_\varepsilon B_\varepsilon) - P(A_\varepsilon)P(B_\varepsilon) \ge [\phi_1(\tau) - \varepsilon]\,P(A_\varepsilon),$$
showing that $\phi(\tau) \ge \phi_1(\tau) - \varepsilon$. Since $\varepsilon > 0$ is arbitrary, this proves (17.2.10).

Theorem 17.2.3. Let the stationary process $X_t$ satisfy the uniform mixing condition, and let $\xi$, $\eta$ be measurable with respect to $\mathfrak{M}_{-\infty}^{t}$ and $\mathfrak{M}_{t+\tau}^{\infty}$ respectively. If $E|\xi|^p < \infty$, $E|\eta|^q < \infty$, where $p, q > 1$, $p^{-1} + q^{-1} = 1$, then
$$|E(\xi\eta) - E(\xi)E(\eta)| \le 2\{\phi(\tau)\,E|\xi|^p\}^{1/p}\{E|\eta|^q\}^{1/q}. \quad (17.2.11)$$
Proof. Suppose first that $\xi$ and $\eta$ are represented by finite sums
$$\xi = \sum_j \lambda_j\,\chi(A_j), \qquad \eta = \sum_i \mu_i\,\chi(B_i), \quad (17.2.12)$$
where the $A_j$ are disjoint events in $\mathfrak{M}_{-\infty}^{0}$, and the $B_i$ are disjoint events in $\mathfrak{M}_{\tau}^{\infty}$. Then, using Hölder's inequality,
$$|E(\xi\eta) - E(\xi)E(\eta)| = \Bigl|\sum_{i,j}\lambda_j\mu_i P(A_jB_i) - \sum_j\lambda_j\sum_i\mu_i P(A_j)P(B_i)\Bigr| = \Bigl|\sum_j\lambda_j P(A_j)^{1/p}\,P(A_j)^{1/q}\sum_i\mu_i\bigl[P(B_i \mid A_j) - P(B_i)\bigr]\Bigr|$$
$$\le \Bigl\{\sum_j|\lambda_j|^p P(A_j)\Bigr\}^{1/p}\Bigl\{\sum_j P(A_j)\Bigl|\sum_i\mu_i\bigl[P(B_i \mid A_j) - P(B_i)\bigr]\Bigr|^{q}\Bigr\}^{1/q}$$
$$\le \{E|\xi|^p\}^{1/p}\Bigl\{\sum_j P(A_j)\sum_i|\mu_i|^q\bigl(P(B_i \mid A_j) + P(B_i)\bigr)\Bigl[\sum_i\bigl|P(B_i \mid A_j) - P(B_i)\bigr|\Bigr]^{q/p}\Bigr\}^{1/q}$$
$$\le 2^{1/q}\{E|\xi|^p\}^{1/p}\{E|\eta|^q\}^{1/q}\,\max_j\Bigl\{\sum_i\bigl|P(B_i \mid A_j) - P(B_i)\bigr|\Bigr\}^{1/p}. \quad (17.2.13)$$
Denoting the summation over positive terms by $\sum^{+}$, and over negative terms by $\sum^{-}$, we have
$$\sum_i\bigl|P(B_i \mid A_j) - P(B_i)\bigr| = \sum_i{}^{+}\{P(B_i \mid A_j) - P(B_i)\} - \sum_i{}^{-}\{P(B_i \mid A_j) - P(B_i)\}$$
$$= \Bigl\{P\Bigl(\bigcup{}^{+}B_i \Bigm| A_j\Bigr) - P\Bigl(\bigcup{}^{+}B_i\Bigr)\Bigr\} - \Bigl\{P\Bigl(\bigcup{}^{-}B_i \Bigm| A_j\Bigr) - P\Bigl(\bigcup{}^{-}B_i\Bigr)\Bigr\} \le 2\phi(\tau). \quad (17.2.14)$$
Substituting (17.2.14) into (17.2.13) proves the theorem for variables of the form (17.2.12).
For the general case, it suffices to remark that
$$E|\xi - \xi_N|^p \to 0, \qquad E|\eta - \eta_N|^q \to 0$$
as $N \to \infty$, where $\xi_N$, $\eta_N$ are random variables of the form (17.2.12), $\xi_N$ being defined by
$$\xi_N = \begin{cases} k/N & (k/N \le \xi < (k+1)/N,\ -N^2 \le k < N^2), \\ 0 & (|\xi| > N), \end{cases}$$
and $\eta_N$ similarly.
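The discretization used in this last step is easy to verify numerically. The sketch below is an added illustration (the distribution of $\xi$ and the exponent $p$ are arbitrary choices, not from the text): it rounds $\xi$ down to the grid $k/N$, cuts off outside $[-N, N]$, and estimates $E|\xi - \xi_N|^p$ by Monte Carlo, which indeed tends to zero.

```python
import numpy as np

# Illustration (assumed distribution, not from the text): the simple-function
# approximation xi_N of the proof, and a Monte Carlo estimate of E|xi - xi_N|^p.

rng = np.random.default_rng(0)
xi = rng.standard_t(df=5, size=1_000_000)      # E|xi|^p < infinity for p < 5
p = 2.0

def discretize(x, N):
    xN = np.floor(x * N) / N                   # the grid point k/N with k/N <= x < (k+1)/N
    xN[np.abs(x) > N] = 0.0                    # cut off outside [-N, N]
    return xN

for N in [1, 2, 5, 10, 50]:
    err = np.mean(np.abs(xi - discretize(xi, N)) ** p)
    print(f"N = {N:3d}   E|xi - xi_N|^p = {err:.6f}")
```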
All the concepts and theorems of §§ 1, 2 apply equally to processes defined only for $t > 0$, so long as $\mathfrak{M}_{-\infty}^{t}$ is replaced throughout by $\mathfrak{M}_{0}^{t}$.
§ 3. Conditions of weak dependence for Gaussian sequences
The random vector $X = (X_1, X_2, \ldots, X_n)$ is said to be Gaussian if its characteristic function is of the form
$$\chi(\theta_1, \theta_2, \ldots, \theta_n) = \exp\Bigl\{i\sum_j a_j\theta_j - \tfrac{1}{2}\sum_{k,j} R_{kj}\theta_k\theta_j\Bigr\}, \quad (17.3.1)$$
where the $a_j$ are arbitrary real numbers, and the matrix $R = (R_{kj})$ is positive semi-definite.
It is easy to see that
$$a_j = E(X_j), \qquad R_{kj} = E\{(X_k - a_k)(X_j - a_j)\}.$$
If the matrix $R$ is non-singular, then $X$ has the probability density
$$p(x_1, x_2, \ldots, x_n) = (2\pi)^{-n/2}|R|^{-1/2}\exp\Bigl\{-\tfrac{1}{2}\sum_{k,j} r_{kj}(x_k - a_k)(x_j - a_j)\Bigr\}$$
(where $|R|$ is the determinant of $R$), the matrix $(r_{kj})$ being the inverse of $R$. Conversely, any positive semi-definite matrix $R$ defines the distribution of a random vector with characteristic function (17.3.1). The following properties are immediate consequences of the definition.
(1) The variables $X_1, X_2, \ldots, X_n$ are independent if and only if $R_{kj} = 0$ for $k \ne j$, i.e. if and only if they are uncorrelated.
(2) If $X_j = (X_{1j}, X_{2j}, \ldots, X_{nj})$, and if the vector $(X_1, X_2, \ldots, X_m)$ is Gaussian, then $\sum_{j=1}^{m} b_jX_j$ is Gaussian for all real $b_j$.
(3) If a sequence of Gaussian vectors $X_j$ converges to a random vector $X$, then $X$ is Gaussian.
The random process $X_t$ ($t \in T$) is said to be Gaussian if, for any $t_1, \ldots, t_n \in T$, the vector $(X_{t_1}, X_{t_2}, \ldots, X_{t_n})$ is Gaussian.
It follows from what has been said that the finite-dimensional distributions of a Gaussian process are determined by the two functions
$$a_t = E(X_t), \qquad R_{ts} = E\{(X_t - a_t)(X_s - a_s)\}.$$
Conversely, Kolmogorov's theorem implies that there exists a Gaussian process determined by these two functions, provided only that the function $R_{ts}$ is such that, for any $t_1, \ldots, t_n$, the matrix $(R_{t_jt_k})$ is symmetric and positive semi-definite.
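To illustrate this existence statement in the simplest finite-dimensional setting (an added sketch; the covariance function $R_{ts} = \phi^{|t-s|}$ and zero means are arbitrary choices, not from the text), one can realize the Gaussian vector $(X_{t_1}, \ldots, X_{t_n})$ directly from a symmetric positive semi-definite matrix by a Cholesky factorization $R = LL^{T}$ and check the sampled covariances.

```python
import numpy as np

# Illustration (assumed covariance function, not from the text): sampling the Gaussian
# vector (X_{t_1}, ..., X_{t_n}) with a_t = 0 and R_{ts} = phi**|t-s| via R = L L^T.

phi, n = 0.6, 6
t = np.arange(n)
R = phi ** np.abs(t[:, None] - t[None, :])      # the matrix (R_{t_j t_k}), positive definite
L = np.linalg.cholesky(R)

rng = np.random.default_rng(1)
X = L @ rng.standard_normal((n, 100_000))       # each column is a sample of the vector

print("target  R[0, 1:] :", np.round(R[0, 1:], 3))
print("sampled cov[0,1:] :", np.round(np.cov(X)[0, 1:], 3))
```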
In particular, if $(X_j)$ is a stationary Gaussian sequence, then any condition of weak dependence of the $\sigma$-algebras $\mathfrak{M}_{-\infty}^{k}$, $\mathfrak{M}_{k+n}^{\infty}$ can, in principle, be expressed in terms of the autocovariance function of the sequence, or of its spectral function. Such an expression may be far from simple, and raises difficult and interesting analytical problems. These lie away from the theme of this book, and we shall discuss them only in order to construct examples of processes satisfying the conditions of §§ 1, 2.
Theorem 17.3.1. A Gaussian sequence $X_j$ is regular if and only if it is linearly regular.