

Volume 2010, Article ID 130915, 10 pages

doi:10.1155/2010/130915

Research Article

Almost Sure Central Limit Theorem for a Nonstationary Gaussian Sequence

Qing-pei Zang

School of Mathematical Science, Huaiyin Normal University, Huaian 223300, China

Correspondence should be addressed to Qing-pei Zang, zqphunhu@yahoo.com.cn

Received 4 May 2010; Revised 7 July 2010; Accepted 12 August 2010

Academic Editor: Soo Hak Sung

Copyright © 2010 Qing-pei Zang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Let $\{X_n; n \ge 1\}$ be a standardized non-stationary Gaussian sequence, and denote $S_n = \sum_{k=1}^{n} X_k$, $\sigma_n^2 = \operatorname{Var}(S_n)$. Under some additional condition, let the constants $\{u_{ni}; 1 \le i \le n, n \ge 1\}$ satisfy $\sum_{i=1}^{n} (1 - \Phi(u_{ni})) \to \tau$ as $n \to \infty$ for some $\tau \ge 0$ and $\min_{1 \le i \le n} u_{ni} \ge c (\log n)^{1/2}$ for some $c > 0$. Then we have
\[
\lim_{n \to \infty} \frac{1}{\log n} \sum_{k=1}^{n} \frac{1}{k}\, I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le x \Bigr\} = e^{-\tau}\, \Phi(x) \quad \text{almost surely}
\]
for any $x \in \mathbb{R}$, where $I(A)$ is the indicator function of the event $A$ and $\Phi(x)$ stands for the standard normal distribution function.

1. Introduction

When $\{X, X_n; n \ge 1\}$ is a sequence of independent and identically distributed (i.i.d.) random variables, write $S_n = \sum_{k=1}^{n} X_k$ and $M_n = \max_{1 \le k \le n} X_k$ for $n \ge 1$. If $EX = 0$ and $\operatorname{Var}(X) = 1$, the so-called almost sure central limit theorem (ASCLT) has its simplest form as follows:
\[
\lim_{n \to \infty} \frac{1}{\log n} \sum_{k=1}^{n} \frac{1}{k}\, I\Bigl\{ \frac{S_k}{\sqrt{k}} \le x \Bigr\} = \Phi(x), \tag{1.1}
\]
almost surely for all $x \in \mathbb{R}$, where $I(A)$ is the indicator function of the event $A$ and $\Phi(x)$ stands for the standard normal distribution function. This result was first proved independently by Brosamler [1] and Schatte [2] under a stronger moment condition; since then, this type of almost sure version has been extended in different directions. For example, Fahrner and Stadtmüller [3] and Cheng et al. [4] extended this almost sure convergence for partial sums to the case of maxima of i.i.d. random variables. Under some natural conditions, they proved the following:

\[
\lim_{n \to \infty} \frac{1}{\log n} \sum_{k=1}^{n} \frac{1}{k}\, I\Bigl\{ \frac{M_k - b_k}{a_k} \le x \Bigr\} = G(x) \quad \text{a.s.} \tag{1.2}
\]


for all $x \in \mathbb{R}$, where $a_k > 0$ and $b_k \in \mathbb{R}$ satisfy
\[
P\Bigl( \frac{M_k - b_k}{a_k} \le x \Bigr) \longrightarrow G(x), \quad \text{as } k \to \infty, \tag{1.3}
\]
for any continuity point $x$ of $G$.
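To make the logarithmic averaging in (1.1) concrete, the following short simulation (an illustrative sketch, not part of the original article; the sample size and the evaluation point $x$ are arbitrary choices) computes the left-hand side of (1.1) for i.i.d. standard normal variables and compares it with $\Phi(x)$. Because the convergence holds only at a logarithmic rate, the output fluctuates slowly around the limit.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n, x = 200_000, 0.5                       # illustrative sample size and evaluation point

    X = rng.standard_normal(n)                # i.i.d. N(0, 1), so EX = 0 and Var(X) = 1
    k = np.arange(1, n + 1)
    S = np.cumsum(X)                          # partial sums S_k

    # Left-hand side of (1.1): (1/log n) * sum_{k=1}^{n} (1/k) * I{ S_k / sqrt(k) <= x }
    indicator = (S / np.sqrt(k) <= x).astype(float)
    log_average = np.sum(indicator / k) / np.log(n)

    print("log average:", log_average)
    print("Phi(x)     :", norm.cdf(x))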

In a related work, Csáki and Gonchigdanzan [5] investigated the validity of (1.2) for maxima of stationary Gaussian sequences under some mild condition, whereas Chen and Lin [6] extended it to non-stationary Gaussian sequences. Recently, Dudziński [7] obtained a two-dimensional version for a standardized stationary Gaussian sequence. In this paper, inspired by the above results, we further study the ASCLT in the joint version for a non-stationary Gaussian sequence.

2. Main Result

Throughout this paper, let $\{X_n; n \ge 1\}$ be a non-stationary standardized normal sequence, and let $\sigma_n^2 = \operatorname{Var}(S_n)$. Here $a \ll b$ and $a \sim b$ stand for $a = O(b)$ and $a/b \to 1$, respectively. $\Phi(x)$ is the standard normal distribution function and $\phi(x)$ is its density function; $C$ denotes a positive constant whose value may change from one appearance to the next. Now, we state our main result as follows.

Theorem 2.1. Let $\{X_n; n \ge 1\}$ be a sequence of non-stationary standardized Gaussian variables with covariance matrix $(r_{ij})$ such that $0 \le r_{ij} \le \rho_{|i-j|}$ for $i \ne j$, where $\rho_n \le 1$ for all $n \ge 1$ and
\[
\sup_{s \ge n} \sum_{i=s-n}^{s-1} \rho_i \ll \frac{(\log n)^{1/2}}{(\log\log n)^{1+\varepsilon}}, \qquad \varepsilon > 0.
\]
If the constants $\{u_{ni}; 1 \le i \le n, n \ge 1\}$ satisfy $\sum_{i=1}^{n} (1 - \Phi(u_{ni})) \to \tau$ as $n \to \infty$ for some $\tau \ge 0$ and $\min_{1 \le i \le n} u_{ni} \ge c (\log n)^{1/2}$ for some $c > 0$, then
\[
\lim_{n \to \infty} \frac{1}{\log n} \sum_{k=1}^{n} \frac{1}{k}\, I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le x \Bigr\} = e^{-\tau}\, \Phi(x), \tag{2.1}
\]
almost surely for any $x \in \mathbb{R}$.

Remark 2.2. The condition $\sup_{s \ge n} \sum_{i=s-n}^{s-1} \rho_i \ll (\log n)^{1/2} / (\log\log n)^{1+\varepsilon}$, $\varepsilon > 0$, is inspired by condition (a1) in Dudziński [8], and it is much weaker.
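As a concrete illustration of the level condition in Theorem 2.1 (a sketch added here, not from the paper), one admissible choice is the constant-in-$i$ level $u_{ni} = u_n$ defined by $n(1 - \Phi(u_n)) = \tau$, for which $\sum_{i=1}^{n}(1 - \Phi(u_{ni})) = \tau$ exactly and $u_n \sim (2\log n)^{1/2}$, so that $\min_i u_{ni} \ge c(\log n)^{1/2}$ holds for large $n$ with a suitable $c > 0$:

    import numpy as np
    from scipy.stats import norm

    tau = 1.0                                    # hypothetical target value of the sum
    for n in (10**2, 10**4, 10**6):
        u_n = norm.isf(tau / n)                  # solves 1 - Phi(u_n) = tau / n
        print(n, u_n, u_n / np.sqrt(np.log(n)))  # the ratio slowly approaches sqrt(2)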

3. Proof

First, we introduce the following lemmas, which will be used to prove our main result.

Lemma 3.1. Under the assumptions of Theorem 2.1, one has
\[
\sum_{1 \le i < j \le n} r_{ij} \exp\Bigl( -\frac{u_{ni}^2 + u_{nj}^2}{2(1 + r_{ij})} \Bigr) \ll \frac{1}{(\log\log n)^{1+\varepsilon}}. \tag{3.1}
\]

Proof. This lemma comes from Chen and Lin [6].


The following lemma is Theorem 2.1 and Corollary 2.1 in Li and Shao [9].

Lemma 3.2. (1) Let $\{\xi_n\}$ and $\{\eta_n\}$ be sequences of standard Gaussian variables with covariance matrices $R^{(1)} = (r_{ij}^{(1)})$ and $R^{(0)} = (r_{ij}^{(0)})$, respectively. Put $\rho_{ij} = \max(|r_{ij}^{(1)}|, |r_{ij}^{(0)}|)$. Then one has
\[
P\Bigl( \bigcap_{j=1}^{n} \{\xi_j \le u_j\} \Bigr) - P\Bigl( \bigcap_{j=1}^{n} \{\eta_j \le u_j\} \Bigr)
\le \frac{1}{2\pi} \sum_{1 \le i < j \le n} \Bigl( \arcsin r_{ij}^{(1)} - \arcsin r_{ij}^{(0)} \Bigr)^{+} \exp\Bigl( -\frac{u_i^2 + u_j^2}{2(1 + \rho_{ij})} \Bigr), \tag{3.2}
\]
for any real numbers $u_i$, $i = 1, 2, \ldots, n$.

(2) Let $\{\xi_n; n \ge 1\}$ be standard Gaussian variables with $r_{ij} = \operatorname{Cov}(\xi_i, \xi_j)$. Then
\[
\Bigl| P\Bigl( \bigcap_{j=1}^{n} \{\xi_j \le u_j\} \Bigr) - \prod_{j=1}^{n} P(\xi_j \le u_j) \Bigr|
\le \frac{1}{4} \sum_{1 \le i < j \le n} |r_{ij}| \exp\Bigl( -\frac{u_i^2 + u_j^2}{2(1 + |r_{ij}|)} \Bigr), \tag{3.3}
\]
for any real numbers $u_i$, $i = 1, 2, \ldots, n$.
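The normal comparison bound (3.3) is easy to check numerically on a small example. The following sketch (added for illustration; the dimension, correlation, and levels are arbitrary choices) compares the joint probability under a common positive correlation with the product of the marginals and checks that the gap stays below the right-hand side of (3.3):

    import numpy as np
    from scipy.stats import multivariate_normal, norm

    r = 0.3                                    # common pairwise correlation (example value)
    cov = np.array([[1.0, r, r],
                    [r, 1.0, r],
                    [r, r, 1.0]])
    u = np.array([1.0, 1.5, 2.0])              # levels u_1, u_2, u_3

    joint = multivariate_normal(mean=np.zeros(3), cov=cov).cdf(u)
    product = np.prod(norm.cdf(u))

    # Right-hand side of (3.3)
    bound = 0.0
    for i in range(3):
        for j in range(i + 1, 3):
            bound += abs(cov[i, j]) * np.exp(-(u[i]**2 + u[j]**2) / (2.0 * (1.0 + abs(cov[i, j]))))
    bound *= 0.25

    print("|joint - product| =", abs(joint - product))
    print("bound from (3.3)  =", bound)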

Lemma 3.3. Let $\{X_n\}$ be a sequence of standard Gaussian variables satisfying the conditions of Theorem 2.1. Then, for $1 \le k < n$, one has
\[
\Bigl| P\Bigl( \bigcap_{i=k+1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr) - P\Bigl( \bigcap_{i=1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr) \Bigr|
\ll \frac{k}{n} + \frac{C}{(\log\log n)^{1+\varepsilon}} \tag{3.4}
\]
for any $y \in \mathbb{R}$.

Proof. By the conditions of Theorem 2.1, we have
\[
\sigma_n^2 = n + 2 \sum_{1 \le i < j \le n} r_{ij} \ge n, \tag{3.5}
\]
and then, for $1 \le i \le n$, by $\sup_{s \ge n} \sum_{i=s-n}^{s-1} \rho_i \ll (\log n)^{1/2} / (\log\log n)^{1+\varepsilon}$, $\varepsilon > 0$, it follows that
\[
\operatorname{Cov}\Bigl( X_i, \frac{S_n}{\sigma_n} \Bigr) \le \frac{1}{\sqrt{n}} \Bigl( 1 + \sum_{j=1}^{n} \rho_j \Bigr) \ll \frac{(\log n)^{1/2}}{\sqrt{n}\, (\log\log n)^{1+\varepsilon}}. \tag{3.6}
\]
Then there exist numbers $\delta$ and $n_0$ such that, for any $n > n_0$, we have
\[
\sup_{1 \le i \le n} \operatorname{Cov}\Bigl( X_i, \frac{S_n}{\sigma_n} \Bigr) < \delta < 1. \tag{3.7}
\]


We can write
\[
\begin{aligned}
L &:= \Bigl| P\Bigl( \bigcap_{i=k+1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr) - P\Bigl( \bigcap_{i=1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr) \Bigr| \\
&\le \Bigl| P\Bigl( \bigcap_{i=k+1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr) - P\Bigl( \bigcap_{i=k+1}^{n} \{X_i \le u_{ni}\} \Bigr) P( Y_n \le y ) \Bigr| \\
&\quad + \Bigl| P\Bigl( \bigcap_{i=1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr) - P\Bigl( \bigcap_{i=1}^{n} \{X_i \le u_{ni}\} \Bigr) P( Y_n \le y ) \Bigr| \\
&\quad + \Bigl| P\Bigl( \bigcap_{i=k+1}^{n} \{X_i \le u_{ni}\} \Bigr) - P\Bigl( \bigcap_{i=1}^{n} \{X_i \le u_{ni}\} \Bigr) \Bigr| \\
&:= L_1 + L_2 + L_3,
\end{aligned} \tag{3.8}
\]
where $Y_n$ is a random variable which has the same distribution as $S_n/\sigma_n$ but is independent of $X_1, X_2, \ldots, X_n$. For $L_1$ and $L_2$, apply Lemma 3.2(1) with $\xi_i = X_i$, $i = 1, \ldots, n$, $\xi_{n+1} = S_n/\sigma_n$, and $\eta_j = X_j$, $j = 1, \ldots, n$, $\eta_{n+1} = Y_n$. Then $r_{ij}^{(1)} = r_{ij}^{(0)} = r_{ij}$ for $1 \le i < j \le n$, and $r_{ij}^{(1)} = \operatorname{Cov}(X_i, S_n/\sigma_n)$, $r_{ij}^{(0)} = 0$ for $1 \le i \le n$, $j = n+1$. Thus we have, for $i = 1, 2$,
\[
L_i \ll \sum_{i=1}^{n} \operatorname{Cov}\Bigl( X_i, \frac{S_n}{\sigma_n} \Bigr) \exp\Bigl( -\frac{u_{ni}^2 + y^2}{2\bigl(1 + \operatorname{Cov}(X_i, S_n/\sigma_n)\bigr)} \Bigr). \tag{3.9}
\]

Since (3.5)–(3.7) hold, we obtain
\[
L_i \ll \frac{(\log n)^{1/2}}{\sqrt{n}\, (\log\log n)^{1+\varepsilon}} \sum_{i=1}^{n} \exp\Bigl( -\frac{u_{ni}^2}{2(1+\delta)} \Bigr). \tag{3.10}
\]

Now define $u_n$ by $1 - \Phi(u_n) = 1/n$. By the well-known fact
\[
1 - \Phi(x) \sim \frac{\phi(x)}{x}, \quad x \to \infty, \tag{3.11}
\]
it is easy to see that
\[
\exp\Bigl( \frac{u_n^2}{2} \Bigr) \sim \frac{n}{\sqrt{2\pi}\, u_n}, \qquad u_n \sim \sqrt{2 \log n}. \tag{3.12}
\]

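For the reader's convenience, the short computation behind (3.12) (spelled out here; it uses only the definition of $u_n$ and (3.11)) is
\[
\frac{1}{n} = 1 - \Phi(u_n) \sim \frac{\phi(u_n)}{u_n} = \frac{\exp(-u_n^2/2)}{\sqrt{2\pi}\, u_n}
\quad \Longrightarrow \quad
\exp\Bigl( \frac{u_n^2}{2} \Bigr) \sim \frac{n}{\sqrt{2\pi}\, u_n},
\]
and taking logarithms gives $u_n^2/2 = \log n - \log(\sqrt{2\pi}\, u_n) + o(1)$, whose second term is of smaller order, so $u_n \sim (2\log n)^{1/2}$.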

Thus, according to the assumption $\min_{1 \le i \le n} u_{ni} \ge c (\log n)^{1/2}$, we have $u_{ni} \ge c\, u_n$ for some $c > 0$. Hence
\[
\begin{aligned}
L_i &\ll \frac{(\log n)^{1/2}}{\sqrt{n}\, (\log\log n)^{1+\varepsilon}} \sum_{1 \le i \le n} \exp\Bigl( -\frac{u_{ni}^2}{2(1+\delta)} \Bigr)
\ll \frac{\sqrt{n}\, (\log n)^{1/2}}{(\log\log n)^{1+\varepsilon}} \exp\Bigl( -\frac{u_n^2}{2(1+\delta)} \Bigr) \\
&\ll \frac{(\log n)^{(2+\delta)/(2(1+\delta))}}{n^{1/(1+\delta) - 1/2}\, (\log\log n)^{1+\varepsilon}}
\ll \frac{1}{n^{\delta_1}}, \qquad \text{for some } \delta_1 > 0.
\end{aligned} \tag{3.13}
\]

3.13

Now, we are in a position to estimateL3 Observe that

L3 P

n

ik 1

{X i ≤ u ni}



− P

n

i1

{X i ≤ u ni}



≤



P

n

ik 1

{X i ≤ u ni}



− n

ik 1 Φu ni









P

n

i1

{X i ≤ u ni}



−n

i1 Φu ni



 





n



ik 1

Φu ni −n

i1

Φu ni





: L31 L32 L33.

3.14

ForL33, it follows that

L33 n

ik 1

Φu ni

1−k

i1

Φu ni



1 − Φk u n

 1 −



1−n1

k

k n

3.15

By Lemma 3.2(2), we have
\[
L_{3i} \le \frac{1}{4} \sum_{1 \le i < j \le n} r_{ij} \exp\Bigl( -\frac{u_{ni}^2 + u_{nj}^2}{2(1 + r_{ij})} \Bigr), \qquad i = 1, 2. \tag{3.16}
\]
Thus, by Lemma 3.1, we obtain the desired result.


Lemma 3.4. Let $\{X_n\}$ be a sequence of standard Gaussian variables satisfying the conditions of Theorem 2.1. Then, for $1 \le k < n$ and any $y \in \mathbb{R}$, one has
\[
\Bigl| \operatorname{Cov}\Bigl( I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le y \Bigr\},\ I\Bigl\{ \bigcap_{i=k+1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr\} \Bigr) \Bigr|
\ll \Bigl( \frac{k}{n} \Bigr)^{1/2} \frac{(\log n)^{1/2}}{(\log\log n)^{1+\varepsilon}} + \frac{1}{(\log\log n)^{1+\varepsilon}}. \tag{3.17}
\]

Proof. Apply Lemma 3.2(1) with $\xi_i = X_i$, $1 \le i \le k$, $\xi_{k+1} = S_k/\sigma_k$, $\xi_{i+1} = X_i$, $k+1 \le i \le n$, $\xi_{n+2} = S_n/\sigma_n$, and $\eta_j = \xi_j$, $1 \le j \le k+1$, $\eta_j = \xi_j'$, $k+2 \le j \le n+2$, where $(\xi_{k+2}', \ldots, \xi_{n+2}')$ has the same distribution as $(\xi_{k+2}, \ldots, \xi_{n+2})$ but is independent of $(\xi_1, \ldots, \xi_{k+1})$. Then
\[
\begin{aligned}
& r_{ij}^{(1)} = r_{ij}^{(0)} && \text{for } 1 \le i < j \le k+1 \text{ or } k+2 \le i < j \le n+2; \\
& r_{ij}^{(1)} = r_{i,\,j-1},\quad r_{ij}^{(0)} = 0 && \text{for } 1 \le i \le k,\ k+2 \le j \le n+1; \\
& r_{ij}^{(1)} = \operatorname{Cov}\Bigl( X_i, \frac{S_n}{\sigma_n} \Bigr),\quad r_{ij}^{(0)} = 0 && \text{for } 1 \le i \le k,\ j = n+2; \\
& r_{ij}^{(1)} = \operatorname{Cov}\Bigl( X_i, \frac{S_k}{\sigma_k} \Bigr),\quad r_{ij}^{(0)} = 0 && \text{for } k+1 \le i \le n,\ j = k+1; \\
& r_{ij}^{(1)} = \operatorname{Cov}\Bigl( \frac{S_k}{\sigma_k}, \frac{S_n}{\sigma_n} \Bigr),\quad r_{ij}^{(0)} = 0 && \text{for } i = k+1,\ j = n+2.
\end{aligned} \tag{3.18}
\]

Thus, combined with (3.5) and (3.7), it follows that
\[
\begin{aligned}
&\Bigl| \operatorname{Cov}\Bigl( I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le y \Bigr\},\ I\Bigl\{ \bigcap_{i=k+1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr\} \Bigr) \Bigr| \\
&\quad = \Bigl| P\Bigl( \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \bigcap_{i=k+1}^{n} \{X_i \le u_{ni}\},\ \frac{S_k}{\sigma_k} \le y,\ \frac{S_n}{\sigma_n} \le y \Bigr) \\
&\qquad - P\Bigl( \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le y \Bigr) P\Bigl( \bigcap_{i=k+1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr) \Bigr| \\
&\quad \le \frac{1}{4} \sum_{1 \le i \le k}\, \sum_{k+1 \le j \le n} r_{ij} \exp\Bigl( -\frac{u_{ki}^2 + u_{nj}^2}{2(1 + r_{ij})} \Bigr)
+ \frac{1}{4} \sum_{i=1}^{k} \operatorname{Cov}\Bigl( X_i, \frac{S_n}{\sigma_n} \Bigr) \exp\Bigl( -\frac{u_{ki}^2 + y^2}{2\bigl(1 + \operatorname{Cov}(X_i, S_n/\sigma_n)\bigr)} \Bigr) \\
&\qquad + \frac{1}{4} \sum_{i=k+1}^{n} \operatorname{Cov}\Bigl( X_i, \frac{S_k}{\sigma_k} \Bigr) \exp\Bigl( -\frac{u_{ni}^2 + y^2}{2\bigl(1 + \operatorname{Cov}(X_i, S_k/\sigma_k)\bigr)} \Bigr)
+ \frac{1}{4} \operatorname{Cov}\Bigl( \frac{S_k}{\sigma_k},\ \frac{S_n}{\sigma_n} \Bigr) \\
&\quad \le \frac{1}{4} \sum_{1 \le i \le k}\, \sum_{k+1 \le j \le n} r_{ij} \exp\Bigl( -\frac{u_{ki}^2 + u_{nj}^2}{2(1 + r_{ij})} \Bigr)
+ \frac{1}{4} \sum_{i=1}^{k} \operatorname{Cov}\Bigl( X_i, \frac{S_n}{\sigma_n} \Bigr) \exp\Bigl( -\frac{u_{ki}^2}{2(1+\delta)} \Bigr) \\
&\qquad + \frac{1}{4} \sum_{i=k+1}^{n} \operatorname{Cov}\Bigl( X_i, \frac{S_k}{\sigma_k} \Bigr) \exp\Bigl( -\frac{u_{ni}^2}{2(1+\delta)} \Bigr)
+ \frac{1}{4} \operatorname{Cov}\Bigl( \frac{S_k}{\sigma_k},\ \frac{S_n}{\sigma_n} \Bigr) \\
&\quad := T_1 + T_2 + T_3 + T_4.
\end{aligned} \tag{3.19}
\]


Using Lemma 3.1, we have
\[
T_1 \ll \frac{1}{(\log\log n)^{1+\varepsilon}}, \qquad \varepsilon > 0. \tag{3.20}
\]

By the similar technique that was applied to prove (3.10) and (3.13), we obtain
\[
T_2 \ll \frac{1}{n^{\beta_1}}, \qquad \text{for some } \beta_1 > 0. \tag{3.21}
\]

For $T_3$, by $\sup_{s \ge n} \sum_{i=s-n}^{s-1} \rho_i \ll (\log n)^{1/2} / (\log\log n)^{1+\varepsilon}$, $\varepsilon > 0$, and (3.12), we have
\[
\begin{aligned}
T_3 &\ll \exp\Bigl( -\frac{u_n^2}{2(1+\delta)} \Bigr) \sum_{i=k+1}^{n} \operatorname{Cov}\Bigl( X_i, \frac{S_k}{\sigma_k} \Bigr)
\ll \frac{1}{n^{1/(1+\delta)}} \sum_{i=k+1}^{n} \operatorname{Cov}\Bigl( X_i, \frac{S_k}{\sigma_k} \Bigr) \\
&\ll \frac{1}{n^{1/(1+\delta)}} \frac{1}{\sqrt{k}} \sum_{i=k+1}^{n} \operatorname{Cov}( X_i, S_k )
\ll \frac{1}{n^{1/(1+\delta)}} \frac{1}{\sqrt{k}} \sum_{j=1}^{k}\, \sum_{i=k+1}^{n} \operatorname{Cov}( X_i, X_j ) \\
&\ll \frac{1}{n^{1/(1+\delta)}} \frac{1}{\sqrt{k}} \sum_{j=1}^{k}\, \sum_{i=1}^{n} \rho_i
\ll \frac{\sqrt{k}}{n^{1/(1+\delta)}}\, \frac{(\log n)^{1/2}}{(\log\log n)^{1+\varepsilon}}
\ll \frac{1}{n^{\beta}}, \qquad \beta > 0. \tag{3.22}
\end{aligned}
\]

As to $T_4$, by (3.5) and (3.6), we have
\[
T_4 \ll \frac{1}{\sigma_k} \sum_{i=1}^{k} \operatorname{Cov}\Bigl( X_i, \frac{S_n}{\sigma_n} \Bigr)
\ll \Bigl( \frac{k}{n} \Bigr)^{1/2} \frac{(\log n)^{1/2}}{(\log\log n)^{1+\varepsilon}}. \tag{3.23}
\]
Thus the proof of this lemma is completed.

Proof of Theorem 2.1. First, by the assumptions and Theorem 6.1.3 in Leadbetter et al. [10], we have
\[
P\Bigl( \bigcap_{i=1}^{n} \{X_i \le u_{ni}\} \Bigr) \longrightarrow e^{-\tau}. \tag{3.24}
\]
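As an aside (added here for intuition; it is not part of the original text), in the special case of independent $X_i$ the convergence (3.24) is elementary:
\[
P\Bigl( \bigcap_{i=1}^{n} \{X_i \le u_{ni}\} \Bigr) = \prod_{i=1}^{n} \Phi(u_{ni})
= \exp\Bigl( \sum_{i=1}^{n} \log\bigl( 1 - (1 - \Phi(u_{ni})) \bigr) \Bigr)
= \exp\Bigl( -\sum_{i=1}^{n} \bigl( 1 - \Phi(u_{ni}) \bigr) + o(1) \Bigr) \longrightarrow e^{-\tau},
\]
since $\max_{1 \le i \le n} (1 - \Phi(u_{ni})) \to 0$ under $\min_{1 \le i \le n} u_{ni} \ge c(\log n)^{1/2}$. The cited Theorem 6.1.3 in Leadbetter et al. [10] is what handles the dependent case under the stated covariance conditions.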


Let $Y_n$ denote a random variable which has the same distribution as $S_n/\sigma_n$ but is independent of $X_1, X_2, \ldots, X_n$; then by (3.10) we derive
\[
P\Bigl( \bigcap_{i=1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr) - P\Bigl( \bigcap_{i=1}^{n} \{X_i \le u_{ni}\} \Bigr) P( Y_n \le y ) \longrightarrow 0, \quad \text{as } n \to \infty. \tag{3.25}
\]
Thus, by the standard normal property of $Y_n$, we have
\[
\lim_{n \to \infty} P\Bigl( \bigcap_{i=1}^{n} \{X_i \le u_{ni}\},\ \frac{S_n}{\sigma_n} \le y \Bigr) = e^{-\tau} \Phi(y), \qquad y \in \mathbb{R}. \tag{3.26}
\]

Hence, to complete the proof, it is sufficient to show that
\[
\lim_{n \to \infty} \frac{1}{\log n} \sum_{k=1}^{n} \frac{1}{k} \Bigl[ I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le x \Bigr\} - P\Bigl( \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le x \Bigr) \Bigr] = 0 \quad \text{a.s.} \tag{3.27}
\]

In order to show this, by Lemma 3.1 in Csáki and Gonchigdanzan [5], we only need to prove
\[
\operatorname{Var}\Bigl( \frac{1}{\log n} \sum_{k=1}^{n} \frac{1}{k}\, I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le x \Bigr\} \Bigr) \ll \frac{1}{(\log\log n)^{1+\varepsilon}}, \tag{3.28}
\]
for $\varepsilon > 0$ and any $x \in \mathbb{R}$. Let $\eta_k = I\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ S_k/\sigma_k \le x \} - P\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ S_k/\sigma_k \le x \}$.

Then
\[
\begin{aligned}
\operatorname{Var}\Bigl( \frac{1}{\log n} \sum_{k=1}^{n} \frac{1}{k}\, I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le x \Bigr\} \Bigr)
&= E\Bigl( \frac{1}{\log n} \sum_{k=1}^{n} \frac{1}{k}\, \eta_k \Bigr)^{2} \\
&\le \frac{1}{\log^2 n} \sum_{k=1}^{n} \frac{1}{k^2}\, E \eta_k^2
+ \frac{2}{\log^2 n} \sum_{1 \le k < l \le n} \frac{ | E( \eta_k \eta_l ) | }{kl} \\
&:= S_1 + S_2.
\end{aligned} \tag{3.29}
\]
Since $|\eta_k| \le 2$, it follows that
\[
S_1 \ll \frac{1}{\log^2 n}. \tag{3.30}
\]


Now we turn to estimate $S_2$. Observe that, for $l > k$,
\[
\begin{aligned}
| E( \eta_k \eta_l ) | &= \Bigl| \operatorname{Cov}\Bigl( I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le x \Bigr\},\ I\Bigl\{ \bigcap_{i=1}^{l} \{X_i \le u_{li}\},\ \frac{S_l}{\sigma_l} \le x \Bigr\} \Bigr) \Bigr| \\
&\le \Bigl| \operatorname{Cov}\Bigl( I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le x \Bigr\},\ I\Bigl\{ \bigcap_{i=1}^{l} \{X_i \le u_{li}\},\ \frac{S_l}{\sigma_l} \le x \Bigr\} - I\Bigl\{ \bigcap_{i=k+1}^{l} \{X_i \le u_{li}\},\ \frac{S_l}{\sigma_l} \le x \Bigr\} \Bigr) \Bigr| \\
&\quad + \Bigl| \operatorname{Cov}\Bigl( I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le x \Bigr\},\ I\Bigl\{ \bigcap_{i=k+1}^{l} \{X_i \le u_{li}\},\ \frac{S_l}{\sigma_l} \le x \Bigr\} \Bigr) \Bigr| \\
&\le E\, \Bigl| I\Bigl\{ \bigcap_{i=1}^{l} \{X_i \le u_{li}\},\ \frac{S_l}{\sigma_l} \le x \Bigr\} - I\Bigl\{ \bigcap_{i=k+1}^{l} \{X_i \le u_{li}\},\ \frac{S_l}{\sigma_l} \le x \Bigr\} \Bigr| \\
&\quad + \Bigl| \operatorname{Cov}\Bigl( I\Bigl\{ \bigcap_{i=1}^{k} \{X_i \le u_{ki}\},\ \frac{S_k}{\sigma_k} \le x \Bigr\},\ I\Bigl\{ \bigcap_{i=k+1}^{l} \{X_i \le u_{li}\},\ \frac{S_l}{\sigma_l} \le x \Bigr\} \Bigr) \Bigr| \\
&:= S_{21} + S_{22}.
\end{aligned} \tag{3.31}
\]

By Lemma 3.3, we have
\[
S_{21} \le \frac{k}{l} + \frac{C}{(\log\log l)^{1+\varepsilon}}. \tag{3.32}
\]
Using Lemma 3.4, it follows that
\[
S_{22} \le \Bigl( \frac{k}{l} \Bigr)^{1/2} \frac{(\log l)^{1/2}}{(\log\log l)^{1+\varepsilon}} + \frac{C}{(\log\log l)^{1+\varepsilon}}. \tag{3.33}
\]
Hence, for $l > k$, we have
\[
| E( \eta_k \eta_l ) | \le \frac{k}{l} + \frac{C}{(\log\log l)^{1+\varepsilon}} + \Bigl( \frac{k}{l} \Bigr)^{1/2} \frac{(\log l)^{1/2}}{(\log\log l)^{1+\varepsilon}}. \tag{3.34}
\]


Consequently,
\[
\begin{aligned}
S_2 &\ll \frac{1}{\log^2 n} \Biggl( \sum_{1 \le k < l \le n} \frac{1}{kl} \Bigl( \frac{k}{l} + \Bigl( \frac{k}{l} \Bigr)^{1/2} \frac{(\log l)^{1/2}}{(\log\log l)^{1+\varepsilon}} \Bigr) + \sum_{1 \le k < l \le n} \frac{1}{kl\, (\log\log l)^{1+\varepsilon}} \Biggr) \\
&\ll \frac{1}{\log^2 n} \sum_{1 \le k < l \le n} \frac{1}{l^2}
+ \frac{(\log n)^{1/2}}{\log^2 n} \sum_{l=2}^{n} \frac{1}{l^{3/2} (\log\log l)^{1+\varepsilon}} \sum_{k=1}^{l-1} \frac{1}{\sqrt{k}}
+ \frac{1}{\log^2 n} \sum_{l=3}^{n} \frac{1}{l\, (\log\log l)^{1+\varepsilon}} \sum_{k=1}^{l-1} \frac{1}{k} \\
&\ll \frac{1}{\log n} + \frac{1}{(\log n)^{1/2} (\log\log n)^{1+\varepsilon}}
+ \frac{1}{\log^2 n} \sum_{l=3}^{n} \frac{\log l}{l\, (\log\log l)^{1+\varepsilon}} \\
&\ll \frac{1}{\log n} + \frac{1}{(\log\log n)^{1+\varepsilon}}.
\end{aligned} \tag{3.35}
\]

Thus we complete the proof of (3.28) by (3.30) and (3.35), and our main result is proved.
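To close, here is a small numerical illustration of (2.1) (a sketch added for illustration, not part of the original article). It uses a stationary AR(1)-type sequence with standard normal marginals and covariances $r_{ij} = \varphi^{|i-j|} \ge 0$, which satisfies the assumptions of Theorem 2.1, together with the common levels $u_{ki} = u_k$ defined by $k(1 - \Phi(u_k)) = \tau$. The parameter values are arbitrary, and because the convergence is logarithmic the printed average only fluctuates slowly around $e^{-\tau}\Phi(x)$.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    phi, tau, x, n = 0.4, 0.5, 0.5, 100_000       # illustrative parameter choices

    # Standardized Gaussian sequence with Cov(X_i, X_j) = phi^{|i-j|} >= 0.
    X = np.empty(n)
    X[0] = rng.standard_normal()
    eps = rng.standard_normal(n)
    for t in range(1, n):
        X[t] = phi * X[t - 1] + np.sqrt(1.0 - phi**2) * eps[t]

    k = np.arange(1, n + 1)
    S = np.cumsum(X)
    M = np.maximum.accumulate(X)                  # M_k = max_{1 <= i <= k} X_i

    # sigma_k^2 = Var(S_k) = k + 2 * sum_{j=1}^{k-1} (k - j) * phi^j, built incrementally.
    tail = np.cumsum(phi ** np.arange(1, n))      # sum_{j=1}^{m} phi^j for m = 1, ..., n-1
    sigma = np.sqrt(k + 2.0 * np.concatenate(([0.0], np.cumsum(tail[: n - 1]))))

    u = norm.isf(tau / k)                         # common levels: k * (1 - Phi(u_k)) = tau

    indicator = ((M <= u) & (S / sigma <= x)).astype(float)
    log_average = np.sum(indicator / k) / np.log(n)

    print("log average     :", log_average)
    print("e^{-tau} Phi(x) :", np.exp(-tau) * norm.cdf(x))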

Acknowledgments

The author thanks the referees for pointing out some errors in a previous version, as well as for several comments that have led to improvements in this paper. The author would also like to thank Professor Zuoxiang Peng of Southwest University in China for his help. The paper has been supported by the Young Excellent Talent Foundation of Huaiyin Normal University.

References

[1] G. A. Brosamler, "An almost everywhere central limit theorem," Mathematical Proceedings of the Cambridge Philosophical Society, vol. 104, no. 3, pp. 561–574, 1988.

[2] P. Schatte, "On strong versions of the central limit theorem," Mathematische Nachrichten, vol. 137, pp. 249–256, 1988.

[3] I. Fahrner and U. Stadtmüller, "On almost sure max-limit theorems," Statistics & Probability Letters, vol. 37, no. 3, pp. 229–236, 1998.

[4] S. Cheng, L. Peng, and Y. Qi, "Almost sure convergence in extreme value theory," Mathematische Nachrichten, vol. 190, pp. 43–50, 1998.

[5] E. Csáki and K. Gonchigdanzan, "Almost sure limit theorems for the maximum of stationary Gaussian sequences," Statistics & Probability Letters, vol. 58, no. 2, pp. 195–203, 2002.

[6] S. Chen and Z. Lin, "Almost sure max-limits for nonstationary Gaussian sequence," Statistics & Probability Letters, vol. 76, no. 11, pp. 1175–1184, 2006.

[7] M. Dudziński, "The almost sure central limit theorems in the joint version for the maxima and sums of certain stationary Gaussian sequences," Statistics & Probability Letters, vol. 78, no. 4, pp. 347–357, 2008.

[8] M. Dudziński, "An almost sure limit theorem for the maxima and sums of stationary Gaussian sequences," Probability and Mathematical Statistics, vol. 23, no. 1, pp. 139–152, 2003.

[9] W. V. Li and Q. Shao, "A normal comparison inequality and its applications," Probability Theory and Related Fields, vol. 122, no. 4, pp. 494–508, 2002.

[10] M. R. Leadbetter, G. Lindgren, and H. Rootzén, Extremes and Related Properties of Random Sequences and Processes, Springer Series in Statistics, Springer, New York, NY, USA, 1983.
