BOUNDS ON THE BUFFER OCCUPANCY PROBABILITY WITH SELF-SIMILAR INPUT TRAFFIC
N. LIKHANOV
Institute for Problems of Information Transmission,
Russian Academy of Science, Moscow, Russia
8.1 INTRODUCTION
High-quality traffic measurements indicate that actual traffic behavior over high-speed networks shows self-similar features. These include an analysis of hundreds of millions of observed packets on several Ethernet LANs [7, 8], and an analysis of a few million observed frames of data generated by variable bit rate (VBR) video services [1]. In these studies, packet traffic appears to be statistically self-similar [2, 11]. Self-similar traffic is characterized by "burstiness" across an extremely wide range of time scales [7]. This behavior of aggregate Ethernet traffic is very different from that of conventional traffic models (e.g., Poisson, batch Poisson, Markov modulated Poisson process [4]).
Many studies have addressed the design, control, and performance of high-speed and cell-relay networks using traditional traffic models. It is likely that many of those results need major revision when self-similar traffic models are considered [18].
Self-similarity manifests itself in a variety of different ways: a spectral density that diverges at the origin, a nonsummable autocorrelation function (indicating long-range dependence), an index of dispersion of counts (IDC) that increases monotonically with the sample time T, and so on [7]. A key parameter characterizing self-similar processes is the so-called Hurst parameter, H, which is designed to capture the degree of self-similarity.
Self-Similar Network Traffic and Performance Evaluation, Edited by Kihong Park and Walter Willinger. Print ISBN 0-471-31974-0; Electronic ISBN 0-471-20644-X. Copyright © 2000 by John Wiley & Sons, Inc.
Self-similar process models can be derived in different ways. One way is to construct the self-similar process as a sum of independent sources with a special form of the autocorrelation function. If we let the peak rate of each source go to zero as the number of sources goes to infinity, models like those of Mandelbrot [11] and Taqqu and Lévy [15] are obtained. Queueing analysis for these kinds of processes is given in Chapters 4 and 5 in this volume. Another approach is to consider on/off sources with constant peak rate, while the number of sources goes to infinity. In this way, we obtain self-similar processes with sessions arriving as Poisson r.v.'s [9]. Originally this process was proposed by Cox [2], and queueing analysis was done recently by many authors [3, 6, 9, 10, 13, 17]. The main results of these papers are presented in this volume. In Chapter 9 we can find a complete overview of this topic. Chapters 7 and 10 present some particular results for the above model, as well as results for the model with a finite number of on/off sources. Models close to on/off processes arriving as Poisson r.v.'s are considered in Chapter 11. In Chapter 6 a queueing system with instantaneous arrivals is given.

In this chapter we will find the class of all self-similar processes with independent sessions arriving as Poisson r.v.'s. For the particular case of the Cox model, we will find asymptotic bounds for the buffer overflow probability. Compare with Chapter 9, Section 10.4 of Chapter 10, and Section 7.4 of Chapter 7, where asymptotic bounds are presented for a wide class of processes beyond the self-similar ones; we will focus on the self-similar case (Pareto distribution of the active period). For this case we present some new bounds, which are more accurate than the best-known current bounds [3, 10].

This chapter is organized in the following way. First, we give the definition of second-order self-similar traffic and some well-known, but useful, relations between variance, autocorrelation, and spectral density functions. This is followed by a construction of a class of second-order self-similar processes. Finally, asymptotic queueing behavior for a particular form of the processes from our class is analyzed.
8.2 SECOND-ORDER SELF-SIMILAR PROCESSES
We consider a discrete-time stationary stochastic process X = (..., X_{-1}, X_0, X_1, ...) with mean m = EX_i and variance σ² = Var X_i, and let x_i = X_i - m denote the centered process. The autocorrelation function of the processes x and X is

r(k) = E{(X_i - m)(X_{i+k} - m)}/σ² = E{x_i x_{i+k}}/σ², k = 0, 1, 2, ....

Since the process X is a stationary one, r(k), Var X_i, and EX_i do not depend on i.
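As a small numerical sketch (my own, not from the chapter), the sample analogue of the formula above can be checked against a process whose autocorrelation is known in closed form. The AR(1) model used here is an arbitrary short-range dependent benchmark with true r(k) = ρ^k:

```python
import numpy as np

def autocorr(x, k):
    """Sample version of r(k) = E{(X_i - m)(X_{i+k} - m)} / sigma^2."""
    x = np.asarray(x, dtype=float)
    m, var = x.mean(), x.var()
    if k == 0:
        return 1.0
    return float(np.mean((x[:-k] - m) * (x[k:] - m)) / var)

rng = np.random.default_rng(0)
rho, n = 0.7, 100_000
eps = rng.normal(size=n)
x = np.empty(n)
x[0] = eps[0]
for i in range(1, n):            # AR(1): x_i = rho * x_{i-1} + noise, so r(k) = rho^k
    x[i] = rho * x[i - 1] + eps[i]

print(autocorr(x, 1), autocorr(x, 3))   # near 0.7 and 0.7**3
```

For a self-similar process, by contrast, such sample estimates decay much more slowly in k, which is the long-range dependence discussed in the introduction.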
Consider the aggregated processes

X^(m) = (..., X_{-1}^(m), X_0^(m), X_1^(m), ...),
x^(m) = (..., x_{-1}^(m), x_0^(m), x_1^(m), ...), m = 1, 2, 3, ...,

where x_t^(m) = Σ_{i=tm+1}^{(t+1)m} x_i is the sum of m consecutive values of x, and X^(m) is defined analogously from X. The autocorrelation function of X^(m) (equivalently, of x^(m)) is

r^(m)(k) = E{(X_i^(m) - EX_i^(m))(X_{i+k}^(m) - EX_i^(m))}/Var X_i^(m) = E{x_i^(m) x_{i+k}^(m)}/Var x_i^(m).

From the stationarity of x,

Var x_0^(m) = σ² Σ_{i=1}^m Σ_{j=1}^m r(i - j) = σ² (m + 2 Σ_{i=1}^{m-1} (m - i) r(i)),  (8.1)

and therefore

Var x_0^(m+1) - Var x_0^(m) = Var x_0^(m) - Var x_0^(m-1) + 2σ² r(m).

If we define Var x_0^(0) = 0, then

r(m) = (1/(2σ²)) [Var x_0^(m+1) - 2 Var x_0^(m) + Var x_0^(m-1)].  (8.2)

The same equation for r^(m)(k) is

r^(m)(k) = (1/(2 Var x_0^(m))) [Var x_0^((k+1)m) - 2 Var x_0^(km) + Var x_0^((k-1)m)].  (8.3)

The autocorrelation function also determines the spectral density

f(ω) = (σ²/2π) Σ_{k=-∞}^∞ r(k) e^{-ikω},  (8.4)

in terms of which

Var x_0^(m) = ∫_{-π}^{π} f(ω) (sin(mω/2)/sin(ω/2))² dω.  (8.5)
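The second-difference identity (8.2) can be verified numerically against (8.1) for any summable autocorrelation function. In this sketch (my own; the choice r(k) = 0.7^k is arbitrary) the block variances are built from (8.1) and the autocorrelation is recovered via (8.2):

```python
sigma2 = 1.0

def r(k):
    # any summable autocorrelation works for the identity; 0.7^k is a toy choice
    return 0.7 ** k

def var_sum(m):
    # Eq. (8.1): Var x_0^(m) = sigma^2 * (m + 2 * sum_{i=1}^{m-1} (m - i) r(i))
    # note var_sum(0) = 0, matching the convention Var x_0^(0) = 0
    return sigma2 * (m + 2 * sum((m - i) * r(i) for i in range(1, m)))

# Eq. (8.2): r(m) recovered as a second difference of the block variances
for m in range(1, 6):
    lhs = (var_sum(m + 1) - 2 * var_sum(m) + var_sum(m - 1)) / (2 * sigma2)
    print(m, round(lhs, 6), round(r(m), 6))   # the two columns agree
```

The telescoping argument behind (8.2) is exactly what the loop exercises: each second difference of the variances isolates one new correlation term.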
Definition 8.2.1. A stationary process X = (..., X_{-1}, X_0, X_1, ...) with finite mean m = EX_i < ∞ and variance σ² = Var X_i < ∞ is called exactly second-order self-similar with parameter 0 < β < 1, if

r(k) = (1/2)[(k + 1)^{2-β} - 2k^{2-β} + (k - 1)^{2-β}]  (8.6)

for all k = 1, 2, ....

The parameter β and the Hurst parameter H are related as H = 1 - β/2, 1/2 < H < 1.

Let us discuss the given definition. Substituting Eq. (8.6) into (8.1), it is easy to see that for a self-similar process
Var x_0^(m) = σ² m^{2-β}, m = 1, 2, ....  (8.7)

From Eq. (8.2) we can easily conclude that if Var x_0^(m) = σ² m^{2-β}, then the autocorrelation function r(k) satisfies Eq. (8.6). This means that Eq. (8.7) is equivalent to Eq. (8.6) and can be used in the above definition instead of Eq. (8.6). Substituting Eq. (8.7) into (8.3), we get

r^(m)(k) = r(k), m = 1, 2, ..., k = 1, 2, ....  (8.8)
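The equivalence of (8.6) and (8.7) is easy to probe numerically. The following sketch (mine; β and σ² are arbitrary choices) feeds the autocorrelation (8.6) through the exact variance formula (8.1) and compares with σ² m^{2-β}:

```python
beta, sigma2 = 0.6, 2.0

def r(k):
    # Eq. (8.6): r(k) = ((k+1)^(2-beta) - 2 k^(2-beta) + (k-1)^(2-beta)) / 2
    return 0.5 * ((k + 1) ** (2 - beta) - 2 * k ** (2 - beta) + (k - 1) ** (2 - beta))

def var_sum(m):
    # Eq. (8.1): exact variance of the sum of m consecutive values
    return sigma2 * (m + 2 * sum((m - i) * r(i) for i in range(1, m)))

for m in (1, 2, 5, 10, 50):
    print(m, round(var_sum(m), 6), round(sigma2 * m ** (2 - beta), 6))  # Eq. (8.7)
```

The two columns coincide exactly, not just asymptotically, because the double sum in (8.1) telescopes when r has the form (8.6).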
Equation (8.8) can clarify the sense of the definition of the self-similar process: the original process X and its aggregated processes X^(m) have the same correlation structure.
The spectral density function of the self-similar process was found by Sinai [14]:

f(ω) = c(β) σ² (1 - cos ω) Σ_{j=-∞}^∞ |ω + 2πj|^{-(3-β)},  (8.9)

where c(β) is a normalizing constant. It is easy to see from Eq. (8.5) that, if the process X has spectral density function (8.9), its variance will be equal to (8.7). As ω → 0, from Eq. (8.9) we obtain

f(ω) ≈ const · ω^{-(1-β)}.

We use f(x) ≈ g(x) in the sense that lim f(x)/g(x) = 1. Now we conclude that an exactly second-order self-similar process can be defined via its autocorrelation function (8.6), variance of the sum (8.7), or spectral density function (8.9), and all these definitions are equivalent. Meanwhile, for the definition of an asymptotically second-order self-similar process, it is important to specify the kind of characteristic to be used. We will use the following definition.
Definition 8.2.2. A stationary process X = (..., X_{-1}, X_0, X_1, ...) with finite mean m = EX_i < ∞ and variance σ² = Var X_i < ∞ is called asymptotically second-order self-similar with parameter 0 < β < 1, if

lim_{m→∞} r^(m)(k) = (1/2)[(k + 1)^{2-β} - 2k^{2-β} + (k - 1)^{2-β}]  (8.10)

for all k = 1, 2, ....
The sense of this definition is that, for sufficiently large m, the processes X^(m) will have the same autocorrelation function, equal to (8.6).
As we can see from Eqs. (8.1) and (8.5), Var X_0^(m) is a double sum of the autocorrelation function or an integral of the spectral density function. These relations can clarify the behavior of r(k), Var X_0^(m), and f(ω) obtained from real traffic measurements. Usually, self-similarity of data traffic is established by analyzing the traffic variance Var X_0^(m) [5, 12], but the behavior of the variance Var X_0^(m) can be close to a self-similar one while the behavior of the autocorrelation is quite different from the theoretical one, since the autocorrelation function is a second difference of the variance Var X_0^(m). It is also important that, as can be seen from Eq. (8.5), even small harmonics at low frequency can have a dramatic influence on the behavior of Var X_0^(m) over a substantial range of m values.
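In practice, the variance-time analysis mentioned above aggregates the trace into blocks and regresses log Var(X^(m)) on log m. A minimal sketch of the estimator (my own; the function name and the i.i.d. test signal are arbitrary choices, and H = 1/2 is the expected answer for independent noise):

```python
import numpy as np

def variance_time_H(x, ms):
    """Estimate H from the slope of log Var(X^(m)) versus log m.
    Here X^(m) is the block *average*, so Var(X^(m)) ~ m^(2H - 2)."""
    x = np.asarray(x, dtype=float)
    variances = []
    for m in ms:
        n = len(x) // m
        blocks = x[: n * m].reshape(n, m).mean(axis=1)  # aggregate level m
        variances.append(blocks.var())
    slope, _ = np.polyfit(np.log(ms), np.log(variances), 1)
    return 1 + slope / 2.0          # slope = 2H - 2  =>  H = 1 + slope/2

rng = np.random.default_rng(1)
iid = rng.normal(size=100_000)      # i.i.d. noise: slope -1, so H near 0.5
print(variance_time_H(iid, [1, 2, 4, 8, 16, 32]))
```

This is exactly the kind of estimate the caveat above is about: a roughly straight variance-time plot does not by itself guarantee that the autocorrelation follows (8.6).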
8.3 MODEL OF SELF-SIMILAR TRAFFIC
Consider a Poisson process on the time axis with intensity λ. Let

y = (..., y_{-1}, y_0, y_1, ...),

where y_t is the number of Poisson points in the interval [t, t + 1), t = 0, ±1, ±2, .... The random variables y_t are independent and identically distributed with Pr{y_t = k} = (λ^k/k!) e^{-λ}.
Suppose that y_t is the number of new active sources arriving to the system at moment t. To each source we assign a random variable τ_{t,i}, the length of the active period, τ_{t,i} ∈ {1, 2, ...}, t = 0, ±1, ±2, ..., i = 1, 2, ..., and a random process c_{t,i}(n), the rate of cell generation during the active period at the moment n from the beginning of the period, c_{t,i}(n) ∈ R, n = 0, 1, 2, .... The random couples (τ_{t,i}, c_{t,i}(n)) are independent and identically distributed and also independent of the process y. We define our process Y = (..., Y_{-1}, Y_0, Y_1, ...) as

Y_t ≜ Σ_{k=-∞}^{t} Σ_{j=1}^{y_k} c_{k,j}(t - k) I{τ_{k,j} > t - k},  (8.11)

where I(A) is the indicator function of the event A. If instead we consider M on/off sources and let M → ∞ so that M/(u E τ_{t,i}) → λ and Pr{u < M} → 0, then the obtained process will be statistically the same as the process Y.
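The construction (8.11) is straightforward to simulate. The sketch below (my own; the duration law on {1, ..., 10}, the unit rates c ≡ 1, and the warm-up trim are all arbitrary choices) superposes Poisson session arrivals and checks the stationary mean against λ E τ, which is what the mean formula gives when c ≡ 1:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, T = 3.0, 50_000

y = rng.poisson(lam, T)                       # y_t: new sessions arriving in slot t
starts = np.repeat(np.arange(T), y)           # one entry per session
taus = rng.integers(1, 11, size=starts.size)  # toy duration law on {1,...,10}, E tau = 5.5

Y = np.zeros(T + 10)                          # pad so the last sessions fit
for s, tau in zip(starts, taus):
    Y[s : s + tau] += 1.0                     # unit rate c = 1 while the session is active

# after a short warm-up, EY_t should match lam * E tau = 3.0 * 5.5 = 16.5
print(Y[100:T].mean())
```

Heavier-tailed choices of τ turn this same construction into the asymptotically self-similar traffic studied in the rest of the chapter.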
Let us calculate the mean, variance, and autocorrelation function of the process Y. From Eq. (8.11), considering that the variance of a sum of independent random variables is equal to the sum of the variances, and writing k_{i,1} ≜ c_{0,1}(i - 1) I{τ_{0,1} > i - 1}, we have, conditioning on the number y_0 = m of sources arriving in a slot,

Σ_{m=0}^∞ Pr{y_0 = m} [m Var k_{i,1} + m² (E k_{i,1})² + 2 S m E k_{i,1} + S²] - (λ E k_{i,1} + S)² = λ Var k_{i,1} + λ (E k_{i,1})² = λ E k²_{i,1},

where Pr{y_0 = m} is the Poisson distribution (so that E y_0 = Var y_0 = λ) and S collects the mean contribution of the other slots. From Eqs. (8.12) and (8.13) we have

σ² = Var Y_t = λ Σ_{i=1}^∞ E k²_{i,1} = λ Σ_{i=1}^∞ E c²_{0,1}(i - 1) I{τ_{0,1} > i - 1}.
Now we will calculate the autocorrelation function r(k). For any given t and k, define the following random variables: Z_1 and Z_2, the contributions to Y_t and Y_{t+k}, respectively, of the sessions still active at moment t + k, and z and w, the remaining contributions. We have

Y_t = z + Z_1, Y_{t+k} = w + Z_2.  (8.15)

Since w, z are independent random variables and also independent from Z_1, Z_2,

r(k) = (1/σ²) E(Y_t - EY_t)(Y_{t+k} - EY_{t+k}) = (1/σ²) E(Z_1 - EZ_1)(Z_2 - EZ_2).

Using the independence of Z_{1,j}, Z_{2,i} for i ≠ j, where Z_{1,i}, Z_{2,i} are the contributions of the sessions born at moment i, define

W_{1,j} ≜ c_{i,j}(t - i) I{τ_{i,j} > t + k - i},
W_{2,j} ≜ c_{i,j}(t + k - i) I{τ_{i,j} > t + k - i}.

Substituting Eq. (8.18) into (8.17) and taking into account that y_0 has a Poisson distribution (so that E y_0(y_0 - 1) = λ² and E y_0 = λ), we get

E(Z_{1,i} - EZ_{1,i})(Z_{2,i} - EZ_{2,i}) = Σ_{m=0}^∞ Pr{y_0 = m} [(m(m - 1) - λ²) EW_{1,1} EW_{2,1} + m EW_{1,1} W_{2,1}] = λ EW_{1,1} W_{2,1} = λ E c_{i,1}(t - i) c_{i,1}(t + k - i) I{τ_{i,1} > t + k - i}.

Finally, using Eq. (8.16), we have

r(k) = (λ/σ²) Σ_{i=0}^∞ E c_{0,1}(i) c_{0,1}(i + k) I{τ_{0,1} > i + k},  (8.19)

m = EY_t = λ Σ_{i=0}^∞ E c_{0,1}(i) I{τ_{0,1} > i}.  (8.20)
A self-similar process Y can be obtained from the following theorem.
Theorem 8.3.1. The process Y = (..., Y_{-1}, Y_0, Y_1, ...) defined by Eq. (8.11), with finite mean m = EY_t < ∞ and variance σ² = Var Y_t < ∞, is exactly second-order self-similar with parameter 0 < β < 1, if

Σ_{i=0}^∞ E c_{0,1}(i) c_{0,1}(i + k) I{τ_{0,1} > i + k} = (σ²/2λ) [(k + 1)^{2-β} - 2k^{2-β} + (k - 1)^{2-β}]  (8.21)

for all k = 1, 2, ....
Proof. This theorem directly follows from the definition of a self-similar process (8.6) and the expression for the autocorrelation function (8.19). ∎

Corollary 8.3.2. If the random process c_{k,j}(i) is a constant one, c_{k,j}(i) = c_{k,j} for all i = 0, 1, 2, ..., then the process Y = (..., Y_{-1}, Y_0, Y_1, ...) defined by Eq. (8.11), with finite mean m = EY_t < ∞ and variance σ² = Var Y_t < ∞, will be exactly second-order self-similar with parameter 0 < β < 1, if

Pr{τ_{0,1} > k} E[c²_{0,1} | τ_{0,1} > k] = (σ²/2λ) [-(k + 2)^{2-β} + 3(k + 1)^{2-β} - 3k^{2-β} + (k - 1)^{2-β}]  (8.22)

for all k = 0, 1, 2, ..., where, for convenience, we will use the conventions (-1)^{2-β} ≜ 1 and 0^{2-β} ≜ 0.
Proof. Since c_{0,1}(i) does not depend on i, we can write

E c_{0,1}(i) c_{0,1}(i + k) I{τ_{0,1} > i + k} = Pr{τ_{0,1} > i + k} E[c²_{0,1} | τ_{0,1} > i + k].

Substituting this into Eq. (8.21) and subtracting Eqs. (8.21) for k and k + 1, we obtain Eq. (8.22). ∎

For a nonconstant rate process, define the rate autocorrelation r_c(k) ≜ E(c_{0,1}(i) - m_c)(c_{0,1}(i + k) - m_c), where m_c ≜ E c_{0,1}(i). In the case where the random process c_{k,j}(i) ≡ 1, equations similar to (8.25) were found by Cox [2].
Now we give some results for an asymptotically self-similar process Y.

Theorem 8.3.4. The process Y = (..., Y_{-1}, Y_0, Y_1, ...) defined by Eq. (8.11), with finite mean m = EY_t < ∞ and variance σ² = Var Y_t < ∞, is asymptotically second-order self-similar with parameter 0 < β < 1, if

r(k) ≈ c_1 k^{-β} as k → ∞,  (8.26)

where c_1 and c_2 are some positive constants and 0 < β < 1. Condition (8.26) implies

Var Y_0^(m) ≈ c_2 m^{2-β} as m → ∞.  (8.27)

Substituting Eq. (8.27) into (8.3), we get the limit (8.10), so that Y is asymptotically second-order self-similar.

Corollary 8.3.5. If the random process c_{k,j}(i) is a constant one, c_{k,j}(i) = c_{k,j} for all i = 0, 1, 2, ..., then the process Y = (..., Y_{-1}, Y_0, Y_1, ...) defined by Eq. (8.11), with finite mean m = EY_t < ∞ and variance σ² = Var Y_t < ∞, will be asymptotically second-order self-similar with parameter 0 < β < 1, if

Pr{τ_{0,1} > k} E[c²_{0,1} | τ_{0,1} > k] ≈ const · k^{-(1+β)} as k → ∞,  (8.28)

or

Pr{τ_{0,1} = k} E[c²_{0,1} | τ_{0,1} = k] ≈ const · k^{-(2+β)} as k → ∞.  (8.29)
Proof. Since c_{0,1}(i) does not depend on i, from Eq. (8.19) we have

r(k) = (λ/σ²) Σ_{i=0}^∞ Pr{τ_{0,1} > i + k} E[c²_{0,1} | τ_{0,1} > i + k].

Substituting Eq. (8.28) in the above equation, we obtain

r(k) = const Σ_{i=0}^∞ (i + k)^{-(1+β)} ≈ const · k^{-β} as k → ∞.

Then from Theorem 8.3.4 it immediately follows that Y is an asymptotically second-order self-similar process.

Statement (8.29) can be proved in the same way. ∎
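The summation step in the proof above is easy to reproduce numerically. In this sketch (my own; the concrete Pareto-type tail Pr{τ > k} = (k + 1)^{-(1+β)} and the truncation of the infinite sum are assumptions), the rescaled quantity r(k) k^β settles toward the constant 1/β predicted by the integral comparison:

```python
import numpy as np

beta = 0.5
i = np.arange(2_000_000)   # truncation of the infinite sum over i

def r_unnormalized(k):
    # proportional to sum_{i>=0} Pr{tau > i + k}, with Pr{tau > k} = (k+1)^-(1+beta)
    return float(np.sum((i + k + 1.0) ** -(1.0 + beta)))

for k in (10, 20, 40, 80):
    print(k, r_unnormalized(k) * k ** beta)   # approaches 1/beta = 2 as k grows
```

The slow drift of the printed values toward 2 illustrates why fitting β from a finite range of lags is delicate in practice.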
8.4 ASYMPTOTIC BOUNDS FOR BUFFER OVERFLOW PROBABILITY
In this section we consider the process Y defined by Eq. (8.11) as the input traffic of a single-server queueing system with constant server rate equal to C and infinite buffer size. Suppose the process Y has finite mean m = EY_t < C < ∞ and finite variance σ² = Var Y_t < ∞. We will consider a particular form of the process Y; namely, the case where the random process c_{i,j}(t) ≡ 1. Let

Pr{τ_{0,1} = i} ≈ c_0 i^{-(2+β)}  (8.30)

as i → ∞, with 0 < β < 1. Then, according to Eq. (8.29) with c_{0,1} = 1, the process Y will be asymptotically second-order self-similar with Hurst parameter H = 1 - β/2. Now we are interested in the queue length behavior. Let ν_t be the length of the queue at the moment t. Then we have

ν_t = max(0, ν_{t-1} + Y_t - C).  (8.31)

We will estimate the probability Pr{ν_t > z}, that is, the stationary probability of finding, at moment t, a queue length bigger than z, for large values of z.
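The recursion (8.31) is simple to simulate, which gives a way to eyeball Pr{ν_t > z} before any bounds are applied. A sketch (my own; λ, C, z, the truncation of the duration law at 10^4, and the warm-up are all arbitrary choices) with heavy-tailed sessions as in (8.30):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, C, z, T, beta = 1.0, 2.5, 20.0, 200_000, 0.5

# heavy-tailed integer durations, Pr{tau = i} ~ i^-(2+beta), truncated for sampling
i_vals = np.arange(1, 10_001)
p = i_vals ** -(2.0 + beta)
p /= p.sum()                               # here lam * E tau ~ 1.93 < C: stable queue

y = rng.poisson(lam, T)                    # new sessions per slot
starts = np.repeat(np.arange(T), y)
taus = rng.choice(i_vals, size=starts.size, p=p)

Y = np.zeros(T + i_vals[-1])
for s, tau in zip(starts, taus):
    Y[s : s + tau] += 1.0                  # unit rate while active, as in this section

nu, exceed = 0.0, 0
for t in range(1000, T):                   # Eq. (8.31): nu_t = max(0, nu_{t-1} + Y_t - C)
    nu = max(0.0, nu + Y[t] - C)
    exceed += nu > z
print(exceed / (T - 1000))                 # empirical estimate of Pr{nu_t > z}
```

Because of the heavy tail, such estimates converge slowly and fluctuate across runs, which is precisely why the asymptotic bounds developed below are valuable.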
For any given z, let us split the process Y_t into two processes Y_t^(1) and Y_t^(2), that is,

Y_t = Y_t^(1) + Y_t^(2),

according to the following rule: the sessions whose active period does not exceed εz contribute to Y_t^(1), and the remaining (long) sessions contribute to Y_t^(2), where ε > 0 is a small constant.

First, we derive an upper bound for the probability Pr{ν_t > z}.
Proof. Let us estimate the probability of a large queue created by the short sessions. Define

q_j ≜ Σ_{i=1}^{y_j} τ_{j,i} I(τ_{j,i} ≤ εz), k̃ ≜ k - εz.

The random variables q_j are independent and identically distributed. Denote by f(s) the logarithm of the moment-generating function of the sum (8.36). As we can see from Eq. (8.20), m = Σ_{j=1}^∞ λ_j j, where λ_j ≜ λ Pr{τ_{0,1} = j}, so that Σ_{j=1}^{εz} λ_j j - m < 0 and, hence,

Σ_{j=1}^∞ λ_j (sj)² e^{sj} ≤ P_1 + P_2,

where P_1 and P_2 bound the contributions of small and large j, respectively; here we also used that εc_1 < β. Substituting this in Eq. (8.38), we have, for sufficiently large z,

P_u ≤ e^{-k̃/z} z^{d̃_2} c_1.

Finally, we get

Pr{max_{k≥1} [S_k^(2) - (C - m - δ_1)k] > z(1 - δ_2)} ≥ Pr{ν_t^(2) > z(1 - δ_2)},

where S_k^(2) is the amount of work brought by Y^(2) over k slots, m̄ = 1 + ⌊C̃⌋, C̃ = C - m - δ_1, ⌊C̃⌋ is the integer part of C̃, and a = C̃ - ⌊C̃⌋. We will consider only the case when C̃ ≠ ⌊C̃⌋, a > 0. For any given realization of the process Y, we have ν_t^(2) ≤ n̄_t^(2); it means that Pr{ν_t^(2) > z(1 - δ_2)} ≤ Pr{n̄_t^(2) > z(1 - δ_2)}. For a given moment t, define the moments
To analyze the probability Pr{n̄_t^(2) > z̃}, z̃ ≜ z(1 - δ_2), we can write

Pr{n̄_t^(2) > z̃} = Pr{B_1 ∪ B_2; n̄_t^(2) > z̃} ≤ Pr{B_2} + Pr{B_1 \ B_0} + Pr{n̄_t^(2) > z̃; B_0}.

First, let us estimate the probability Pr{B_2}. If λ E τ_{0,1} I(τ_{0,1} > εz) < 1, a standard estimate can be shown to hold as z → ∞; applying it to Eq. (8.39), we obtain

Pr{B_1} = const · z^{-βm̄} + o(z^{-βm̄}), Pr{B_2} = const · z^{-β(m̄+1)} + o(z^{-β(m̄+1)}).

Next, we estimate the probability

Pr{B_1 \ B_0} = const · z^{-βm̄+β-2} + o(z^{-βm̄+β-2}), as z → ∞.