

BOUNDS ON THE BUFFER OCCUPANCY PROBABILITY WITH SELF-SIMILAR INPUT TRAFFIC

N. LIKHANOV

Institute for Problems of Information Transmission, Russian Academy of Science, Moscow, Russia

8.1 INTRODUCTION

High-quality traffic measurements indicate that actual traffic behavior over high-speed networks shows self-similar features. These include an analysis of hundreds of millions of observed packets on several Ethernet LANs [7, 8], and an analysis of a few million observed frame data generated by variable bit rate (VBR) video services [1]. In these studies, packet traffic appears to be statistically self-similar [2, 11]. Self-similar traffic is characterized by "burstiness" across an extremely wide range of time scales [7]. This behavior of aggregate Ethernet traffic is very different from conventional traffic models (e.g., Poisson, batch Poisson, Markov modulated Poisson process [4]).

A lot of studies have been made of the design, control, and performance of high-speed and cell-relay networks using traditional traffic models. It is likely that many of those results need major revision when self-similar traffic models are considered [18].

Self-similarity manifests itself in a variety of different ways: a spectral density that diverges at the origin, a nonsummable autocorrelation function (indicating long-range dependence), an index of dispersion of counts (IDC) that increases monotonically with the sample time T, and so on [7]. A key parameter characterizing self-similar processes is the so-called Hurst parameter, H, which is designed to capture the degree of self-similarity.
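As a concrete illustration of one of these statistics (not taken from the chapter), the short sketch below computes the index of dispersion of counts, IDC(T) = Var[N(T)]/E[N(T)], for a packet-count trace binned at unit time; the function name and the Poisson test trace are illustrative choices.

```python
# Illustration: the index of dispersion of counts, IDC(T) = Var[N(T)] / E[N(T)],
# computed from a count sequence binned at unit time.  For Poisson traffic it
# stays near 1; for self-similar traffic it keeps growing with T.
import numpy as np

def idc(counts, T):
    n = (len(counts) // T) * T
    windows = counts[:n].reshape(-1, T).sum(axis=1)   # counts over windows of length T
    return windows.var() / windows.mean()

rng = np.random.default_rng(0)
poisson_trace = rng.poisson(5.0, size=100_000)
for T in (1, 10, 100, 1000):
    print(T, round(idc(poisson_trace, T), 3))
```

For a self-similar trace the same computation produces an IDC that keeps growing, roughly like $T^{2H-1}$, instead of flattening out near 1.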

Self-Similar Network Traffic and Performance Evaluation, edited by Kihong Park and Walter Willinger. ISBN 0-471-31974-0. Copyright © 2000 by John Wiley & Sons, Inc.


Self-similar process models can be derived in different ways. One way is to construct the self-similar process as a sum of independent sources with a special form of the autocorrelation function. If we let the peak rate of each source go to zero as the number of sources goes to infinity, models like those of Mandelbrot [11] and Taqqu and Lévy [15] are obtained. Queueing analysis for these kinds of processes is given in Chapters 4 and 5 in this volume. Another approach is to consider on/off sources with constant peak rate, while the number of sources goes to infinity. In this way, we obtain self-similar processes with sessions arriving as Poisson r.v.'s [9]. Originally this process was proposed by Cox [2], and queueing analysis was done recently by many authors [3, 6, 9, 10, 13, 17]. The main results of these papers are presented in this volume. In Chapter 9 we can find a complete overview of this topic. Chapters 7 and 10 present some particular results for the above model, as well as results for the model with a finite number of on/off sources. Models close to on/off processes arriving as Poisson r.v.'s are considered in Chapter 11. In Chapter 6 a queueing system with instantaneous arrivals is given.

In this chapter we will find the class of all self-similar processes with independent sessions arriving as Poisson r.v.'s. For the particular case of the Cox model, we will find asymptotic bounds for the buffer overflow probability. Compared with Chapter 9, Section 10.4 of Chapter 10, and Section 7.4 of Chapter 7, where asymptotic bounds are presented for a wide class of processes beyond the self-similar one, we will focus on the self-similar case (Pareto distribution of the active period). For this case we present some new bounds, which are more accurate than the best-known current bounds [3, 10].

This chapter is organized in the following way. First, we give the definition of second-order self-similar traffic and some well-known, but useful, relations between the variance, autocorrelation, and spectral density functions. This is followed by a construction of a class of second-order self-similar processes. Finally, the asymptotic queueing behavior for a particular form of the processes from our class is analyzed.

8.2 SECOND-ORDER SELF-SIMILAR PROCESSES

We consider a discrete-time stationary stochastic process

$$X = (\ldots, X_{-1}, X_0, X_1, \ldots)$$

with finite mean $\mu = EX_i$ and variance $\sigma^2 = \operatorname{Var} X_i$, together with its centered version $x = (\ldots, x_{-1}, x_0, x_1, \ldots)$, $x_i = X_i - \mu$, so that $Ex_i = 0$ and $\operatorname{Var} x_i = \sigma^2$. The autocorrelation function of the processes $x$ and $X$ is

$$r(k) = \frac{E\{(X_i - \mu)(X_{i+k} - \mu)\}}{\sigma^2} = \frac{E\{x_i x_{i+k}\}}{\sigma^2}, \qquad k = 0, 1, 2, \ldots .$$

Since the process $X$ is stationary, $r(k)$, $\operatorname{Var} X_i$, and $EX_i$ do not depend on $i$.


For $m = 1, 2, 3, \ldots$ define the aggregated processes

$$X^{(m)} = (\ldots, X^{(m)}_{-1}, X^{(m)}_0, X^{(m)}_1, \ldots), \qquad x^{(m)} = (\ldots, x^{(m)}_{-1}, x^{(m)}_0, x^{(m)}_1, \ldots),$$

where $X^{(m)}_i$ and $x^{(m)}_i$ are the sums of $X_j$ and $x_j$, respectively, over the $i$th block of $m$ consecutive time slots. The autocorrelation function of the aggregated processes is

$$r^{(m)}(k) = \frac{E\{(X^{(m)}_i - m\mu)(X^{(m)}_{i+k} - m\mu)\}}{\operatorname{Var} X^{(m)}_i} = \frac{E\{x^{(m)}_i x^{(m)}_{i+k}\}}{\operatorname{Var} x^{(m)}_i}.$$

For the variance of the block sum we have

$$\operatorname{Var} x^{(m)}_0 = \sigma^2 \sum_{i=1}^{m} \sum_{j=1}^{m} r(i-j) = \sigma^2 \Bigl[\, m + 2 \sum_{k=1}^{m-1} (m-k)\, r(k) \Bigr], \qquad (8.1)$$

and hence

$$\bigl(\operatorname{Var} x^{(m+1)}_0 - \operatorname{Var} x^{(m)}_0\bigr) - \bigl(\operatorname{Var} x^{(m)}_0 - \operatorname{Var} x^{(m-1)}_0\bigr) = 2\sigma^2 r(m).$$

If we define $\operatorname{Var} x^{(0)}_i = 0$, then

$$r(m) = \frac{1}{2\sigma^2}\bigl[\operatorname{Var} x^{(m+1)}_0 - 2\operatorname{Var} x^{(m)}_0 + \operatorname{Var} x^{(m-1)}_0\bigr]. \qquad (8.2)$$

The same equation for $r^{(m)}(k)$ is

$$r^{(m)}(k) = \frac{1}{2\operatorname{Var} x^{(m)}_0}\bigl[\operatorname{Var} x^{(m(k+1))}_0 - 2\operatorname{Var} x^{(mk)}_0 + \operatorname{Var} x^{(m(k-1))}_0\bigr]. \qquad (8.3)$$

Definition 8.2.1. A stationary process $X = (\ldots, X_{-1}, X_0, X_1, \ldots)$ with finite mean $\mu = EX_i < \infty$ and variance $\sigma^2 = \operatorname{Var} X_i < \infty$ is called exactly second-order self-similar with parameter $0 < \beta < 1$ if

$$r(k) = \tfrac{1}{2}\bigl[(k+1)^{2-\beta} - 2k^{2-\beta} + (k-1)^{2-\beta}\bigr] \qquad (8.6)$$

for all $k = 1, 2, \ldots$.

The parameter $\beta$ and the Hurst parameter $H$ are related as $H = 1 - \beta/2$, $\tfrac{1}{2} < H < 1$. Let us discuss the given definition. Substituting Eq. (8.6) into (8.1), it is easy to see that for a self-similar process

$$\operatorname{Var} x^{(m)}_0 = \sigma^2 m^{2-\beta}, \qquad m = 1, 2, \ldots . \qquad (8.7)$$

From Eq. (8.2) we can easily conclude that if $\operatorname{Var} x^{(m)}_0 = \sigma^2 m^{2-\beta}$, then the autocorrelation function $r(k)$ satisfies Eq. (8.6). This means that Eq. (8.7) is equivalent to Eq. (8.6) and can be used in the above definition instead of Eq. (8.6). Substituting Eq. (8.7) into (8.3) we get

$$r^{(m)}(k) = r(k), \qquad m = 1, 2, \ldots, \quad k = 1, 2, \ldots . \qquad (8.8)$$

Trang 5

Equation (8.8) clarifies the sense of the definition of a self-similar process: the original process $X$ and its aggregated process $X^{(m)}$ have the same correlation structure.
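The following minimal numerical sketch (not from the chapter; the parameter values are arbitrary) checks these reconstructed relations: with $r(k)$ taken from Eq. (8.6), the block-sum variance computed through Eq. (8.1) reproduces $\sigma^2 m^{2-\beta}$ of Eq. (8.7), which is exactly the scaling behind Eq. (8.8).

```python
# Check of the reconstructed relations: r(k) from Eq. (8.6) plugged into the
# block-sum variance of Eq. (8.1) reproduces sigma^2 * m**(2 - beta), Eq. (8.7).
beta, sigma2 = 0.6, 1.0          # illustrative values

def r(k):                        # Eq. (8.6)
    return 0.5 * ((k + 1) ** (2 - beta) - 2 * k ** (2 - beta) + (k - 1) ** (2 - beta))

def var_block(m):                # Eq. (8.1): sigma^2 * [m + 2 * sum_{k<m} (m - k) * r(k)]
    return sigma2 * (m + 2 * sum((m - k) * r(k) for k in range(1, m)))

for m in (1, 2, 10, 100):
    print(m, var_block(m), sigma2 * m ** (2 - beta))   # the two columns coincide
```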

The spectral density function of the self-similar process was found by Sinai [14]:

It is easy to see from Eq. (8.5) that, if the process $X$ has the spectral density function (8.9), its variance will be equal to (8.7). As $\omega \to 0$, from Eq. (8.9) we obtain

$$f(\omega) \approx \mathrm{const}\cdot \omega^{-(1-\beta)}.$$

We use $f(x) \approx g(x)$ in the sense that $\lim \bigl(f(x)/g(x)\bigr) = 1$. Now we conclude that an exactly second-order self-similar process can be defined via its autocorrelation function (8.6), the variance of the sum (8.7), or the spectral density function (8.9), and all these definitions are equivalent. Meanwhile, for the definition of an asymptotically second-order self-similar process, it is important to specify which kind of characteristic is to be used. We will use the following definition.

Definition 8.2.2. A stationary process $X = (\ldots, X_{-1}, X_0, X_1, \ldots)$ with finite mean $\mu = EX_i < \infty$ and variance $\sigma^2 = \operatorname{Var} X_i < \infty$ is called asymptotically second-order self-similar with parameter $0 < \beta < 1$ if

$$\lim_{m \to \infty} r^{(m)}(k) = \tfrac{1}{2}\bigl[(k+1)^{2-\beta} - 2k^{2-\beta} + (k-1)^{2-\beta}\bigr] \qquad (8.10)$$

for all $k = 1, 2, \ldots$.

The sense of this definition is that, for sufficiently large $m$, the processes $X^{(m)}$ will have the same autocorrelation function, equal to (8.6).

As we can see from Eqs. (8.1) and (8.5), $\operatorname{Var} X^{(m)}_0$ is a double integral of the autocorrelation function or an integral of the spectral density function. These relations can clarify the behavior of $r(k)$, $\operatorname{Var} X^{(m)}_0$, and $f(\omega)$ obtained from real traffic measurements. Usually, self-similarity of data traffic is established by analyzing the traffic variance $\operatorname{Var} X^{(m)}_0$ [5, 12], but the behavior of the variance $\operatorname{Var} X^{(m)}_0$ can be close to a self-similar one while the behavior of the autocorrelation is quite different from the theoretical one, since the autocorrelation function is a second derivative of the variance $\operatorname{Var} X^{(m)}_0$. It is also important that, as can be seen from Eq. (8.5), even small harmonics at low frequency can have a dramatic influence on the behavior of $\operatorname{Var} X^{(m)}_0$ over a wide range of $m$ values.
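Since the chapter notes that self-similarity is usually established by analyzing $\operatorname{Var} X^{(m)}_0$, a sketch of that aggregated-variance (variance-time) method is given below; the function name, the aggregation levels, and the i.i.d. test input are illustrative assumptions, not taken from the text.

```python
# Aggregated-variance (variance-time) sketch: for the block *average* of a
# stationary sequence, Var ~ m**(2H - 2), so H is estimated from the log-log
# slope of the aggregated variance against the aggregation level m.
import numpy as np

def hurst_aggregated_variance(x, levels=(1, 2, 4, 8, 16, 32, 64, 128)):
    variances = []
    for m in levels:
        n = (len(x) // m) * m
        variances.append(x[:n].reshape(-1, m).mean(axis=1).var())
    slope, _ = np.polyfit(np.log(levels), np.log(variances), 1)
    return 1.0 + slope / 2.0

rng = np.random.default_rng(1)
print(hurst_aggregated_variance(rng.normal(size=200_000)))   # about 0.5 for i.i.d. noise
```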

8.3 MODEL OF SELF-SIMILAR TRAFFIC

Consider a Poisson process on the time axis with intensity $\lambda$. Let

$$y = (\ldots, y_{-1}, y_0, y_1, \ldots),$$


where $y_t$ is the number of Poisson points in the interval $[t, t+1)$, $t = 0, \pm 1, \pm 2, \ldots$. The random variables $y_t$ are independent and identically distributed with $\Pr\{y_t = k\} = (\lambda^k / k!)\, e^{-\lambda}$.

Suppose that $y_t$ is the number of new active sources arriving to the system at moment $t$. To each source we assign a random variable $\tau_{t,i}$ (the length of the active period, $\tau_{t,i} \in \{1, 2, \ldots\}$, $t = 0, \pm 1, \pm 2, \ldots$, $i = 1, 2, \ldots$) and a random process $c_{t,i}(n)$ (the rate of cell generation during the active period at moment $n$ from the beginning of the period, $c_{t,i}(n) \in \mathbb{R}_+$, $n = 0, 1, 2, \ldots$). The random couples $(\tau_{t,i}, c_{t,i}(n))$ are independent and identically distributed and also independent of the process $y$. We define our process $Y = (\ldots, Y_{-1}, Y_0, Y_1, \ldots)$ as

$$Y_t \triangleq \sum_{k=-\infty}^{t} \sum_{j=1}^{y_k} c_{k,j}(t-k)\, I(\{\tau_{k,j} > t-k\}), \qquad (8.11)$$

where $I(A)$ is the indicator function of the event $A$.

If $M \to \infty$ so that $M/(u + E\tau_{t,i}) \to \lambda$ and $\Pr\{u < M\} \to 0$, then the obtained process will be statistically the same as the process $Y$.
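A simulation sketch of the construction in Eq. (8.11) is shown below. The distribution of the active period, the arrival intensity, and the constant rate $c \equiv 1$ are illustrative assumptions (they anticipate the special case studied later in the chapter), and sessions born before slot 0 are ignored, so the beginning of the trace is not stationary.

```python
# Sketch of the construction in Eq. (8.11): a Poisson(lam) number of sessions
# starts in every slot t; a session started at t contributes one cell per slot
# (c == 1) for tau slots.  The tau law and all parameter values are illustrative.
import numpy as np

def simulate_Y(T, lam=0.5, beta=0.6, tau_max=10_000, seed=0):
    rng = np.random.default_rng(seed)
    i = np.arange(1, tau_max + 1)
    p = i ** (-2.0 - beta)
    p /= p.sum()                              # Pr{tau = i}
    counts = rng.poisson(lam, size=T)         # y_t, new sessions per slot
    starts = np.repeat(np.arange(T), counts)
    taus = rng.choice(i, size=starts.size, p=p)
    Y = np.zeros(T)
    for s, tau in zip(starts, taus):
        Y[s:min(T, s + tau)] += 1.0           # constant rate c == 1 while active
    return Y

Y = simulate_Y(100_000)
print(Y[5_000:].mean(), Y[5_000:].var())      # both close to lam * E[tau], cf. below
```

Discarding the first slots of the trace reduces the bias caused by the missing pre-0 sessions; the printed mean and variance can be compared with the formulas derived next.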

Let us calculate the mean, variance, and autocorrelation function of the process $Y$. From Eq. (8.11), taking into account that the variance of a sum of independent random variables is equal to the sum of their variances, we have

Trang 7

$$= \sum_{m=0}^{\infty} \Pr\{y_0 = m\}\bigl[m \operatorname{Var}\kappa_{i,1} + m^2 (E\kappa_{i,1})^2 - 2 S m\, E\kappa_{i,1} + S^2\bigr] = \lambda \operatorname{Var}\kappa_{i,1} + \lambda (E\kappa_{i,1})^2 = \lambda E\kappa^2_{i,1},$$

where $\Pr\{y_0 = m\}$ is a Poisson distribution.

From Eqs. (8.12) and (8.13) we have

$$\sigma^2 = \operatorname{Var} Y_t = \lambda \sum_{i=1}^{\infty} E\kappa^2_{i,1} = \lambda \sum_{i=1}^{\infty} E\bigl[c^2_{0,1}(i-1)\, I(\{\tau_{0,1} > i-1\})\bigr].$$

Now we will calculate the autocorrelation function $r(k)$. For any given $t$ and $k$ define the following random variables:


We have

$$Y_t = z + Z_1, \qquad Y_{t+k} = w + Z_2. \qquad (8.15)$$

Since $w, z$ are independent random variables and also independent from $Z_1, Z_2$,

$$r(k) = \frac{1}{\sigma^2} E\bigl[(Y_t - EY_t)(Y_{t+k} - EY_{t+k})\bigr] = \frac{1}{\sigma^2} E\bigl[(Z_1 - EZ_1)(Z_2 - EZ_2)\bigr].$$

Using the independence of $Z_{1,j}, Z_{2,i}$ for $i \ne j$, we have

$$W_{1,j} \triangleq c_{i,j}(t-i)\, I(\{\tau_{i,j} > t+k-i\}), \qquad W_{2,j} \triangleq c_{i,j}(t+k-i)\, I(\{\tau_{i,j} > t+k-i\}).$$


Substituting Eq. (8.18) into (8.17) and taking into account that $y_0$ has a Poisson distribution, we get

$$E\bigl[(Z_{1,i} - EZ_{1,i})(Z_{2,i} - EZ_{2,i})\bigr] = \sum_{m=0}^{\infty} \Pr\{y_0 = m\}\bigl[\bigl((m-1)m - \lambda^2\bigr) EW_{1,1}\, EW_{2,1} + m\, E[W_{1,1} W_{2,1}]\bigr] = \lambda E[\kappa_{1,1} \kappa_{2,1}] = \lambda E\bigl[c_{i,1}(t-i)\, c_{i,1}(t+k-i)\, I(\{\tau_{i,1} > t+k-i\})\bigr].$$

Finally, using Eq. (8.16), we have

$$r(k) = \frac{\lambda}{\sigma^2} \sum_{i=0}^{\infty} E\bigl[c_{0,1}(i)\, c_{0,1}(i+k)\, I(\{\tau_{0,1} > i+k\})\bigr], \qquad (8.19)$$

$$\mu = EY_t = \lambda \sum_{i=0}^{\infty} E\bigl[c_{0,1}(i)\, I(\{\tau_{0,1} > i\})\bigr]. \qquad (8.20)$$
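As a quick sanity check of the reconstructed expressions (8.19) and (8.20), the snippet below evaluates them in the constant-rate case $c \equiv 1$, where both reduce to sums of the tail probabilities $\Pr\{\tau_{0,1} > j\}$; the tail law, the truncation point, and the intensity are illustrative choices.

```python
# Constant-rate case c == 1: then
#   mu      = lam * sum_j Pr{tau > j}                 (Eq. (8.20)),
#   sigma^2 = lam * sum_j Pr{tau > j}                 (variance formula above),
#   r(k)    = (lam / sigma^2) * sum_{i>=0} Pr{tau > i + k}   (Eq. (8.19)).
# The discrete power-law tail used for tau is only an example.
import numpy as np

lam, beta, tau_max = 0.5, 0.6, 200_000
i = np.arange(1, tau_max + 1)
p = i ** (-2.0 - beta)
p /= p.sum()                                          # Pr{tau = i}
surv = np.concatenate(([1.0], 1.0 - np.cumsum(p)))    # surv[j] = Pr{tau > j}, j = 0, 1, ...

mu = lam * surv.sum()
sigma2 = lam * surv.sum()        # Y_t counts active sessions, so mean and variance coincide

def r(k):
    return lam * surv[k:].sum() / sigma2

print(mu, sigma2, [round(r(k), 4) for k in (1, 10, 100)])
```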

A self-similar process $Y$ can be obtained from the following theorem.

Theorem 8.3.1. Process $Y = (\ldots, Y_{-1}, Y_0, Y_1, \ldots)$ defined by Eq. (8.11) with finite mean $\mu = EY_t < \infty$ and variance $\sigma^2 = \operatorname{Var} Y_t < \infty$ is exactly second-order self-similar with parameter $0 < \beta < 1$ if

$$\sum_{i=0}^{\infty} E\bigl[c_{0,1}(i)\, c_{0,1}(i+k)\, I(\{\tau_{0,1} > i+k\})\bigr] = \frac{\sigma^2}{2\lambda}\bigl[(k+1)^{2-\beta} - 2k^{2-\beta} + (k-1)^{2-\beta}\bigr] \qquad (8.21)$$

for all $k = 1, 2, \ldots$.

Proof. This theorem directly follows from the definition of a self-similar process (8.6) and the expression for the autocorrelation function (8.19). ∎

Corollary 8.3.2. If the random process $c_{k,j}(i)$ is a constant one, $c_{k,j}(i) \equiv c_{k,j}$ for all $i = 0, 1, 2, \ldots$, then a process $Y = (\ldots, Y_{-1}, Y_0, Y_1, \ldots)$ defined by Eq. (8.11), with finite mean $\mu = EY_t < \infty$ and variance $\sigma^2 = \operatorname{Var} Y_t < \infty$, will be exactly second-order self-similar with parameter $0 < \beta < 1$ if

$$\Pr\{\tau_{0,1} > k\}\, E\bigl[c^2_{0,1} \mid \tau_{0,1} > k\bigr] = \frac{\sigma^2}{2\lambda}\, D^3\bigl((k-1)^{2-\beta}\bigr) \triangleq \frac{\sigma^2}{2\lambda}\bigl[3(k+1)^{2-\beta} - (k+2)^{2-\beta} - 3k^{2-\beta} + (k-1)^{2-\beta}\bigr] \qquad (8.22)$$


for all $k = 0, 1, 2, \ldots$, where, for convenience, we use the conventions $(-1)^{2-\beta} \triangleq 1$ and $0^{2-\beta} \triangleq 0$.
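The reconstruction of Eqs. (8.21) and (8.22) above can be checked numerically: writing $F(k)$ for the second-difference bracket of Eq. (8.6) and $h(k) = \Pr\{\tau_{0,1} > k\}\,E[c^2_{0,1} \mid \tau_{0,1} > k]$ for the quantity prescribed by Eq. (8.22), the sum $\sum_{i\ge 0} h(i+k)$ telescopes back to $(\sigma^2/2\lambda)F(k)$, which is condition (8.21). The sketch below (arbitrary parameter values) does exactly this.

```python
# Consistency check for the reconstructed Corollary 8.3.2:
#   F(k) = (k+1)**(2-b) - 2*k**(2-b) + (k-1)**(2-b)   (Eq. (8.6) bracket),
#   h(k) = (sigma^2 / (2*lam)) * (F(k) - F(k+1))       (Eq. (8.22) as reconstructed),
# and sum_{i>=0} h(i + k) telescopes to (sigma^2 / (2*lam)) * F(k), i.e. Eq. (8.21).
beta, sigma2, lam = 0.6, 1.0, 0.5

def pw(x):                 # x**(2 - beta) with (-1)**(2-beta) := 1 and 0**(2-beta) := 0
    if x == -1:
        return 1.0
    return 0.0 if x == 0 else x ** (2 - beta)

def F(k):
    return pw(k + 1) - 2 * pw(k) + pw(k - 1)

def h(k):
    return sigma2 / (2 * lam) * (F(k) - F(k + 1))

for k in (0, 1, 5, 20):
    lhs = sum(h(i + k) for i in range(100_000))
    print(k, lhs, sigma2 / (2 * lam) * F(k))   # columns agree up to truncation of the sum
```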

Proof. Since $c_{0,1}(i)$ does not depend on $i$, we can write

$$E\bigl[c_{0,1}(i)\, c_{0,1}(i+k)\, I(\{\tau_{0,1} > i+k\})\bigr] = \Pr\{\tau_{0,1} > i+k\}\, E\bigl[c^2_{0,1} \mid \tau_{0,1} > i+k\bigr].$$

Substituting this into Eq. (8.21) and subtracting Eqs. (8.21) for $k$ and $k+1$, we obtain Eq. (8.22). ∎

$$r_c(k) \triangleq E\bigl[(c_{0,1}(i) - m_c)(c_{0,1}(i+k) - m_c)\bigr], \qquad m_c = E c_{0,1}(i).$$


In the case where the random process $c_{k,j}(i) \equiv 1$, equations similar to (8.25) were found by Cox [2].

Now we give some results for an asymptotically self-similar process $Y$.

Theorem 8.3.4. Process $Y = (\ldots, Y_{-1}, Y_0, Y_1, \ldots)$ defined by Eq. (8.11), with finite mean $\mu = EY_t < \infty$ and variance $\sigma^2 = \operatorname{Var} Y_t < \infty$, is asymptotically second-order self-similar with parameter $0 < \beta < 1$ if

where $c_1$ and $c_2$ are some positive constants and $0 < \beta < 1$.

Substituting Eq. (8.27) into (8.3) we get

If the random process $c_{k,j}(i)$ is a constant one, $c_{k,j}(i) \equiv c_{k,j}$ for all $i = 0, 1, 2, \ldots$, then a process $Y = (\ldots, Y_{-1}, Y_0, Y_1, \ldots)$ defined by Eq. (8.11), with finite mean $\mu = EY_t < \infty$ and variance $\sigma^2 = \operatorname{Var} Y_t < \infty$, will be asymptotically second-order self-similar with parameter $0 < \beta < 1$ if

$$\Pr\{\tau_{0,1} > k\}\, E\bigl[c^2_{0,1} \mid \tau_{0,1} > k\bigr] \approx \mathrm{const}\cdot k^{-1-\beta}, \quad \text{as } k \to \infty, \qquad (8.28)$$

or

$$\Pr\{\tau_{0,1} = k\}\, E\bigl[c^2_{0,1} \mid \tau_{0,1} = k\bigr] \approx \mathrm{const}\cdot k^{-2-\beta}, \quad \text{as } k \to \infty. \qquad (8.29)$$


Proof. Since $c_{0,1}(i)$ does not depend on $i$, from Eq. (8.19) we have

$$r(k) = \frac{\lambda}{\sigma^2} \sum_{i=0}^{\infty} \Pr\{\tau_{0,1} > i+k\}\, E\bigl[c^2_{0,1} \mid \tau_{0,1} > i+k\bigr].$$

Substituting Eq. (8.28) into the above equation, we obtain

$$r(k) \approx \mathrm{const}\cdot \sum_{i=0}^{\infty} (i+k)^{-1-\beta} \approx \mathrm{const}\cdot k^{-\beta}, \quad \text{as } k \to \infty.$$

Then from Theorem 8.3.4 it immediately follows that $Y$ is an asymptotically second-order self-similar process. Statement (8.29) can be proved in the same way. ∎
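The sketch below illustrates this result numerically for the constant-rate case: with a survival sequence $\Pr\{\tau > j\} \sim j^{-1-\beta}$, the unnormalized autocorrelation $\sum_{i \ge 0}\Pr\{\tau > i+k\}$ (proportional to $r(k)$ by Eq. (8.19)) decays with a log-log slope close to $-\beta$. The tail law, lag grid, and truncation point are illustrative assumptions.

```python
# Numerical illustration for constant rate c == 1: the sum over i of
# Pr{tau > i + k}, with Pr{tau > j} ~ j**(-1 - beta), decays like k**(-beta).
import numpy as np

beta, tau_max = 0.6, 500_000
surv = np.empty(tau_max)
surv[0] = 1.0
surv[1:] = np.arange(1, tau_max) ** (-1.0 - beta)    # Pr{tau > j} for j >= 1

def unnormalized_r(k):
    return surv[k:].sum()

lags = np.array([10, 100, 1000, 10_000])
slope, _ = np.polyfit(np.log(lags), np.log([unnormalized_r(k) for k in lags]), 1)
print(slope, -beta)                                  # slope is close to -beta
```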

8.4 ASYMPTOTICAL BOUNDS FOR BUFFER OVERFLOW PROBABILITY

In this section we consider the process $Y$ defined by Eq. (8.11) as the input traffic of a single-server queueing system with constant server rate equal to $C$ and infinite buffer size. Suppose the process $Y$ has finite mean $\mu = EY_t < C < \infty$ and finite variance $\sigma^2 = \operatorname{Var} Y_t < \infty$. We will consider a particular form of the process $Y$; namely, the case when the random process $c_{i,j}(t) \equiv 1$. Let

$$\Pr\{\tau_{0,1} = i\} \approx c_0\, i^{-2-\beta} \qquad (8.30)$$

as $i \to \infty$, with $0 < \beta < 1$. Then, according to Eq. (8.29) with $c_{0,1} \equiv 1$, the process $Y$ will be asymptotically second-order self-similar with Hurst parameter $H = 1 - \beta/2$. Now we are interested in the queue-length behavior. Let $\nu_t$ be the length of the queue at moment $t$. Then we have

$$\nu_t = \max(0, \nu_{t-1} + Y_t - C). \qquad (8.31)$$

We will estimate the probability $\Pr\{\nu_t > z\}$, that is, the stationary probability of finding, at moment $t$, a queue length bigger than $z$, for large values of $z$.
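A simulation sketch of the recursion (8.31) is given below: it generates the constant-rate, Pareto-tailed session input of Eq. (8.30), runs the queue at a fixed service rate $C$, and reports empirical estimates of $\Pr\{\nu > z\}$ for a few thresholds. All numerical values (λ, β, $C$, the thresholds, the truncation of τ) are illustrative assumptions, and the finite run length makes these only rough estimates of the stationary tail.

```python
# Sketch: estimate Pr{nu > z} by simulating the session input and the queue
# recursion nu_t = max(0, nu_{t-1} + Y_t - C) of Eq. (8.31).
import numpy as np

rng = np.random.default_rng(2)
T, lam, beta, C = 200_000, 0.5, 0.6, 1.2          # illustrative parameters
i = np.arange(1, 100_001)
p = i ** (-2.0 - beta)
p /= p.sum()                                      # Pr{tau = i}, cf. Eq. (8.30)

counts = rng.poisson(lam, size=T)                 # y_t: new sessions per slot
starts = np.repeat(np.arange(T), counts)
taus = rng.choice(i, size=starts.size, p=p)       # active periods
Y = np.zeros(T)
for s, tau in zip(starts, taus):                  # each session adds rate 1 while active
    Y[s:min(T, s + tau)] += 1.0

thresholds = (5, 20, 50)
exceed = dict.fromkeys(thresholds, 0)
nu = 0.0
for t in range(T):
    nu = max(0.0, nu + Y[t] - C)                  # Eq. (8.31)
    for z in thresholds:
        if nu > z:
            exceed[z] += 1
print({z: exceed[z] / T for z in thresholds})     # rough estimates of Pr{nu > z}
```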

For any given $z$, let us split the process $Y_t$ into two processes $Y^{(1)}_t$ and $Y^{(2)}_t$, that is,

$$Y_t = Y^{(1)}_t + Y^{(2)}_t,$$


according to the following rule:

First, we derive an upper bound for the probability $\Pr\{\nu_t > z\}$.


Proof. Let us estimate the probability

$$\sum_{i=1} \tau_{j,i}\, I(\tau_{j,i} \le \varepsilon z), \qquad \tilde{k} \triangleq k + \varepsilon z.$$

The random variables $q_j$ are independent and identically distributed. Denote by $\phi(s)$ the logarithm of the moment-generating function of the sum (8.36). Clearly,


As we can see from Eq. (8.20), $\mu = \sum_{j=1}^{\infty} \lambda_j\, j$ (with $\lambda_j = \lambda \Pr\{\tau_{0,1} = j\}$), so that $\sum_{j=1}^{\varepsilon z} \lambda_j\, j - \mu < 0$ and, hence,

$$\sum_{j=1} \lambda_j (sj)^2 e^{sj} \le P_1 + P_2,$$

where


where we used that $\varepsilon c_1 < \beta$. Substituting this into Eq. (8.38), we have, for sufficiently large $z$,

$$P_u \le e^{-k/z}\, z^{-\tilde{\delta}_2 - c - 1}.$$

Finally, we get

$$\Pr\Bigl\{\max_{k \ge 1}\bigl\{S^{(2)}_k - (C - \mu - \delta_1)k\bigr\} > z(1-\delta_2)\Bigr\} = \Pr\{\nu^{(2)}_t > z(1-\delta_2)\},$$

where

where $m = 1 + \lfloor \tilde{C} \rfloor$, $\tilde{C} = C - \mu - \delta_1$, $\lfloor \tilde{C} \rfloor$ is the integer part of $\tilde{C}$, and $a = \tilde{C} - \lfloor \tilde{C} \rfloor$. We will consider only the case when $\tilde{C} \ne \lfloor \tilde{C} \rfloor$, $a > 0$. For any given realization of the process $Y$ we have $\nu^{(2)}_t \le \hat{\nu}^{(2)}_t$. It means that $\Pr\{\nu^{(2)}_t > z(1-\delta_2)\} \le \Pr\{\hat{\nu}^{(2)}_t > z(1-\delta_2)\}$. For the given moment $t$, define the moments


To analyze the probability $\Pr\{\hat{\nu}^{(2)}_t > \tilde{z}\}$, $\tilde{z} \triangleq z(1-\delta_2)$, we can write

$$\Pr\{\hat{\nu}^{(2)}_t > \tilde{z}\} = \Pr\{B_1 \cup B_2,\ \hat{\nu}^{(2)}_t > \tilde{z}\} \le \Pr\{B_2\} + \Pr\{B_1 \setminus B_0\} + \Pr\{\hat{\nu}^{(2)}_t > \tilde{z},\ B_0\}.$$

First, let us estimate the probability $\Pr\{B_2\}$. If $\lambda E[\tau_{0,1}\, I(\tau_{0,1} > \varepsilon z)] < 1$, it can be shown that

as $z \to \infty$. Applying this to Eq. (8.39) we obtain

$$\Pr\{B_1\} \le \mathrm{const}\cdot z^{-\beta m} + o(z^{-\beta m}), \qquad \Pr\{B_2\} \le \mathrm{const}\cdot z^{-\beta(m+1)} + o\bigl(z^{-\beta(m+1)}\bigr).$$

Next, we estimate the probability

$$\Pr\{B_1 \setminus B_0\} \le \mathrm{const}\cdot z^{-\beta m - \beta - 2} + o\bigl(z^{-\beta m - \beta - 2}\bigr), \quad \text{as } z \to \infty.$$
