Networking Theory and Fundamentals, Lecture 1


Title: Delay Models in the Network Layer
Author: Santosh S. Venkatesh
Institution: University of Pennsylvania
Subject: Networking
Type: Class Notes
Year: 1997
City: Philadelphia


Delay Models in the Network Layer

It is important in data communication settings to be able to characterise the delay that individual packets are subjected to in their sojourn through the system. Delays can be traced to four main causes.

• Processing delay: This is the time it takes to process a frame at each subnet node and prepare it for retransmission. These delays are determined by the complexity of the protocol and the computational power available at each subnet node.

• Propagation delay: This is the time it actually takes a frame to propagate through the communication link. These delays are dictated by the distance or length of the communication pathway. They can be significant, for instance, in satellite links and in high-speed links where the propagation delay can be a significant fraction of the overall delay.

• Transmission delay: This is the time it takes to transmit an entire frame, from first bit to last bit, into a communication link. These delays are dictated primarily by link speed; for example, a 9600 bps link occasions only half the delay of a 4800 bps link.

• Queueing delay: This is the time a frame has to wait in a queue for service. These delays are occasioned by congestion at the subnet nodes.

Propagation delays are determined by the physical channels and are independent of the actual traffic patterns in the link; likewise, processing delays are determined by the available hardware and again are not affected by traffic. We hence focus on the transmission and queueing delays endemic at any subnet node. The discussion ignores the possibility of retransmissions, which of course add to the overall delay. In practice, however, retransmissions are rare in many data networks, so that this assumption may safely be made.¹


[Figure: a subnet node with packets arriving asynchronously and departing asynchronously.]

Queues are intrinsic to packet-switched networks. In a generic packet-switching network, frames arrive asynchronously at subnet nodes in the communication subnet and are processed and released according to some service protocol, for instance, a first-in, first-out (FIFO) or first-come, first-served (FCFS) protocol. Typically, a subnet node cannot handle all the traffic entering it simultaneously, so that frames arriving at the node are buffered to await their turn for transmission. In queueing parlance, the frames are "customers" awaiting "service" from the subnet node, which is hence the "server."

The overall queueing delay for a frame (customer) is determined generally by the congestion at the subnet node (server), which is governed by packet arrival statistics and the service discipline in force.

[Figure: single-server and multi-server queues, with arriving packets buffered before service and departing after.]

More specifically, node congestion is influenced by the following factors:

• Arrival statistics embodied in the distribution of customer interarrival times τ. (The arrival rate λ = 1/E(τ) will play a critical rôle in the sequel.)

• Buffer size m—for instance, finite buffer systems (m < ∞) in which customers are turned away if the queue is full, and infinite buffer systems (m = ∞) in which customers are always accepted into the system.

¹ Multi-access networks are the exception which proves the rule: retransmissions are the rule rather than the exception in multi-access settings.


• Number of servers n—for instance, single-server systems (n = 1), multi-server systems (n ≥ 2), and infinite-server systems (n = ∞).

• Service statistics embodied in the distribution of customer service times X. (The mean service time E(X) = 1/µ, in particular, will prove useful in the sequel.)

Together, these determine the delay faced by each customer.

A succinct notation has evolved to describe the various factors that affect the congestion and delays in a queueing environment. A generic queueing environment is described in the form A/B/c/d, where the first two descriptors A and B connote the arrival and service statistics, respectively, and the second two descriptors c and d (if present) connote the number of servers and the system capacity (buffer size or maximal number of customers in the system), respectively. The descriptors A and B are specified by the arrival and service statistics of the queueing discipline and may be specified, for instance, as M (exponential or memoryless distribution), E (Erlangian distribution), H (hyperexponential distribution), D (deterministic), or G (general distribution), to mention a few possibilities. The descriptors c and d take on positive integer values, allowing infinity as a possible value to admit limiting systems with an infinite number of servers and/or infinite storage capacity, which are of theoretical interest.

Little’s Theorem

Consider a queueing environment which, after initial transients have died down, is operating in a stable steady state.

[Figure: a (closed) queueing environment in steady state, with customer arrivals at rate λ, N customers in the system, a time T per customer, and customer departures.]

The key parameters characterising the system are:

• λ—the mean, steady state customer arrival rate.

• N—the average number of customers in the system (both in the buffer and in service).

• T—the mean time spent by each customer in the system (time spent in the queue together with the service time).

It is tempting and intuitive to hazard the guess

N = λT.

This indeed is the content of Little's theorem, which holds very generally for a very wide range of service disciplines and arrival statistics.

To motivate the result, consider a general, stable queueing environment which has customers arriving in accordance with some underlying arrival statistics and departing as regulated by some given service discipline.

Arrival, Departure Processes. At any instant of time t, let A(t) and D(t) denote the number of arrivals and departures, respectively, in the time interval [0, t]. The random processes A(t) and D(t), called the arrival and departure process, respectively, are governed by the probability distributions underlying customer arrivals and service provision, and provide a comprehensive instantaneous description of the state of the system at any time t.

[Figure: sample functions of A(t) and D(t); N(t) = A(t) − D(t) is the number of customers in the system at time t. The sample functions are càdlàg (continue à droite, limite à gauche) step functions.]

The adjoining figure illustrates sample functions of the arrival and departure processes. Observe that the sample functions of the processes A(t) and D(t) are step functions with jumps of unit height. Indeed, both processes are counting processes, with the jump points indicating an arrival or departure. Observe further that the sample functions of the departure process lag the corresponding sample functions of the arrival process. The random process N(t) = A(t) − D(t) then represents the instantaneous number of customers present in the queueing environment (both in-queue and in-service) at time t.

Example 1. Transmission line.

Consider a transmission line system with packets arriving every K seconds for transmission.

[Figure: a transmission line with arriving and departing packets.]


Suppose that each packet requires a transmission time of aK seconds (a < 1) and that the propagation time for each packet is bK seconds. Viewing the transmission line as a server and packets as customers, customers arrive at a fixed rate λ = 1/K packets/second. Each customer spends a fixed amount of time in the system, T = (a + b)K. The number of packets that enter the line before a given packet departs is hence N = T/K = a + b. This is the average number of packets in the line in the steady state. The arrival process A(t) is a fixed time function which has regular unit jumps every K seconds; the departure process D(t) is also a fixed time function with jump points regulated by the value a + b. (Two cases arise, 0 < a + b < 1 and 1 < a + b < 2.)

What kind of queueing discipline is this? Observe that the "customers" (i.e., packets) arrive at a fixed rate λ = 1/K and promptly go into service (i.e., have transmission initiated) as soon as they arrive, regardless of how many packets are already in the line.

Time Averages. For definiteness, consider a FIFO system where customers are served sequentially in order of arrival. Typical sample functions for the arrival and departure process in such a system are shown alongside. The time average (up to t) of the instantaneous number of customers in the closed queueing environment under consideration is given by

$$\hat N(t) = \frac{1}{t}\int_0^t N(\tau)\,d\tau.$$

Now pick any instant t at which the system has just become empty, so that N(t) = A(t) − D(t) = 0. Let Ti denote the time spent in the system by the ith customer, as shown in the above figure for a FIFO system. Now observe


via the figure that the area under the curve (up to t) of the instantaneous number of customers in the system is identically equal to the area between the arrival and departure sample functions which, in turn, is comprised of A(t) rectangular blocks of unit height and widths T1, ..., TA(t). More formally,

$$\int_0^t N(\tau)\,d\tau = \int_0^t \bigl[A(\tau) - D(\tau)\bigr]\,d\tau = \sum_{i=1}^{A(t)} T_i.$$

Consequently, we obtain

$$\hat N(t) = \frac{1}{t}\sum_{i=1}^{A(t)} T_i = \left(\frac{A(t)}{t}\right)\left(\frac{1}{A(t)}\sum_{i=1}^{A(t)} T_i\right).$$

The first quantity on the right-hand side may be identified with the average arrival rate of customers up to time t:

$$\hat\lambda(t) = \frac{A(t)}{t}.$$

Likewise, the second term on the right-hand side may be identified with the average time spent by the first A(t) customers in the system:

$$\hat T(t) = \frac{1}{A(t)}\sum_{i=1}^{A(t)} T_i.$$

Thus, we obtain

$$\hat N(t) = \hat\lambda(t)\,\hat T(t).$$

Now suppose that, as t → ∞, the various time averages tend to fixed steady state values: N̂(t) → N, λ̂(t) → λ, and T̂(t) → T. We then obtain Little's formula

N = λT,

where we interpret N as the steady state, average number of customers in the system, λ as the steady state arrival rate of customers into the system, and T as the steady state, average time spent in the system by each customer.


It may be remarked that the formulation, while derived for a FIFO system, is actually very general and in fact applies to a very wide spectrum of queueing environments and service disciplines.

Ensemble Averages Little’s theorem extends quite naturally to situations

where the arrival and departure processes,A(t) and D(t), are specified by

some underlying probability law Indeed, suppose that at timet the number

of customersN(t) in the system has distribution πn(t) Then, the expected

number of customers in the system at timet is given by

Now, for a large number of systems, the distribution of customers tends to

a steady state or stationary distributionπn(t)→ πn ast→ ∞ Then,

¯N(t)→ ¯N =

∞Xn=0

A(t)

t → ¯λ (t→ ∞)

A fundamental result known as the ergodic theorem allows us to relate these steady state ensemble averages to the time averages obtained from individual sample functions: for a very wide range of situations, including most situations encountered in practice, N = N̄, T = T̄, and λ = λ̄ with probability one. Thus, we can henceforth apply Little's theorem with confidence to any closed queueing environment, where we interpret the various quantities as steady state, long-term averages.

Applications of Little’s Theorem

The power of Little’s theorem is in its very wide applicability though care

should be taken to make sure that it is applied in the context of a

sta-ble, closed queueing environment where the mean arrival rate of customers

matches the mean departure rate and the system is operating in the steady

state Some examples may serve to fix the result


Example 2. Transmission line, reprise.

Suppose, as before, that packets arrive at regular intervals of K seconds to a transmission line. As before, the transmission time for each packet is aK (where 0 < a < 1) and the propagation delay is bK (for some positive b).

[Figure: a transmission line with arriving and departing packets.]

The queueing environment consists of a single server (the transmission line). Packets arrive at a rate λ = 1/K packets/second, with each packet staying in the system a total of T = aK + bK seconds. Little's theorem hence shows that the steady-state average number of packets in the system is N = λT = a + b. Recall that the actual number of packets in the system is a periodically varying, deterministic function, so that the number of customers in the system never converges, even in the limit of very large time, to the constant N (cf. Example 1). The long-term average number of customers in the system, however, tends to N.
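The deterministic setting of this example can be checked mechanically. The sketch below (parameter values are ours, chosen for illustration) builds the arrival and departure epochs, integrates N(t) event by event, and recovers a long-run time average close to a + b:

```python
def average_occupancy(K=1.0, a=0.6, b=0.9, n_packets=2000):
    """Time-average number of packets in the line of Example 2:
    arrivals every K seconds, each packet in flight for (a + b)*K seconds."""
    arrivals = [i * K for i in range(n_packets)]
    departures = [t + (a + b) * K for t in arrivals]
    events = sorted([(t, +1) for t in arrivals] + [(t, -1) for t in departures])
    area, n_cur, t_prev = 0.0, 0, 0.0
    for t, step in events:
        area += n_cur * (t - t_prev)     # accumulate N(t) over each interval
        n_cur, t_prev = n_cur + step, t
    return area / t_prev                 # long-run average, here close to 1.5
```

With a = 0.6 and b = 0.9 the time average approaches a + b = 1.5 as the number of packets grows, even though N(t) itself keeps oscillating.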

Example 3. Access-controlled highway entry.

If, in the previous transmission line example, we view the operations of transmission and propagation in reverse order, we obtain a related service discipline.

[Figure: cars arrive at a highway access road at rate λ, wait bK seconds at a metering light, then take aK seconds to merge.]

In this alternative setting, customers arrive periodically every K seconds, i.e., the customer arrival rate is λ = 1/K; they wait in queue for bK seconds, and are serviced and released in aK seconds. We may identify this queueing environment with an (idealised) access-controlled highway entry system where traffic is permitted to enter a highway access road at a fixed rate of λ = 1/K cars per second. The cars wait in a queue of fixed size (fixed buffer size) and are released periodically, one at a time, by a stop-light metering system; the wait time in queue is bK seconds. Finally, on release, the car at the head of the queue moves at a fixed rate over the access road to merge with traffic in the highway; the time taken to traverse this segment is aK seconds.

Observe that Little’s theorem applied to the queueing segment aloneshows that the average number of customers waiting in queue isbK/K = b,while Little’s theorem applied to the service segment yields the (fractional)average number of customers being serviced at any instantaK/K = a Ap-plying Little’s theorem to the entire system consisting of queue and server

Trang 9

yields the average number of customers in the system N = (aK + bK)/K =

a + b which is the sum of the number of customers in the two segments of

the service discipline, as it should be

Observe that, while the physical details are rather different, macroscopically the two service disciplines are equivalent—at least in the average-sense, long-term world view of Little's theorem.

Example 4. Airline counters.

Consider a closed queueing environment in which customers entering the system wait in a single queue for service from n service agents. The customer at the head of the queue proceeds for service to the first available server, who immediately begins service for the new customer as soon as the previous customer has departed. The average service time for a customer is X̄.

[Figure: a single queue of customers feeding n service agents.]

Suppose that the environment has a finite capacity and that, at any given moment, there are no more than N customers in the system. (For instance, we may consider an idealised airport counter type of environment in a room with finite capacity, or a fixed-length serpentine queue with a fixed winding roped-off border.) Suppose additionally that demand is such that any departing customer is instantly replaced by another customer. (All holiday airfares are advertised at half-price.) What is the average time a customer can expect to spend in the system once he enters it?

Let λ denote the steady state mean arrival rate of customers into the system and T the average time spent in the system by each customer. Little's theorem applied to the entire system (queueing environment A) yields N = λT, while an application of the theorem to the subsystem consisting only of the n servers (queueing environment B) yields n = λX̄. (Observe that, in steady state, the arrival rate to the system of servers must be exactly the same as the arrival rate into the system, else instability results.) It follows that the average time spent in the system by each customer is

$$T = \frac{N}{\lambda} = \frac{N\bar X}{n}. \tag{∗}$$

Intuitive and satisfactory. Indeed, the form of the result is quite suggestive.
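The result can be spot-checked in simulation. The sketch below assumes exponentially distributed service times (an assumption of ours; the derivation above needs only the mean X̄) and an always-full system with, for illustration, N = 12 customers and n = 3 agents, so the formula predicts T = N X̄/n = 8 when X̄ = 2:

```python
import heapq
import random

def avg_sojourn(N=12, n=3, xbar=2.0, n_customers=20000, seed=7):
    """Always-full single-queue system with n agents: each departing
    customer is instantly replaced at the tail of the common queue.
    Returns the measured mean time in system per served customer."""
    random.seed(seed)
    # n servers start busy; the heap holds their departure epochs
    busy = [random.expovariate(1 / xbar) for _ in range(n)]
    heapq.heapify(busy)
    queue = [0.0] * (N - n)          # arrival epochs of waiting customers
    total = 0.0
    for _ in range(n_customers):
        t = heapq.heappop(busy)      # next service completion
        arrived = queue.pop(0)       # head of queue enters service at t
        queue.append(t)              # replacement customer joins the tail
        finish = t + random.expovariate(1 / xbar)
        heapq.heappush(busy, finish)
        total += finish - arrived    # sojourn of the customer just scheduled
    return total / n_customers
```

With completions occurring at rate n/X̄, a fresh customer waits through the N − n queue positions and then receives service, and the measured average settles near N X̄/n.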

Consider the following variation.

Probabilistic airline counters. In a probabilistic variation, consider, as before, a finite capacity system which can accommodate at most N customers at a time, with service being provided by n servers. The average service time per customer is X̄. Suppose that there is constant demand, so that the system is always full, with each departing customer being promptly replaced by an arriving customer. In this version of the airport counter problem, each server has her own queue of customers. A customer entering the system joins a queue uniformly at random and awaits his turn for service. What is the average time spent in the system by a customer?

[Figure: n servers, each with her own dedicated queue of customers.]

Consider the ith subsystem consisting of the ith server with her associated queue, and let Ni, λi, and Ti denote the mean number of customers in the ith subsystem, the arrival rate into the subsystem, and the mean time spent in the subsystem by a customer, respectively. Then, Little's theorem applied to the subsystem yields Ti = Ni/λi. To determine the mean arrival rate into the ith subsystem, consider the queueing environment consisting of the ith server alone. In the steady state, suppose that the server is busy a fraction ρi of the time or, equivalently, is idle for a fraction 1 − ρi of the time. Little's theorem then yields ρi = λiX̄. The quantity ρi connotes the mean utilisation factor for server i in the steady state. How then does one go about estimating the mean utilisation factor? The ergodic theorem tells us that, with probability one in the steady state, the fraction of time that a server is idle is identically the instantaneous probability (at an arbitrary point in time in the steady state) that the server is idle, i.e., has no customers in service. Write π0 for the steady state instantaneous probability that the ith server is idle. It follows that ρi = 1 − π0, so that Ti = NiX̄/(1 − π0).


… to the mean time that a server is idle. Contrariwise, in the original system leading to (∗), all servers are constantly busy. The gentle reader should be able to make the correspondence between these two systems and statistical multiplexing (wherein system resources are maximally utilised) and time- or frequency-division multiplexing (wherein resources are allocated ahead of time).

Example 5. Supermarket counters.

Consider an n server system in which each server has a dedicated queue of customers to service.

Little's theorem applied to the ith subsystem now yields Ni = λiTi. Now, let N = Σi Ni, λ = Σi λi, and T denote, respectively, the mean number of customers in the entire system, the total customer arrival rate into the system, and the average time spent by a customer in the system. Little's theorem applied to the whole system hence yields

$$T = \frac{N}{\lambda} = \sum_{i=1}^{n}\frac{\lambda_i}{\sum_{j=1}^{n}\lambda_j}\,T_i.$$

Probabilistic interpretation. The expression for the average customer delay has an intuitive probabilistic interpretation. Suppose each customer randomly selects a server queue, picking the ith server queue with probability pi = λi/Σj λj. Then the average customer waiting time at the counter before completion of service is

$$T = \sum_{i=1}^{n} p_i T_i = \sum_{i=1}^{n}\frac{\lambda_i}{\sum_{j=1}^{n}\lambda_j}\,T_i,$$

which is the same result obtained before.
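In code, the weighted-average delay is a one-liner; the helper below (function and variable names are ours) also illustrates the consistency with T = N/λ via Ni = λiTi:

```python
def mean_delay(lams, Ts):
    """Overall mean delay: T = sum_i (lam_i / sum_j lam_j) * T_i."""
    total_rate = sum(lams)
    return sum(l / total_rate * t for l, t in zip(lams, Ts))

# Example: three counters with arrival rates 1, 2, 3 and delays 4, 2.5, 1.
lams, Ts = [1.0, 2.0, 3.0], [4.0, 2.5, 1.0]
N = sum(l * t for l, t in zip(lams, Ts))   # N = sum_i N_i with N_i = lam_i * T_i
# mean_delay(lams, Ts) agrees with N / sum(lams)
```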

Example 6. Transmission line polling system.

Consider a polling system in which m users time-share a transmission line. A switching mechanism is put into place whereby the line first transmits some packets from user 1, then proceeds to transmit some packets from user 2, and continues likewise until it transmits some packets from user m. The line then returns to user 1 in round-robin fashion and repeats the process.

[Figure: m users time-sharing a line in round-robin fashion, with an overhead period for each user.]

Suppose user i has packets arriving at a rate λi, each packet demanding an average transmission time X̄i. In addition, suppose that when the line switches to user i, an average negotiation or overhead period Ai is required by the line to adjust to the new user before it can start transmitting packets from the user. What is the average round-robin cycle duration L before the line switches back to user 1 to begin a new cycle?

We may hence model the transmission line as a single-server queueing system with varying average service times depending on the nature of the user:

$$\bar X = \begin{cases} A_i & \text{for overhead packets of the } i\text{th user},\\ \bar X_i & \text{for real packets of the } i\text{th user}.\end{cases}$$

Equivalently, service may be broken down into a system of 2m servers, where servers take turns cyclically and sequentially, with server i′ taking care of the overhead period for customer i (i.e., virtual customer i′) and server i taking care of customer i. Let Ni′ and Ni denote the steady-state average, instantaneous number of customers seen by servers i′ and i,


respectively. At any given moment, precisely one customer is in service and the remaining 2m − 1 servers are idle. It follows that

$$\sum_{i=1}^{m}\bigl(N_{i'} + N_i\bigr) = 1.$$

Thus, we can identify Ni′ and Ni as the fractions of the time that servers i′ and i, respectively, are busy. When server i′ is busy, the average service time (processing the overhead for user i) is Ai; and as the overhead period for user i recurs, on average, every L seconds, it follows that the mean arrival rate of virtual user i′ is 1/L. Little's theorem applied to server i′ alone hence yields Ni′ = Ai/L. Likewise, server i expends an average amount of time X̄i in service of user i, whose customers (packets) arrive at a rate λi. Little's theorem again yields Ni = λiX̄i. It follows that

$$1 = \sum_{i=1}^{m}\frac{A_i}{L} + \sum_{i=1}^{m}\lambda_i\bar X_i,$$

from which we readily obtain the expression

$$L = \frac{\sum_{i=1}^{m}A_i}{1 - \sum_{i=1}^{m}\lambda_i\bar X_i}$$

for the average cycle length.
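The cycle-length formula translates directly into a small utility (a sketch; the only subtlety is guarding against an overloaded line, where Σi λiX̄i ≥ 1 and no finite cycle exists):

```python
def cycle_length(A, lam, xbar):
    """Average round-robin cycle: L = sum(A_i) / (1 - sum(lam_i * xbar_i))."""
    rho = sum(l * x for l, x in zip(lam, xbar))  # utilisation by real packets
    if rho >= 1:
        raise ValueError("overloaded line: need sum(lam_i * xbar_i) < 1")
    return sum(A) / (1 - rho)

# Two users: overheads 0.1 s and 0.2 s, utilisation 0.2 + 0.2 = 0.4,
# so the average cycle is 0.3 / 0.6 = 0.5 s.
L = cycle_length(A=[0.1, 0.2], lam=[1.0, 0.5], xbar=[0.2, 0.4])
```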

Example 7. Time-sharing computer.

Consider a computing system where a central computer is accessed by N terminals (users) in a time-sharing fashion. Suppose that there is a sustained demand for computing time, so that a vacating user's place is promptly taken by a new user. In such a system, it would be useful, from a system administrator's perspective, to obtain a realistic assessment of the throughput, i.e., the average number of users served per second.

[Figure: N terminals sharing a central CPU.]

Suppose that each user, after a period of reflection lasting R seconds on average, submits a job to the computer which, again on average, requires P seconds of CPU time. Suppose T is the average time spent by each user at a terminal for a given job, and let λ denote the throughput achieved. Then Little's theorem applied to the queueing environment comprised of computer and N terminals yields N = λT (as the system always has N customers in it). Now, each client, on average, will require at least R + P seconds (if his job is taken up immediately by the computer) and at most R + NP seconds (if he has to wait till all other jobs are processed). It follows that the average time spent by each customer in the system may be bounded by²

$$R + P \le T \le R + NP,$$

so that the throughput λ = N/T is bounded by

$$\frac{N}{R + NP} \le \lambda \le \frac{N}{R + P}.$$

The achievable throughput and delay regions are readily seen as a function of the number of terminals N in the following figure.

[Figure: achievable throughput and delay regions as functions of the number of terminals N.]
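The bounds make a two-line helper. The parameter values below are illustrative only: with N = 10 terminals, R = 5 seconds of think time, and P = 1 second of CPU time per job, the throughput must lie between 10/15 and 10/6 jobs per second:

```python
def throughput_bounds(N, R, P):
    """Throughput bounds for the time-sharing system of Example 7:
    N/(R + N*P) <= lam <= N/(R + P), from R + P <= T <= R + N*P
    together with Little's theorem (lam = N/T)."""
    return N / (R + N * P), N / (R + P)

lo, hi = throughput_bounds(10, 5.0, 1.0)   # (2/3, 5/3) jobs per second
```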

A Little Probability Theory

Before we proceed to analyse the peccadilloes of individual queueing systems, let us detour through a quick review of some of the probabilistic concepts that will be needed.

Poisson and Exponential Statistics

The Poisson and exponential distributions are closely related and are important in modelling arrivals and departures in many queueing environments in practice. A quick review of (some of) the important features of these distributions is therefore in order.

² On a facetious note for the theoretical computer scientist: more evidence that P ≠ NP?


The Poisson Distribution. Suppose X is a discrete random variable taking values in the nonnegative integers {0, 1, 2, 3, ...}. If, for some positive λ, the distribution of X is of the form

$$P\{X = n\} = \frac{e^{-\lambda}\lambda^n}{n!} \qquad (n \ge 0),$$

we say that X is a Poisson random variable with parameter λ, and write simply X ∼ Po(λ). Poisson random variables occur in a wide variety of situations in analysis and practice. Perhaps the best-known example of random events obeying the Poisson law is that of radioactive decay: the number of radioactive particles emitted by a radioactive substance and detected by a detector over a given interval of time t follows a Poisson law to a very good approximation. Other instances include: the number of chromosome interchanges in organic cells under X-ray irradiation; the number of connections to a wrong number; the distribution of flying bomb hits on London during World War II; and last, but not least, the arrival of customers in many queues in practice.

It is easy to determine the moments of a Poisson random variable. To begin, observe that the Taylor series expansion for e^λ readily yields the following identities:

$$\sum_{k=0}^{\infty}\frac{\lambda^k}{k!} = e^{\lambda}, \qquad \sum_{k=0}^{\infty}k\,\frac{\lambda^k}{k!} = \lambda\sum_{k=1}^{\infty}\frac{\lambda^{k-1}}{(k-1)!} = \lambda e^{\lambda}, \qquad \sum_{k=0}^{\infty}k(k-1)\frac{\lambda^k}{k!} = \lambda^2\sum_{k=2}^{\infty}\frac{\lambda^{k-2}}{(k-2)!} = \lambda^2 e^{\lambda}.$$

It follows readily that if X is Poisson with parameter λ, then

$$E(X) = \sum_{k=0}^{\infty} k\,e^{-\lambda}\frac{\lambda^k}{k!} = \lambda,$$

and, likewise,

$$E(X^2) = \sum_{k=0}^{\infty} k^2 e^{-\lambda}\frac{\lambda^k}{k!} = \sum_{k=0}^{\infty} k(k-1)e^{-\lambda}\frac{\lambda^k}{k!} + \sum_{k=0}^{\infty} k\,e^{-\lambda}\frac{\lambda^k}{k!} = \lambda^2 + \lambda.$$

It follows immediately that

$$\operatorname{Var}(X) = E(X^2) - \bigl[E(X)\bigr]^2 = \lambda$$

as well. To summarise: if X ∼ Po(λ) then

$$E(X) = \operatorname{Var}(X) = \lambda.$$

Higher order moments can be determined analogously.
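The identity E(X) = Var(X) = λ can be confirmed numerically from the probability mass function; the sketch below truncates the infinite sums at a point where the remaining tail is negligible:

```python
from math import exp, factorial

def poisson_pmf(lam, n):
    """P{X = n} = e^(-lam) * lam^n / n! for X ~ Po(lam)."""
    return exp(-lam) * lam ** n / factorial(n)

def poisson_mean_var(lam, n_max=100):
    """Mean and variance of Po(lam), summed directly over the pmf
    (n_max = 100 keeps the truncated tail negligible for modest lam)."""
    mean = sum(n * poisson_pmf(lam, n) for n in range(n_max))
    second = sum(n * n * poisson_pmf(lam, n) for n in range(n_max))
    return mean, second - mean ** 2
```

For λ = 3.5, both returned values are 3.5 to within numerical precision.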

A key property of Poisson random variables is that sums of independent Poisson variables are also Poisson.

Key Property. Suppose {Xi, i ≥ 1} is a sequence of independent Poisson variables with Xi ∼ Po(λi). Then, for every k, the partial sum Sk = Σ_{i=1}^{k} Xi is Poisson with parameter Σ_{i=1}^{k} λi.

Proof: The simplest proof is by induction. The base case k = 1 is immediate. As induction hypothesis, suppose Sk−1 is Poisson with parameter Σ_{i=1}^{k−1} λi. It is easy now to recursively specify the probability mass function of Sk: we have

$$P\{S_k = n\} = \sum_{m=0}^{n} P\{S_{k-1} = m,\ X_k = n-m\} = \sum_{m=0}^{n} P\{S_{k-1} = m\}\,P\{X_k = n-m\}$$

$$= \sum_{m=0}^{n}\left(e^{-(\lambda_1+\cdots+\lambda_{k-1})}\frac{(\lambda_1+\cdots+\lambda_{k-1})^m}{m!}\right)\left(e^{-\lambda_k}\frac{\lambda_k^{n-m}}{(n-m)!}\right)$$

$$= \frac{e^{-(\lambda_1+\cdots+\lambda_k)}}{n!}\sum_{m=0}^{n}\binom{n}{m}(\lambda_1+\cdots+\lambda_{k-1})^m\lambda_k^{n-m} = e^{-(\lambda_1+\cdots+\lambda_k)}\frac{(\lambda_1+\cdots+\lambda_k)^n}{n!},$$

the last step following via the binomial theorem. Thus, Sk is also Poisson with parameter Σ_{i=1}^{k} λi. This completes the induction.
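The Key Property can also be checked numerically: convolving two Poisson probability mass functions, exactly as in the proof above, reproduces the pmf of a single Poisson variable with the summed parameter. (Function names below are ours.)

```python
from math import exp, factorial

def po_pmf(lam, n):
    """Poisson pmf: P{X = n} for X ~ Po(lam)."""
    return exp(-lam) * lam ** n / factorial(n)

def sum_pmf(lam1, lam2, n):
    """pmf of X1 + X2 for independent X1 ~ Po(lam1), X2 ~ Po(lam2),
    computed by the direct convolution used in the proof."""
    return sum(po_pmf(lam1, m) * po_pmf(lam2, n - m) for m in range(n + 1))

# sum_pmf(1.2, 0.8, n) matches po_pmf(2.0, n) for every n
```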

The Exponential Distribution. Suppose X is a nonnegative random variable. If, for some positive µ, the distribution function and probability density function of X are of the form

$$F(x) = 1 - e^{-\mu x}, \qquad f(x) = \mu e^{-\mu x} \qquad (x \ge 0),$$

we say that X is an exponentially distributed random variable with parameter µ, and write simply X ∼ Exp(µ). The various moments of an exponentially distributed random variable are easy to compute via successive integration by parts. In particular, if X ∼ Exp(µ), then

$$E(X) = \frac{1}{\mu}, \qquad E(X^2) = \frac{2}{\mu^2}, \qquad \operatorname{Var}(X) = \frac{1}{\mu^2}.$$

The exponential distribution arises naturally in many contexts wherever a random quantity under consideration has a "memoryless" character. Formally, a random variable X is said to exhibit the memoryless property if, for every pair of positive real numbers r and s,

$$P\{X > r + s \mid X > r\} = P\{X > s\}. \tag{†}$$

For instance, it is appropriate in many queueing settings to model customer service times as being memoryless in the sense that the additional time required to finish servicing the customer is independent of the amount of service that he has already received. Another example where a memoryless property may be evidenced in a queueing setting is the times between customer arrivals; in many applications, these inter-arrival times may be modelled as memoryless in that the additional time that has to elapse before the next arrival is independent of how much time has already elapsed since the previous arrival.

Key Property. A random variable X has the memoryless property if, and only if, it is exponentially distributed.

Proof: Write G(r) = 1 − F(r) for the right tail of the distribution function of X. (For simplicity, we dispense with the subscript X as there is no danger of confusion here.) Then, by definition of conditional probability, the left-hand side of (†) is

$$P\{X > r + s \mid X > r\} = \frac{P\{X > r + s\}}{P\{X > r\}} = \frac{G(r+s)}{G(r)},$$

while the right-hand side of (†) is just

$$P\{X > s\} = G(s).$$

Accordingly, X has the memoryless property if, and only if, its distribution satisfies

$$G(r + s) = G(r)G(s) \tag{††}$$

for every r > 0 and s > 0.

If X ∼ Exp(µ) then G(r) = 1 − F(r) = e^{−µr} for every r > 0. Then

$$G(r+s) = e^{-\mu(r+s)} = e^{-\mu r}e^{-\mu s} = G(r)G(s)$$

for every r > 0 and s > 0. It follows that any exponentially distributed random variable has the memoryless property. More generally, it follows from a basic fact from analysis that the only distribution satisfying (†) is the exponential distribution.³
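The tail identity is easy to probe numerically, and a non-exponential distribution shows how it fails; the uniform tail below is our own illustrative foil, not part of the text:

```python
from math import exp

def exp_tail(mu, r):
    """G(r) = P{X > r} = e^(-mu * r) for X ~ Exp(mu)."""
    return exp(-mu * r)

def unif_tail(r, c=10.0):
    """G(r) = 1 - r/c for X uniform on [0, c] (not memoryless)."""
    return max(0.0, 1.0 - r / c)

mu, r, s = 0.5, 2.0, 3.0
memoryless = exp_tail(mu, r + s) / exp_tail(mu, r)   # equals exp_tail(mu, s)
not_memoryless = unif_tail(r + s) / unif_tail(r)     # 0.625, yet G(s) = 0.7
```

The exponential conditional tail reproduces G(s) exactly; the uniform one does not, in line with the Key Property.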

The exponential and Poisson distributions are intimately related, as we will see shortly.

Poisson Random Processes. In order to be able to characterise the steady state of a queueing environment it is important to statistically characterise the time evolution of the total number of arrivals into the system, A(t), and the total number of departures from the system, D(t). These are both examples of counting processes characterised by the occurrence of elementary events—arrivals in the case of the arrival process A(t) and departures in the case of the departure process D(t). Formally speaking, a counting process {X(t), t ≥ 0} is a nonnegative, integer-valued random process whose sample functions are nondecreasing and for which, for every pair of time instants s and t with s < t, X(t) − X(s) denotes the number of occurrences of an elementary event (arrivals or departures) in the time interval (s, t]. It is clear that the sample functions of a counting process have a step-like character, increasing only at points of jump where the function value increases by an integral amount. In many cases in practice, arrivals or departures may be modelled as being independent over disjoint time intervals; furthermore, it is frequently possible to model the number of arrivals or departures that occur in an observation interval as being determined statistically only by the length of the interval and not by where the interval is positioned. The formalisation of these notions leads to the definitions of independent and stationary increments, respectively.

³ The reader may wish to try her hand at proving the following generalisation: if G(r) is a solution of (††) defined for r > 0 and bounded in some interval, then either G(r) = 0 for all r, or else G(r) = e^{−µr} for some constant µ.


We say that a counting process X(t) has independent increments if the numbers of elementary events occurring in disjoint time intervals are independent. Formally, we require that for every positive integer k, and every collection of time instants s0 < s1 < s2 < · · · < sk, the random variables Xi = X(si) − X(si−1) (1 ≤ i ≤ k) are independent.

We say that a counting process X(t) has stationary increments if the distribution of the number of elementary events that occur in any observation interval depends only on the length of the interval. Formally, we require that the probability mass function P{X(t + s) − X(s) = n} (n ≥ 0) depends only on the interval length t and not on s.

The most important counting process for our purposes is the Poisson process. A random process {X(t), t ≥ 0} is said to be a Poisson process with rate λ > 0 if:

❶ X(0) = 0.

❷ X(t) has independent increments.

❸ The number of elementary events that occur in any interval of length t is Poisson distributed with parameter λt; i.e.,

$$P\{X(t+s) - X(s) = n\} = e^{-\lambda t}\frac{(\lambda t)^n}{n!} \qquad (n = 0, 1, 2, \ldots)$$

for every t and s. In other words, X(t + s) − X(s) ∼ Po(λt) for every t and s.

It is not hard to demonstrate that a Poisson process satisfies the following properties:

① Almost all sample functions of the process are monotone, nondecreasing, increasing only in isolated jumps of unit magnitude.

Indeed, consider the number of elementary events that occur in any interval of duration δ: the probability of seeing two or more events is

$$P\{X(t+\delta) - X(t) \ge 2\} = 1 - e^{-\lambda\delta} - \lambda\delta e^{-\lambda\delta} = o(\delta) \qquad (\delta \to 0),$$

so that simultaneous jumps occur with negligible probability.

The points where these jumps occur may be identified with the

times when some sort of random event occurs, such as, for instance, the

arrival of a customer to a queue The occurrence of an elementary event is

called an arrival With this interpretation then,X(t + s) − X(s) stands for the

number of arrivals between the timess and t + s Observe that λ can now be

identified as simply the expected rate of arrivals

② The numbers of arrivals over disjoint time intervals are independent. More precisely, suppose { sk, k ≥ 0 } is a strictly increasing sequence of points in time and write Xk = X(sk) − X(sk−1) for the number of arrivals in the time interval (sk−1, sk]. Then { Xk, k ≥ 1 } is a sequence of independent Poisson random variables, with

Xk ∼ Po(λ(sk − sk−1))  (k ≥ 1).
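These distributional properties are easy to probe numerically. The sketch below (Python; the rate λ = 3 and the number of intervals are arbitrary illustrative choices) builds a Poisson process from independent exponential interarrival times and checks that the counts over disjoint unit intervals have mean and variance close to λ, as Po(λ) counts should.

```python
import random

def poisson_counts(lam, n_intervals, rng):
    """Simulate a Poisson process of rate lam by summing independent
    Exp(lam) interarrival times, and return the number of arrivals
    falling in each of n_intervals disjoint unit-length intervals."""
    counts = [0] * n_intervals
    t = rng.expovariate(lam)          # time of the first arrival
    while t < n_intervals:
        counts[int(t)] += 1           # credit the interval containing t
        t += rng.expovariate(lam)     # next interarrival gap
    return counts

rng = random.Random(42)
lam = 3.0
counts = poisson_counts(lam, 20000, rng)

mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)

# For Po(lam) the mean and variance both equal lam (here 3).
print(round(mean, 2), round(var, 2))
```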

It is clear that the sample paths of a Poisson process are completely characterised by the arrival times of the elementary events that comprise the process. Accordingly, set t0 = 0 and, for each n ≥ 1, let tn denote the time of the nth arrival. The interarrival time τn = tn − tn−1 then denotes the time between the (n − 1)st and nth arrivals. The Poisson character of the process immediately implies that the interarrival times { τn, n ≥ 1 } form an independent, identically distributed sequence of random variables. The marginal distribution and density functions of the random variables τn are now readily determined:

Fτn(s) = P{τn ≤ s} = 1 − P{no arrival in duration s} = 1 − e^{−λs}  (s ≥ 0),
fτn(s) = dFτn(s)/ds = λe^{−λs}  (s ≥ 0).

The fundamental link between the Poisson and exponential distributions is manifested here: a Poisson process with rate λ has independent interarrival times sharing a common exponential distribution with parameter λ.

Markov Chains

Consider a sequence of random variables { Nk, k ≥ 0 } where each random variable takes on values only in the discrete set of points {0, 1, 2, . . . }. The random sequence {Nk} forms a Markov chain if, for every k, and every choice of nonnegative integers m0, m1, . . . , mk−1, mk = m and mk+1 = n,

P{Nk+1 = n | Nk = m, Nk−1 = mk−1, . . . , N0 = m0} = P{Nk+1 = n | Nk = m} = Pmn.

In words: in a Markov chain, given a sequence of outcomes of random trials, the outcome of the next trial depends solely on the outcome of the immediately preceding trial.

Transition Probabilities The set of values {0, 1, 2, . . . } that can be taken by the random variables Nk is called the set of possible states of the chain; in this terminology, the conditional probability Pmn has the evocative geometric interpretation of connoting the probability of a transition from state m to state n.4 Higher-order transition probabilities for the chain can now be determined recursively. Write P(l)mn for the probability of a transition from state m to state n in exactly l steps, i.e.,

P(l)mn = P{Nk+l = n | Nk = m}.

Observe that this is just the probability of all paths (m = mk, mk+1, . . . , mk+l = n) starting at state m and ending at state n. Thus, P(1)mn = Pmn and

P(2)mn = Σi Pmi Pin.

Induction on l gives the general recursion formula

P(l+1)mn = Σi Pmi P(l)in  (l ≥ 1),

while a further induction on l leads to the Chapman-Kolmogorov identity

P(l+k)mn = Σi P(l)mi P(k)in.

The higher-order transition probabilities can hence be recursively built up from the basic transition probabilities Pmn.
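In matrix form, the recursion says that P(l) is simply the l-th power of the one-step transition matrix, and the Chapman-Kolmogorov identity reads P(l+k) = P(l)P(k). A minimal sketch (Python; the two-state matrix is an invented example) illustrates this:

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(P, l):
    """l-step transition matrix P^(l) by repeated multiplication."""
    n = len(P)
    R = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(l):
        R = matmul(R, P)
    return R

# A hypothetical two-state chain (each row sums to one).
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Chapman-Kolmogorov: P^(5) should equal P^(2) P^(3).
lhs = matpow(P, 5)
rhs = matmul(matpow(P, 2), matpow(P, 3))
ok = all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in range(2) for j in range(2))
print(ok)  # → True
```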

The probabilities of first transition can be built up analogously. Write F(l)mn for the probability that, for a process starting from state m, the first transition into state n occurs in the lth step. Then

Fmn = Σl≥1 F(l)mn

denotes the probability that, starting from state m, the system will ever pass through state n. The sequence { F(l)nn, l ≥ 1 } represents the distribution of the recurrence times for state n. If a return to state n is certain, i.e., Fnn = 1, then the quantity

µn = Σl≥1 l F(l)nn

is meaningful (it may be infinite) and represents the mean recurrence time for state n.

4 More formally, our usage of the term Markov chain should be qualified by adding the clause “with stationary transition probabilities” to make clear that the transition probabilities Pmn do not depend upon the step (or epoch) k. We will not need to have recourse to the more general Markov property where transitions are epoch-dependent.

Classification of States Starting from a given state n, the successive returns to state n constitute what is known in the theory of probability as a recurrent event. An examination of these recurrent events leads to a classification of states.

➀ The returns to a given state n may be either periodic or aperiodic. More formally, a state n has period t > 1 if P(l)nn = 0 unless l = νt is a multiple of t, and t is the largest integer with this property. A state n is aperiodic if no such t > 1 exists.

➁ States may also be classified depending upon whether a return to the state is certain or not. In particular, we say that a state n is persistent if Fnn = 1 and transient if Fnn < 1.

➂ States which are both aperiodic and persistent may be statistically characterised by the frequency of return over a sufficiently long time period, provided the mean recurrence time for the state is finite. More particularly, an aperiodic, persistent state n with µn < ∞ is called ergodic.

Chains can be further characterised by a consideration of the probabilities of eventual transition from one state to another. We say that state n can be reached from state m if there is a positive probability of transiting from state m to state n in one or more steps, i.e., there exists some l ≥ 0 such that P(l)mn > 0.

➃ A Markov chain is said to be irreducible if every state can be reached from every other state.

Clearly, in an irreducible chain, there is a positive probability of an eventual return to any given state. In the sequel we will only be concerned with irreducible chains all of whose states are ergodic.

Stationary Distributions Irreducible, ergodic chains settle down into long-term, statistically regular behaviour. The following result, which we present without proof, is the principal result.

Key Property In an irreducible chain all of whose states are ergodic, the limits

πn = lim l→∞ P(l)mn

exist and are independent of the initial state m. In addition, the sequence {πn} forms a discrete probability distribution on the set of states and satisfies

0 < πn = Σm πm Pmn  (?)

for every n.

To foster a better appreciation of this elegant result, consider the evolution of the states of the chain starting from an initial state X0 drawn from an a priori distribution {π(0)m}; after k steps the distribution of states is given by

π(k)n = Σm π(0)m P(k)mn.

In view of the preceding theorem, it follows directly that

π(k)n → πn  (k → ∞),

so that the limiting distribution of states tends to {πn} whatever be the initial distribution of states. Consequently, after a sufficiently long period of time, the distribution of states will be approximately invariant or stationary. In particular, it is easy to see from (?) that if the initial distribution satisfies π(0)n = πn then it will perpetuate itself for all time. In consequence, the sequence {πn} satisfying (?) is called the stationary distribution5 of the chain.
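The convergence π(k)n → πn can be watched directly. The sketch below (Python; the three-state chain is an invented example) iterates the distribution recursion from a deterministic start and checks that the limit satisfies the stationarity equation (?).

```python
def step(pi, P):
    """One step of the distribution recursion: pi(k+1)_n = sum_m pi(k)_m P_mn."""
    n = len(P)
    return [sum(pi[m] * P[m][j] for m in range(n)) for j in range(n)]

# A hypothetical irreducible, aperiodic three-state chain.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.3, 0.6]]

pi = [1.0, 0.0, 0.0]        # start deterministically in state 0
for _ in range(200):        # iterate pi <- pi P until (numerically) stationary
    pi = step(pi, P)

# Stationarity: pi_n = sum_m pi_m P_mn for every n.
residual = max(abs(pi[j] - step(pi, P)[j]) for j in range(3))
print(round(sum(pi), 6), residual < 1e-12)
```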

The stationary distribution may be identified naturally with the frequency of visits to the states. In particular, suppose Vn(k) denotes the number of visits to state n in k steps. If we define the indicator random variables ξ(l)n, with ξ(l)n = 1 if the chain is in state n at step l and ξ(l)n = 0 otherwise, then Vn(k) = Σ_{l=1..k} ξ(l)n. The relative frequency of visits to state n is given by ν(k)n = Vn(k)/k. It follows easily that the expected fraction of time spent in state n satisfies

E ν(k)n = (1/k) Σ_{l=1..k} E ξ(l)n = (1/k) Σ_{l=1..k} π(l)n → πn

as k → ∞, and indeed it can be shown that, with probability one, ν(k)n, the relative frequency of visits to state n, converges to πn, the stationary probability of state n. In other words, πn may be very agreeably identified with the fraction of time the chain spends in state n.

5 Also called the invariant distribution.

Balance Equations As Σn Pmn = 1 (why?), we can rewrite (?) in the form

Σn πm Pmn = Σn πn Pnm.  (??)

Observe, vide our earlier remarks, that πmPmn may be identified as the relative frequency of transitions from state m to state n, so that the sum on the left-hand side, Σn πmPmn, may be interpreted as the frequency of transitions out of state m; likewise, πnPnm may be identified as the relative frequency of transitions from state n to state m, so that the sum on the right-hand side, Σn πnPnm, may be interpreted as the frequency of transitions into state m. Thus, at equilibrium, the frequency of transitions into any given state must equal the frequency of transitions out of that state.

More generally, let S be any subset of states. Summing both sides of (??) over all states m ∈ S, we obtain

Σ_{m∈S} Σ_{n≥0} πm Pmn = Σ_{m∈S} Σ_{n≥0} πn Pnm.

Breaking up the inner sum over n on both sides into the terms with n ∈ S and those with n ∉ S, and cancelling the common terms on both sides, results in the following appealing generalisation of (??):

Σ_{m∈S} Σ_{n∉S} πm Pmn = Σ_{m∈S} Σ_{n∉S} πn Pnm.  (? ? ?)

We may interpret our finding as a generalisation of our previous observation: at equilibrium, the frequency of transitions into any given set of states S must equal the frequency of transitions out of the set of states S. If we imagine a membrane surrounding states in S and excluding states not in S, then (? ? ?) may be viewed as a mathematical codification of a conservation law for probability flow: at equilibrium, the net flow of probability mass across the membrane is zero.

The balance equations (? ? ?) may frequently be used to good account to determine the underlying stationary distribution, as we see in the following example.

Example 8 Birth-Death Processes.

A Markov chain in which Pmn = 0 if |m − n| > 1 is called a birth-death process. In such a chain, transitions are permitted only to neighbouring states. The stationary probability distributions are very easy to determine from the balance equations for such chains. The whole game here is in a proper choice of the set of states S. Consider S = {0, 1, . . . , n − 1}, i.e., the membrane encloses the first n states. In this case, observe that the only transition out of the membrane is from state n − 1 to state n, and, likewise, the only transition into the membrane is from state n to state n − 1. The balance equations (? ? ?) hence yield

πn−1 Pn−1,n = πn Pn,n−1,

whence, by iteration,

πn = π0 ∏_{i=1..n} Pi−1,i / Pi,i−1  (n ≥ 1).

We can determine π0 directly from the identity Σ_{n≥0} πn = 1, whence we finally obtain

πn = [∏_{i=1..n} Pi−1,i / Pi,i−1] / [Σ_{m≥0} ∏_{i=1..m} Pi−1,i / Pi,i−1]  (n ≥ 0).

In particular, if Pn−1,n = p and Pn,n−1 = q for all n ≥ 1, then, writing ρ = p/q, we obtain πn = ρ^n (1 − ρ) for all n ≥ 0.6
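The closed form can be verified numerically against the balance equations. The sketch below (Python; p = 0.3 and q = 0.5 are illustrative values with ρ = p/q < 1) checks that πn = ρ^n(1 − ρ) satisfies the membrane balance πn−1 p = πn q and sums to one, up to truncation of the infinite state space.

```python
p, q = 0.3, 0.5            # illustrative birth and death probabilities, p < q
rho = p / q

# Truncate the infinite state space for the numerical check.
N = 200
pi = [(1 - rho) * rho ** n for n in range(N)]

# Balance across the membrane between states n-1 and n: pi_{n-1} p = pi_n q.
balanced = all(abs(pi[n - 1] * p - pi[n] * q) < 1e-12 for n in range(1, N))

# Normalisation: the truncated tail beyond N is of order rho^N, negligible here.
total = sum(pi)
print(balanced, round(total, 12))
```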

Memoryless Arrivals and Service

The best understood classes of queueing systems are those where arrivals or departures or both have a memoryless character inherited from the exponential distribution. Fortunately, this class of systems has wide practical utility.

6 As we will see shortly, the factor ρ may be identified as the utilisation factor of an M/M/1 system.


Poisson Arrivals As before, A(t) denotes the number of arrivals in the time interval [0, t]. We say that the system has memoryless arrivals if A(t) is a Poisson process with rate, say, λ. The reason for the nomenclature will soon become apparent. It follows, in consequence, that the numbers of arrivals over disjoint time intervals are independent. In particular, the Poisson character of the process immediately implies that the interarrival times { τn, n ≥ 1 } form an independent, identically distributed sequence of random variables with common exponential distribution with parameter λ:

Fτn(s) = P{τn ≤ s} = 1 − P{no arrival in duration s} = 1 − e^{−λs}  (s ≥ 0).

The moments of the interarrival times follow readily:

E(τn) = 1/λ,   E(τn²) = ∫_0^∞ λr² e^{−λr} dr = 2/λ²,   Var(τn) = E(τn²) − [E(τn)]² = 1/λ².

Moreover, for any s, t ≥ 0,

P{τn > s + t | τn > s} = e^{−λ(s+t)} / e^{−λs} = e^{−λt} = P{τn > t}.

We may interpret this result as saying that the additional time needed to wait for an arrival is independent of past history. Alternatively: Poisson arrivals are memoryless.
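The memoryless property shows up plainly in simulation: among exponential samples that survive past a time s, the residual life τ − s has the same mean 1/λ as a fresh sample. A sketch (Python; λ = 2 and s = 0.5 are arbitrary choices):

```python
import random

rng = random.Random(7)
lam, s = 2.0, 0.5

samples = [rng.expovariate(lam) for _ in range(200000)]

# Residual lives of the samples that survive past time s.
residuals = [t - s for t in samples if t > s]

mean_all = sum(samples) / len(samples)          # should be near 1/lam = 0.5
mean_resid = sum(residuals) / len(residuals)    # memorylessness: also near 1/lam

print(round(mean_all, 2), round(mean_resid, 2))
```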

Occupancy Distribution We are interested in characterising the long-term behaviour of statistically stable queueing systems, i.e., systems for which the occupancy N(t) in the system tends to some statistical steady state captured by a stationary probability distribution. In such systems, we typically would like to determine the limiting stationary distribution of system occupancy,

πn = lim t→∞ P{N(t) = n},

at an arbitrary point in time t in the steady state, i.e., a point in time specified independently of the state of the system. It is also of interest to statistically characterise occupancy in the steady state as seen by an arriving customer, or by a departing customer. Thus, the quantities

πan = lim t→∞ P{N(t−) = n | an arrival at t},
πdn = lim t→∞ P{N(t+) = n | a departure at t},

which connote the steady-state probability that there are n customers in the system immediately prior to an arrival, and immediately following a departure, respectively, are of both analytical and practical interest.

In general, the state of the system at an arrival or departure may be atypical, so that the quantities πn, πan, and πdn may all be different. However, for most single-queue systems of interest, the state of affairs immediately prior to an arrival is statistically indistinguishable from the state of affairs immediately following a departure. Indeed, suppose that the arrival and service statistics are such that the system reaches a statistical steady state and, furthermore, for every n, there is a positive probability that the steady-state occupancy is n. Suppose also that the occupancy process N(t) changes (with probability one) in unit increments only, so that there can be at most one arrival or departure at a given epoch. Then almost all sample paths of the occupancy process N(t) pass through any given value of n to a neighbouring value infinitely often, so that, in particular, for every increase in occupancy from n to n + 1 there must be a corresponding later decrease in occupancy from n + 1 to n. Thus, on almost all sample paths, the frequency of transitions from n to n + 1 must equal the frequency of transitions from n + 1 to n, and these, by the ergodic theorem, must be equal (with probability one) to the probabilities πan and πdn, respectively. Thus, in all the settings of interest to us, we may suppose that πan = πdn for every n, i.e., in the steady state the state of affairs seen by an arriving customer is statistically indistinguishable from that seen by a departing customer.

Arrival and departure epochs can, however, be atypical. Consider, for instance, a situation where a packet arrives every second for transmission over a communication link. Suppose that every packet is comprised of 1200 bits and the line transmits at the rate of 4800 bits per second. Then each packet requires a transmission time of 1200/4800 = 0.25 seconds. (Of course, this is a D/D/1 system.) Any arriving packet or departing packet will see no packets in the system and, in particular, the expected number of packets seen in the system by an arriving or a departing packet is zero. However, these arrival and departure epochs are atypical. An external observer looking at the system at a random time independent of the state of the system will see an ongoing packet transmission (with no packets in queue) a fraction 0.25 of the time, so that the steady-state mean occupancy of the system seen by the (typical) observer is 0.25.

Randomisation does not help in the above example. For instance, suppose packet transmission times are uniformly distributed between 0.2 and 0.3 seconds and packet interarrival times are independent and uniformly distributed between 0.9 and 1.1 seconds. The above conclusions will continue to hold unabated.
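The observer's-eye view in this example is easy to reproduce by simulation. The sketch below (Python) samples the deterministic system at random epochs; a packet is in transmission exactly when the time elapsed since the last arrival is under 0.25 seconds, so the estimated busy fraction should come out near 0.25.

```python
import random

rng = random.Random(1)

period = 1.0        # one packet arrives each second
service = 0.25      # 1200 bits / 4800 bps transmission time

# Sample the system at random epochs: a transmission is in progress
# exactly when the time since the last arrival is below the service time.
n_obs = 100000
busy = sum(1 for _ in range(n_obs)
           if (rng.uniform(0.0, 1000.0) % period) < service)

print(round(busy / n_obs, 2))   # fraction of time the observer sees a packet
```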

“Under what circumstances are arrival and departure epochs typical?” Focus on an arrival epoch for definiteness and write πan(t−) for the probability that a customer arriving at epoch t sees n customers in the system:

πan(t−) = P{N(t−) = n | an arrival at t}.

Let δ be an arbitrarily small quantity and let A[t, t + δ) denote the event that there is a single arrival in the interval [t, t + δ). It follows that

πan(t−) = lim δ→0 P{N(t−) = n | A[t, t + δ)}
        = lim δ→0 P{N(t−) = n, A[t, t + δ)} / P{A[t, t + δ)}
        = lim δ→0 P{A[t, t + δ) | N(t−) = n} P{N(t−) = n} / P{A[t, t + δ)}.

If, for every t and δ > 0, the number of arrivals in the interval [t, t + δ) is independent of the system occupancy at t−, then the events A[t, t + δ) and {N(t−) = n} are independent. This is the Poisson hypothesis, which is satisfied when the arrival process is Poisson and the service times are positive, independent of the interarrival times, and conform to a general distribution. Under these conditions,

P{A[t, t + δ) | N(t−) = n} = P{A[t, t + δ)},

whence

πan(t−) = P{N(t−) = n} = πn(t−),

where πn(t−) is the unconditional occupancy distribution at the arbitrary time t−. As t → ∞, the left-hand side tends to the stationary occupancy distribution seen by an arriving customer, while the right-hand side tends to the stationary occupancy distribution at a typical (arbitrary) time. Thus, in the steady state, arrival and departure epochs are typical under the Poisson


hypothesis. In other words: If the arrival process is Poisson and the service times are positive and independent of the interarrival times but otherwise conform to an arbitrary distribution, then, in the steady state, the occupancy distribution seen by an arriving customer or a departing customer is identical to that seen by an observer looking at the system at an arbitrary time.

Exponential Service For any given server, let the random variable Xn denote the time in service of the server's nth customer. We say that the service is memoryless if the sequence of random variables { Xn, n ≥ 1 } is independent and identically distributed with common exponential distribution Xn ∼ Exp(µ) for some positive µ. In particular, the distribution and density functions of Xn take the form

FXn(s) = 1 − e^{−µs},   fXn(s) = µe^{−µs}  (s ≥ 0),

and, as before, P{Xn > s + t | Xn > s} = P{Xn > t}, which we may interpret as saying that the additional time required to finish servicing a customer is independent of when service was begun.

M/M Systems

In M/M queueing systems, both the customer arrival and departure processes, A(t) and D(t), respectively, have a memoryless character inherited from the exponential distribution. Suppose that the arrival process A(t) is a Poisson process with rate λ. Let Xn denote the service time of the nth customer and suppose that Xn is exponentially distributed with parameter µ. We will suppose that the sequence of service times { Xn, n ≥ 1 } is independent and identically distributed, with common distribution Exp(µ), and independent of the interarrival times.

Let the counting process N(t) = A(t) − D(t) denote the instantaneous number of customers in the M/M queueing system. Fix a tiny δ > 0 and consider a discrete-time sequence of snapshots of the counting process, Nk = N(kδ) (k = 1, 2, . . . ).


Intuition suggests that, for a small enough choice of δ, the sequence of snapshots {Nk} will give a sufficiently detailed picture of the evolution of customers in the queueing system.

Consider the kth sub-interval in time, Ik = ((k − 1)δ, kδ]. Let Ak and Dk denote the number of arrivals and departures, respectively, during Ik. Observe that Ak ∼ Po(λδ) is a Poisson random variable; the number of departures Dk in the time interval Ik is governed by the exponential distribution of service times and the number of customers in the system. To simplify notation, write

Q(a, d | m) = P{Ak = a, Dk = d | Nk−1 = m}

for the conditional probability that there are a arrivals and d departures in the kth sub-interval given that there were m customers in the system at the start of the sub-interval.

Now observe that the sequence { Nk, k ≥ 1 } clearly forms a Markov chain: if Pmn denote the transition probabilities for this chain, then

P{Nk+1 = n | Nk = m, Nk−1 = mk−1, . . . , N1 = m1} = P{Nk+1 = n | Nk = m} = Pmn.

The transition probabilities Pmn = P{Nk = n | Nk−1 = m} of the chain are readily seen to be determined by the various conditional probabilities Q(a, d | m). Indeed, a transition from state m to state n occurs if, and only if, m + Ak − Dk = n, i.e., the net increase Ak − Dk in the number of customers in the given sub-interval Ik is exactly n − m. Thus,

Pmn = Σ_{(a,d): a−d=n−m} Q(a, d | m),

and the stationary probabilities of the chain are

πn = lim l→∞ P(l)mn = lim k→∞ P{Nk = n}  (n ≥ 0).

Our goal is to determine the transition probabilities Pmn of the chain under various M/M circumstances and thence to determine the stationary probabilities πn. It will then be possible to determine the expected number of customers in the system:

N = Σ_{n≥0} n πn.

Little's theorem can now be grandly deployed to determine the average delay per customer T = N/λ in the steady state.

M/M/1 Queues: The Single Server System

We begin a consideration of memoryless systems with the simplest scenario of a single server queue where both arrivals and service are memoryless.

Transition Probabilities Exploiting the independence of interarrival times and service times, we can quickly write down expressions for the state-conditional arrival and departure probabilities in each sub-interval. The key is to obtain expressions for the probabilities Q(0, 0 | m), Q(1, 0 | m), and Q(0, 1 | m) that, given m ≥ 0 customers in the system at the start of sub-interval Ik, during the sub-interval there were no arrivals and no departures, there was a single arrival and no departure, and there were no arrivals and a single departure, respectively. We anticipate that these situations will capture most of the probability for sub-intervals of sufficiently small duration δ.

In the sequel, we will need to make asymptotic estimates of various quantities such as e^{−λδ}, e^{−µδ}, and λδe^{−λδ} for asymptotically small δ → 0, and it will be convenient to collect the needed estimates here. Recall that the Taylor series approximation to the exponential yields

e^x = 1 + x + o(x)  (x → 0),

where o(x) denotes some function, say f(x), of x which is asymptotically small compared to x, i.e., f(x)/x → 0 as x → 0. Collecting estimates up to terms of order o(δ), we hence obtain

e^{−λδ} = 1 − λδ + o(δ),   e^{−µδ} = 1 − µδ + o(δ),
e^{−λδ}e^{−µδ} = 1 − λδ − µδ + o(δ),
e^{−λδ} − e^{−µδ} = (µ − λ)δ + o(δ),  (1)

as δ → 0.

➀ No arrivals, no departures. Begin with the simplest case when there are no arrivals and no departures in a given sub-interval Ik. A simple conditioning argument yields

Q(0, 0 | m) = P{Dk = 0 | Ak = 0, Nk−1 = m} P{Ak = 0 | Nk−1 = m} = P{Dk = 0 | Nk−1 = m} P{Ak = 0}
            = e^{−λδ} if m = 0, and e^{−λδ}e^{−µδ} if m ≥ 1.

The penultimate step follows as the probability of no arrivals in the interval Ik is independent of the number of customers already in the system and, conditioned on Ak = 0, the number of departures depends only on whether there exist any customers already in the system. To justify the final step, first observe that Ak ∼ Po(λδ) so that P{Ak = 0} = e^{−λδ}, whereas with probability one there are no departures in Ik if there are no customers in the system and no arrivals in Ik, whence P{Dk = 0 | Nk−1 = 0} = 1, while the probability that an existing customer in service at the beginning of the interval does not depart is just the probability that at least an additional δ units of service are required which, by the memoryless property of the exponential distribution, yields P{Dk = 0 | Nk−1 = m} = e^{−µδ} for m ≥ 1. Applying the asymptotic estimates (1) we hence obtain

Q(0, 0 | m) = 1 − λδ + o(δ) if m = 0, and 1 − λδ − µδ + o(δ) if m ≥ 1.  (†)

➁ Single arrival, no departures. Now consider the probability that there is a single arrival and no departure in the sub-interval Ik given that there are m customers already in the system:

Q(1, 0 | m) = P{Ak = 1, Dk = 0 | Nk−1 = m}.

This is easy to evaluate for m ≥ 1 when there is a customer in service (and possibly other customers in queue) at the start of sub-interval Ik. In this case,

Q(1, 0 | m) = P{Dk = 0 | Ak = 1, Nk−1 = m} P{Ak = 1 | Nk−1 = m}
            = P{Dk = 0 | Nk−1 = m} P{Ak = 1}
            = e^{−µδ} λδe^{−λδ}  (m ≥ 1),

by a similar argument to the one invoked above: the number of arrivals is independent of the number of customers already in service, whence

P{Ak = 1 | Nk−1 = m} = P{Ak = 1} = λδe^{−λδ}

by virtue of Poisson arrival statistics; and the event that a customer in service does not depart (i.e., requires another δ units of service) is independent of the number of arriving customers, whence

P{Dk = 0 | Ak = 1, Nk−1 = m} = P{Dk = 0 | Nk−1 = m} = P{Dk = 0 | Nk−1 = 1} = e^{−µδ}

for m ≥ 1 by virtue of the memoryless property of the exponential distribution.


Things get slightly more complex if there are no customers in the system at the start of the sub-interval Ik. In this case, there will be a total of one arrival and no departures in Ik provided there is a single arrival who doesn't depart before the end of the sub-interval. This suggests that we take a closer look at the time of arrival of the single customer. Suppose a customer arrives at some time (k − 1)δ < t ≤ kδ, finds no customers in the system, and promptly goes into service. The probability that he hasn't departed before the end of the interval is e^{−µ(kδ−t)}, while the probability that no additional customers arrive before the end of the interval is e^{−λ(kδ−t)}. As service is independent of arrivals, it follows that, conditioned on an arrival at t ∈ Ik into an empty system, the probability that there are no further arrivals or departures before the end of the sub-interval is given by e^{−µ(kδ−t)}e^{−λ(kδ−t)}. Removing the conditioning over the time of arrival by averaging out over the (exponential) distribution of interarrival times, we obtain

Q(1, 0 | 0) = ∫_0^δ λe^{−λu} e^{−(λ+µ)(δ−u)} du = (λ/µ) e^{−λδ} (1 − e^{−µδ}) = λδ + o(δ).

Together with the case m ≥ 1, both situations reduce to the common asymptotic estimate

Q(1, 0 | m) = λδ + o(δ)  (δ → 0)  (††)

via the Taylor expansions (1).

➂ No arrivals, single departure. Consider now the probability that there are no arrivals and a single departure in the sub-interval Ik given that there are m customers in the system at its start. We may write

Q(0, 1 | m) = P{Dk = 1 | Ak = 0, Nk−1 = m} P{Ak = 0},

as the (Poisson) arrivals are independent of the number of customers in the system. The remaining term may be determined by considering the cases m = 1 and m ≥ 2 separately. Suppose there is a single customer in service in the system. Then the probability that he departs in the interval Ik is just 1 − e^{−µδ}. If there is more than one customer in the system, then the probability that the customer in service departs


and that the customer who takes his place does not depart within Ik is just µδe^{−µδ}. Consequently,

P{Dk = 1 | Ak = 0, Nk−1 = m} = 1 − e^{−µδ} if m = 1, and µδe^{−µδ} if m ≥ 2.

It follows that

Q(0, 1 | m) = (1 − e^{−µδ})e^{−λδ} if m = 1, and µδe^{−µδ}e^{−λδ} if m ≥ 2.

Both cases reduce to the common asymptotic order estimate

Q(0, 1 | m) = µδ + o(δ)  (δ → 0)  (†††)

via the Taylor expansions (1).

➃ Multiple arrivals and departures. A consideration of the estimates (†, ††, †††) shows that

P{Ak + Dk ≥ 2 | Nk−1 = m} = 1 − Q(0, 0 | m) − Q(1, 0 | m) − Q(0, 1 | m) = o(δ)  (δ → 0)  (††††)

for every m ≥ 0. In words: with probability 1 − o(δ), in any sub-interval Ik of duration δ there is either no arrival and no departure, or there is a single arrival and no departure, or there are no arrivals and a single departure; alternatively, the probability that there are multiple arrivals and departures (Ak + Dk ≥ 2) in any sub-interval Ik is asymptotically small compared to δ.

Now recall that Pmn = P{Nk = n | Nk−1 = m} denotes the probability that, starting with m customers at the end of sub-interval Ik−1, at the end of sub-interval Ik there are n customers in the system. In consequence of the estimates (†, ††, †††, ††††), we directly obtain the asymptotic estimates

Pnn = Q(0, 0 | 0) + o(δ) = 1 − λδ + o(δ) if n = 0, and Q(0, 0 | n) + o(δ) = 1 − λδ − µδ + o(δ) if n ≥ 1,
Pn−1,n = Q(1, 0 | n − 1) + o(δ) = λδ + o(δ),
Pn,n−1 = Q(0, 1 | n) + o(δ) = µδ + o(δ),

and, finally, when |m − n| ≥ 2,

Pmn = Σ_{(a,d): a−d=n−m} Q(a, d | m) = o(δ).


It is clear then that, up to asymptotically small order terms as the interval duration δ → 0, the evolution of the number of customers in the system, { Nk, k ≥ 1 }, is a birth-death process with uniform birth probabilities λδ and uniform death probabilities µδ.

Stationary Probabilities As we saw in Example 8, the balance equations applied to the set of states S = {0, 1, . . . , n − 1} immediately yield the recurrence relation

πn = (λ/µ) πn−1  (n ≥ 1)

for the stationary probabilities πn = lim k→∞ P{Nk = n} of the chain. Write ρ = λ/µ. Induction now directly yields

πn = ρ^n π0  (n ≥ 0).

It remains to determine the probability π0 that there are no customers in the system at an arbitrary epoch in the steady state. As {πn} is a discrete probability distribution, it follows that

1 = Σ_{n≥0} πn = π0 Σ_{n≥0} ρ^n = π0 / (1 − ρ).

It follows that π0 = 1 − ρ.

Observe that for the geometric series in the penultimate step to converge it is necessary that ρ < 1, whence it is requisite that λ < µ. Recall that 1/λ is the mean time between arrivals and 1/µ is the mean service time. If the queue of customers waiting to be served is to remain bounded we would anticipate that 1/µ < 1/λ. Equivalently, the service rate has to exceed the arrival rate if the system is to be stable: an intuitively satisfying result.

An alternative derivation of π0 is instructive. Consider the stable queueing environment consisting only of the service part of the M/M/1 system, excluding the queue. The arrival rate into the service part of the system is again λ (indeed, for equilibrium, the arrival and departure rates from any subpart of the system must be equal to λ, the arrival rate of customers into the system). The mean delay seen by a customer in this part of the system is exactly the mean service time 1/µ. Thus, by Little's theorem, the average number of customers in service at an arbitrary instant in the steady state is ρ = λ/µ. It follows that we may identify ρ as exactly the fraction of time that the server is busy in the steady state and, by the ergodic theorem, this is (with probability one) exactly the probability that there is at least one customer in the system, whence the server is busy. Thus, ρ = 1 − π0, as derived earlier. It follows that we may identify ρ as the utilisation factor of the system, the fraction of time the server is busy; alternatively, 1 − ρ is the fraction of time that the server is idle in the steady state.


Finally, putting everything together, we obtain the stationary distribution of the number of customers in the M/M/1 system:

πn = (1 − ρ)ρ^n  (n ≥ 0).

The expected number of customers in the system in the steady state follows directly:

N = lim t→∞ E N(t) = lim k→∞ E Nk = Σ_{n≥0} n πn = Σ_{n≥0} n(1 − ρ)ρ^n = (1 − ρ)ρ Σ_{n≥0} n ρ^{n−1}.

The sum on the right-hand side is readily identified as just the derivative of a geometric series:

Σ_{n≥0} n ρ^{n−1} = Σ_{n≥0} (d/dρ) ρ^n = (d/dρ) Σ_{n≥0} ρ^n = (d/dρ) [1/(1 − ρ)] = 1/(1 − ρ)²,

whence

N = ρ/(1 − ρ) = λ/(µ − λ).

Of course, we have seen this earlier when considering the throughput of a stop-and-wait ARQ protocol: this is just the mean of a geometric distribution with parameter ρ. Observe how, in accordance with intuition, the expected number of customers in the steady state grows without bound as the utilisation factor approaches one. The average time spent by each customer in the system in the steady state is now readily obtained via Little's theorem, applied to the entire M/M/1 system, as

T = N/λ = 1/(µ − λ).

Write WQ for the average time spent by each customer waiting in queue and adopt the nonce notation WS for the average service time. By linearity of


expectation, T = WQ + WS. Now recall that service times are exponentially distributed with parameter µ. Consequently, WS = 1/µ. It follows that

WQ = T − 1/µ = 1/(µ − λ) − 1/µ = ρ/(µ − λ).

Applying Little's theorem to the queue alone, we obtain the expression

NQ = λWQ = ρ²/(1 − ρ)

for the average population in the queue.

An evocative alternative expression relating the quantities N and NQ can be obtained directly by a consideration of the utilisation factor ρ. The probability that a customer has to wait in queue for service in the steady state is given by

PQ = 1 − π0 = ρ.

As expected, the queueing probability is given by the utilisation factor, the fraction of time the server is busy. Likewise, the probability that an arriving customer finds the server free to begin service is given by

PS = π0 = 1 − ρ,

i.e., PS is just the fraction of time that a server is not being utilised. It follows that the average number of customers waiting in the queue in steady state is given by

NQ = PQ N = PQ ρ/(1 − ρ) = ρ²/(1 − ρ),

while the average number of customers in service, or equivalently, the fraction of time that the server is busy, is given by

NS = PS N = PS ρ/(1 − ρ) = ρ,

as is to be expected.
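The formulas for T and N are straightforward to validate by simulation. The sketch below (Python; λ = 1 and µ = 2 are arbitrary stable choices, so ρ = 0.5) runs customers through Lindley's recursion for the waiting time in a single-server FIFO queue, compares the empirical mean sojourn time with 1/(µ − λ), and recovers the mean occupancy via Little's theorem. Lindley's recursion itself is a standard device not derived in these notes.

```python
import random

rng = random.Random(2024)
lam, mu = 1.0, 2.0          # arrival and service rates (assumed values); rho = 0.5
rho = lam / mu

n = 200000
wait = 0.0                  # waiting time in queue of the current customer
prev_service = 0.0
total_sojourn = 0.0
for _ in range(n):
    inter = rng.expovariate(lam)        # interarrival gap to this customer
    # Lindley's recursion: this customer waits for the residue of the
    # previous customer's wait plus service, less the interarrival gap.
    wait = max(0.0, wait + prev_service - inter)
    service = rng.expovariate(mu)
    total_sojourn += wait + service     # sojourn time = wait + service
    prev_service = service

T_sim = total_sojourn / n               # empirical mean delay per customer
N_sim = lam * T_sim                     # Little's theorem: N = lam * T

# Theory: T = 1/(mu - lam) = 1 second, N = rho/(1 - rho) = 1 customer.
print(round(T_sim, 2), round(N_sim, 2))
```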

Alternatively, we can derive the expression for NQ directly from the stationary probabilities. Observe that when there are n customers in the system, then precisely one is in service, with the remaining n − 1 customers
