
Queueing Networks and Markov Chains, Part 8


DOCUMENT INFORMATION

Title: Algorithms for Product-Form Networks
Authors: Gunter Bolch, Stefan Greiner, Hermann de Meer, Kishor S. Trivedi
Publisher: John Wiley & Sons, Inc.
Subject: Queueing Networks and Markov Chains
Document type: Reference book
Year of publication: 1998
Number of pages: 68
File size: 3.75 MB


Contents

According to the BCMP theorem, Eq. (7.80), the state probabilities of a closed single class product-form queueing network with N nodes can be expressed in the following way:

π(k_1, ..., k_N) = (1/G(K)) · ∏_{i=1}^{N} F_i(k_i),

where the functions F_i(k_i) are defined in Eq. (7.61). The direct computation of the normalization constant G(K) is very time consuming for networks with a large number of nodes and a large number of jobs, primarily because the formula makes a pass through all the states of the underlying CTMC. Therefore we need to develop efficient algorithms to reduce the computation time [Buze71].

Many efficient algorithms for calculating performance measures of closed product-form queueing networks have been developed. The most important ones are the convolution algorithm and the mean value analysis (MVA) [ReLa80]. The convolution algorithm is an efficient iterative technique for calculating the normalization constant, which is needed when all performance measures are computed using a set of simple equations. In contrast, the mean value analysis is an iterative technique where the mean values of the performance measures can be computed directly without computing the normalization constant. We also introduce the RECAL algorithm [CoGe86] (recursion by chain algorithm), which is well suited for networks with many job classes. The fourth and last method for analyzing product-form networks presented in detail is the so-called flow-equivalent server method [CHW75b, ABP85]. This method is well suited when we are especially interested in computing the performance of a single station or a part of the network.

There are several other algorithms that we do not cover in detail due to space limitations. Based on the mean value analysis, [ChSa80] and [SaCh81] developed an algorithm called LBANC (local balance algorithm for normalizing constants). This algorithm iteratively computes the normalization constant and the performance measures. It is very well suited for networks with a small number of nodes but a large number of jobs. The CCNC algorithm (coalesce computation of normalizing constants) [ChSa80] is especially used when storage space is limited. That the convolution algorithm, the mean value analysis, and LBANC can be derived from each other has been shown by [Lam81]. The DAC algorithm (distribution analysis by chain) [SoLa89] is used to directly compute the state probabilities of a product-form queueing network. A generalization of the MVA to higher moments is introduced in [Stre86]. Other algorithms for determining the normalization constant are the algorithms of [Koba78] and [Koba79] (which are based on the enumeration of polynomials of Polya), and the partial fraction method of [Moor72].

For very large networks, the execution time of the preceding algorithms is not acceptable. Many approximation algorithms have been developed for this purpose. Exact and approximate algorithms are discussed in this chapter and in Chapter 9 and are summarized in Fig. 8.1.

Fig. 8.1 Flowchart describing sections of Chapters 8 and 9

8.1 THE CONVOLUTION ALGORITHM

The convolution algorithm was one of the first efficient algorithms for analyzing closed product-form queueing networks and is still in use today. The name of this technique reflects the method of determining the normalization constant G(K) from the functions F_i(k_i), which is similar to the convolution of two probability mass functions. Once the normalization constant is computed, the system performance measures of interest can be easily derived [Buze71]. We recall that the convolution operator ⊛ is defined in the following way: Let A, B, C be vectors of length K + 1. Then the convolution C = A ⊛ B is given by:

C(k) = Σ_{j=0}^{k} A(j) · B(k - j),  k = 0, ..., K.
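To make the operator concrete, here is a minimal sketch in Python; the vectors A and B are represented as lists of length K + 1, and the function name is illustrative rather than taken from the book:

```python
def convolve(A, B):
    """Convolution C = A (x) B of two vectors of length K + 1:
    C(k) = sum_{j=0..k} A(j) * B(k - j)."""
    K = len(A) - 1
    return [sum(A[j] * B[k - j] for j in range(k + 1)) for k in range(K + 1)]
```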

According to the BCMP theorem, Eq. (7.80), the state probabilities of a closed single class product-form queueing network with N nodes can be expressed in the following way:

π(k_1, ..., k_N) = (1/G(K)) · ∏_{i=1}^{N} F_i(k_i),

where the functions F_i(k_i) are defined in Eq. (7.61), and:

G(K) = Σ_{k_1+...+k_N = K} ∏_{i=1}^{N} F_i(k_i)

is the normalization constant of the network.

The computation of G(K) is carried out by iterating over the number of nodes in the network and over the number of possible jobs at each node. For this purpose the following auxiliary functions G_n(k), n = 1, ..., N and k = 0, ..., K, are defined:

G_n(k) = Σ_{k_1+...+k_n = k} ∏_{i=1}^{n} F_i(k_i).   (8.2)

The desired normalization constant is then:

G(K) = G_N(K).

From Eq. (8.2), for n > 1 it follows that:

G_n(k) = Σ_{j=0}^{k} F_n(j) · G_{n-1}(k - j).   (8.4)

For n = 1 we have:

G_1(k) = F_1(k),  k = 0, ..., K.   (8.5)

The initial condition is of the form:

G_n(0) = 1,  n = 1, ..., N.   (8.6)

The convolution method for computing the normalization constant G(K) is fully determined by Eqs. (8.4) to (8.6). The (K + 1)-dimensional vectors:

G_n = (G_n(0), ..., G_n(K)),  F_n = (F_n(0), ..., F_n(K)),

are therefore determined by the convolution:

G_n = F_n ⊛ G_{n-1}.

The computation of G_n(k) is easily visualized as in Fig. 8.2. In this figure we show the values of the functions needed in the calculation of G_N(K). The real objective of the algorithm is to determine the last value in the last column because this value is the normalization constant G(K) that we are looking for. The values G_N(k) for k = 1, ..., K - 1 in the last column are also useful in determining the system performance measures. Buzen [Buze71, Buze73] has developed algorithms that are based on the normalization constant G(K) and the functions F_i(k_i) to compute all important performance measures of the queueing network without needing the underlying CTMC state probabilities. Computation of the performance measures using the normalization constant is now illustrated.
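As an illustration of the iteration just described, the following sketch computes the vectors G_n by repeated convolution, assuming load-independent single-server nodes, for which F_i(k) = (e_i/μ_i)^k. It reuses the convolve helper shown above; all names are illustrative:

```python
def normalization_constants(K, e, mu):
    """Return the vector (G_N(0), ..., G_N(K)) for a closed single class network
    with N = len(e) load-independent single-server nodes."""
    F = [[(e[i] / mu[i]) ** k for k in range(K + 1)] for i in range(len(e))]
    G = F[0]                       # Eq. (8.5): G_1(k) = F_1(k)
    for n in range(1, len(e)):
        G = convolve(F[n], G)      # G_n = F_n (x) G_{n-1}
    return G                       # G[K] is the normalization constant G(K)
```

For multiserver nodes, the functions F_i(k) of Eq. (7.61) would be tabulated and used in place of (e_i/μ_i)^k.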

(a) The marginal probability π_i(k) that there are exactly k jobs at node i is given by Eq. (8.7):

π_i(k) = F_i(k) · G_N^{(i)}(K - k) / G(K),   (8.7)

where G_N^{(i)} is the normalization constant of the network without node i. In [BBS77], a simple algorithm for computing the G_N^{(i)}(k) for k = 0, ..., K is presented. Since the sum of all marginal probabilities is 1, it follows from Eq. (8.7):

Σ_{j=0}^{K} π_i(j) = (1/G(K)) · Σ_{j=0}^{K} F_i(j) · G_N^{(i)}(K - j) = 1,

and therefore we have:

G(K) = Σ_{j=0}^{K} F_i(j) · G_N^{(i)}(K - j),  i = 1, ..., N.   (8.10)

With the help of these equations we can derive an iterative formula for computing the G_N^{(i)}(k), for k = 0, ..., K and i = 1, ..., N:

G_N^{(i)}(k) = G(k) - Σ_{j=1}^{k} F_i(j) · G_N^{(i)}(k - j),   (8.11)

with the initial condition:

G_N^{(i)}(0) = G(0) = 1.

In the case of m_i = 1 the preceding formulae can be considerably simplified. Because of Eq. (8.11) we have:

G_N^{(i)}(k) = G(k) - Σ_{j=0}^{k-1} F_i(j + 1) · G_N^{(i)}(k - 1 - j)
            = G(k) - (e_i/μ_i) · Σ_{j=0}^{k-1} F_i(j) · G_N^{(i)}(k - 1 - j)
            = G(k) - (e_i/μ_i) · G(k - 1),

where the last step uses Eq. (8.10). Inserting this into Eq. (8.7) gives the marginal probabilities of a single server node directly in terms of the normalization constants:

π_i(k) = (e_i/μ_i)^k · [G(K - k) - (e_i/μ_i) · G(K - k - 1)] / G(K),   (8.13)

with the convention G(-1) = 0.
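The recursion of Eq. (8.11) and the marginal probabilities of Eq. (8.7) translate directly into code. This sketch takes the vector G = (G(0), ..., G(K)) produced above and the vector F_i of one node as input; the function names are again illustrative:

```python
def g_without_node(G, Fi):
    """Eq. (8.11): normalization constants of the network without node i."""
    K = len(G) - 1
    Gi = [1.0] + [0.0] * K                     # initial condition G_N^(i)(0) = 1
    for k in range(1, K + 1):
        Gi[k] = G[k] - sum(Fi[j] * Gi[k - j] for j in range(1, k + 1))
    return Gi

def marginal_probabilities(G, Fi):
    """Eq. (8.7): pi_i(k) = F_i(k) * G_N^(i)(K - k) / G(K)."""
    K = len(G) - 1
    Gi = g_without_node(G, Fi)
    return [Fi[k] * Gi[K - k] / G[K] for k in range(K + 1)]
```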

(b) The throughput of node i in the load-dependent or load-independent case is given by the formula:

λ_i(K) = e_i · G(K - 1) / G(K).   (8.14)

Proof: By definition, the throughput is given by Eq. (7.22):

λ_i(K) = Σ_{k=1}^{K} μ_i(k) · π_i(k) = (1/G(K)) · Σ_{k=1}^{K} μ_i(k) · F_i(k) · G_N^{(i)}(K - k)
       = (e_i/G(K)) · Σ_{k=1}^{K} F_i(k - 1) · G_N^{(i)}(K - k)
       = (e_i/G(K)) · Σ_{k=0}^{K-1} F_i(k) · G_N^{(i)}(K - 1 - k) = e_i · G(K - 1)/G(K),  q.e.d.,

where we used μ_i(k) · F_i(k) = e_i · F_i(k - 1) and, in the last step, Eq. (8.10).

(c) The utilization of a node in the load-independent case can be determined by inserting Eq. (8.14) into the well known relation ρ_i = λ_i/(m_i μ_i):

ρ_i = (e_i / (m_i μ_i)) · G(K - 1) / G(K).   (8.15)
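Given the vector of normalization constants, the throughput, utilization, and (for a single server node) the mean number of jobs follow in a few lines; this is a sketch with illustrative names:

```python
def node_measures(G, e_i, mu_i, m_i=1):
    """Throughput (Eq. 8.14), utilization (Eq. 8.15) and, for m_i = 1,
    the mean number of jobs (Eq. 8.16) of a single node."""
    K = len(G) - 1
    lam_i = e_i * G[K - 1] / G[K]                      # Eq. (8.14)
    rho_i = e_i / (m_i * mu_i) * G[K - 1] / G[K]       # Eq. (8.15)
    kbar_i = None
    if m_i == 1:                                       # Eq. (8.16), single server only
        kbar_i = sum((e_i / mu_i) ** k * G[K - k] / G[K] for k in range(1, K + 1))
    return lam_i, rho_i, kbar_i
```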

Example 8.1 Consider the following closed queueing network (Fig. 8.3) with N = 3 nodes and K = 3 jobs. The first node has m_1 = 2 and the second node has m_2 = 3 identical service stations. For the third node we have m_3 = 1. The service time at each node is exponentially distributed with respective rates:

μ_1 = 0.8 sec^-1,  μ_2 = 0.6 sec^-1,  μ_3 = 0.4 sec^-1.

Fig. 8.3 A closed queueing network

The visit ratios are given as follows:

Thus the normalization constant is G(K) = 4.465. The marginal probabilities for the single server node 3 can be computed using Eq. (8.13):

π_3(2) = (e_3/μ_3)^2 · (1/G(3)) · [G(1) - (e_3/μ_3) · G(0)] = 0.132.

For the computation of the marginal probabilities of nodes 1 and 2, we need the values G_N^{(i)}(k) for k = 0, 1, 2, 3 and i = 1, 2. With Eq. (8.11) we get:

G_N^{(1)}(0) = 1,
G_N^{(1)}(1) = G(1) - F_1(1) · G_N^{(1)}(0) = 1.611,
G_N^{(1)}(2) = G(2) - (F_1(1) · G_N^{(1)}(1) + F_1(2) · G_N^{(1)}(0)) = 1.423,
G_N^{(1)}(3) = G(3) - (F_1(1) · G_N^{(1)}(2) + F_1(2) · G_N^{(1)}(1) + F_1(3) · G_N^{(1)}(0)) = 0.940.

In the same way we compute:

G_N^{(2)}(0) = 1,  G_N^{(2)}(1) = 1.656,  G_N^{(2)}(2) = 1.75,  G_N^{(2)}(3) = 1.316.

With Eq. (8.7) the marginal probabilities are:

π_1(0) = F_1(0) · G_N^{(1)}(3)/G(3) = 0.211,  π_1(1) = F_1(1) · G_N^{(1)}(2)/G(3) = 0.398,
π_1(2) = F_1(2) · G_N^{(1)}(1)/G(3) = 0.282,  π_1(3) = F_1(3) · G_N^{(1)}(0)/G(3) = 0.109,

and:

π_2(0) = 0.295,  π_2(1) = 0.412,  π_2(2) = 0.242,  π_2(3) = 0.051.

The throughputs can be computed using Eq. (8.14):

λ_1 = e_1 · G(2)/G(3) = 0.945,  λ_2 = e_2 · G(2)/G(3) = 0.630,  λ_3 = e_3 · G(2)/G(3) = 0.189.

The utilizations are given by Eq. (7.21):

ρ_1 = λ_1/(m_1 μ_1) = 0.590,  ρ_2 = λ_2/(m_2 μ_2) = 0.350,  ρ_3 = λ_3/μ_3 = 0.473.

The mean number of jobs at the multiserver nodes is given by Eq. (7.26):

K̄_1 = π_1(1) + 2π_1(2) + 3π_1(3) = 1.290,  K̄_2 = π_2(1) + 2π_2(2) + 3π_2(3) = 1.050,

where we use Eq. (8.16) for the single server node 3:

K̄_3 = (e_3/μ_3) · G(2)/G(3) + (e_3/μ_3)^2 · G(1)/G(3) + (e_3/μ_3)^3 · G(0)/G(3) = 0.660.

For the mean response time we use Eq. (7.43), T̄_i = K̄_i/λ_i:

T̄_1 = K̄_1/λ_1 = 1.366,  T̄_2 = K̄_2/λ_2 = 1.667,  T̄_3 = K̄_3/λ_3 = 3.496.

The convolution method introduced in Section 8.1.1 can be extended to the case of multiclass closed product-form queueing networks. According to the BCMP theorem, Eq. (7.80), the state probabilities for closed product-form queueing networks are given by:

π(S_1, ..., S_N) = (1/G(K)) · ∏_{i=1}^{N} F_i(S_i),

where the functions F_i(S_i) are defined in Eq. (7.82). The determination of the normalization constant of the network:

G(K) = Σ_{S_1+...+S_N = K} ∏_{i=1}^{N} F_i(S_i),

is analogous to the single class case. For now, we assume that class switching is not allowed and thus the number of jobs in each class is constant. Corresponding to Eq. (8.2), we define for k = 0, ..., K the auxiliary functions:

G_n(k) = Σ_{S_1+...+S_n = k} ∏_{i=1}^{n} F_i(S_i),   (8.18)

and determine for n = 1, ..., N the array G_n by convolution, G_n = F_n ⊛ G_{n-1}, that is:

G_n(k) = Σ_{0 ≤ j ≤ k} F_n(j) · G_{n-1}(k - j),   (8.21)

with the initial condition G_1(·) = F_1(·). For n > 1, the argument is the partial population vector

k^{(n)} = Σ_{i=1}^{n} S_i = (k_1^{(n)}, ..., k_R^{(n)}),

where k_r^{(n)}, 1 ≤ r ≤ R, is the overall number of jobs of class r at the nodes 1, 2, ..., n - 1, n. We then have:

k_r^{(n)} = Σ_{i=1}^{n} k_{ir}  and  k^{(N)} = K.

Equation (8.21) together with the initial condition G_1(·) = F_1(·) completes the description of the algorithm of [MuWo74b] and [Wong75]. For the normalization constant we have:

G(K) = G_N(K).
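For the multiclass case the convolution runs over population vectors instead of scalars. The sketch below represents each F_n and G_n as a dictionary mapping population tuples (k_1, ..., k_R) to values and assumes the functions F_n(k) of Eq. (7.82) are already tabulated; the names are illustrative:

```python
from itertools import product

def populations(Kvec):
    """All population vectors j with 0 <= j_r <= K_r, componentwise."""
    return list(product(*(range(k + 1) for k in Kvec)))

def convolve_multiclass(Fn, Gprev, Kvec):
    """Eq. (8.21): G_n(k) = sum_{0 <= j <= k} F_n(j) * G_{n-1}(k - j)."""
    G = {}
    for k in populations(Kvec):
        G[k] = sum(Fn[j] * Gprev[tuple(ki - ji for ki, ji in zip(k, j))]
                   for j in populations(k))
    return G

def multiclass_normalization_constant(Flist, Kvec):
    G = dict(Flist[0])                       # G_1(.) = F_1(.)
    for Fn in Flist[1:]:
        G = convolve_multiclass(Fn, G, Kvec)
    return G[tuple(Kvec)]                    # G(K)
```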

The computation of the performance measures is as follows:

(a) According to Eq. (7.32), the marginal probability that there are exactly S_i = k jobs at node i is given by:

π_i(k) = Σ_{S: S_i = k} π(S_1, ..., S_N) = (F_i(k)/G(K)) · Σ_{S: S_i = k} ∏_{j≠i} F_j(S_j).

Then G_N^{(i)}(K - k) can again be interpreted as the normalization constant of the network without node i:

G_N^{(i)}(k) = Σ_{Σ_{j≠i} S_j = k} ∏_{j≠i} F_j(S_j),   (8.24)

so that π_i(k) = F_i(k) · G_N^{(i)}(K - k) / G(K).

In the same way as in the single class case (Eq. (8.11)), the iterative formula for computing the normalization constant G_N^{(i)}(k) is given as follows:

G_N^{(i)}(k) = G(k) - Σ_{0 ≤ j ≤ k} δ(j) · F_i(j) · G_N^{(i)}(k - j),   (8.25)

with δ(j) defined by:

δ(j) = 0 if j = 0, and δ(j) = 1 otherwise,   (8.26)

affecting the computation only for j = 0. The initial condition is:

G_N^{(i)}(0) = G(0) = 1,  i = 1, ..., N.   (8.27)

(b) The throughput λ_{ir} of class-r jobs at the ith node can be expressed as follows [Wong75]:

λ_{ir}(K) = e_{ir} · G(K_1, ..., K_r - 1, ..., K_R) / G(K),   (8.28)

where e_{ir} is the solution of Eq. (7.72). With the shorthand (K - 1_r) = (K_1, ..., K_r - 1, ..., K_R), this takes the simpler form:

λ_{ir}(K) = e_{ir} · G(K - 1_r) / G(K).   (8.29)

Fig. 8.4 An example of a two-class closed network

The visit ratios are given by:

e_11 = 1,  e_21 = 0.4,  e_31 = 0.4,  e_41 = 0.2,
e_12 = 1,  e_22 = 0.4,  e_32 = 0.3,  e_42 = 0.3.

First, with the help of Eq. (7.82), the functions F_i(S_i), i = 1, 2, 3, 4, are computed (Table 8.2). Then, for determining the normalization constant, we compute the G_n(k^{(n)}) from Eq. (8.21), where G_1(·) = F_1(·). For n = 2 we have:

G_2(1,0) = G_1(0,0) F_2(1,0) + G_1(1,0) F_2(0,0),
G_2(0,1) = G_1(0,0) F_2(0,1) + G_1(0,1) F_2(0,0) = 4,
G_2(1,1) = G_1(0,0) F_2(1,1) + G_1(1,1) F_2(0,0) + G_1(1,0) F_2(0,1) + G_1(0,1) F_2(1,0) = 15.6,
G_2(0,2) = G_1(0,0) F_2(0,2) + G_1(0,1) F_2(0,1) + G_1(0,2) F_2(0,0),
G_2(1,2) = G_1(0,0) F_2(1,2) + G_1(1,2) F_2(0,0) + G_1(1,1) F_2(0,1) + G_1(0,1) F_2(1,1) + G_1(1,0) F_2(0,2) + G_1(0,2) F_2(1,0) = 62.4.

In the same way we can compute the values for G_3(k^{(3)}) and G_4(k^{(4)}), as summarized in Table 8.3. The normalization constant is then G(K) = 854.424.

Table 8.3 Computation of the G_n(k^{(n)})

Now, with the help of Eq. (7.47), we can compute the marginal probability of 0 jobs of class 1 and 2 jobs of class 2 at node 4:

π_4(0,2) = F_4(0,2) · G_N^{(4)}(K - (0,2)) / G(K) = 0.0944.

The throughputs at each node by class type can be computed with Eq. (8.28). For the computation of the utilizations of each node by class type, Eq. (8.29) is used, which gives:

ρ_11 = 0.0914,  ρ_21 = 0.1463,  ρ_31 = 0.2926,  ρ_12 = 0.2611,  ρ_22 = 0.2611,  ρ_32 = 0.3917.

The algorithm we presented in the preceding text is applicable only to networks without class switching. For networks with class switching, [Munt72] proved that a closed queueing network with U ergodic chains is equivalent to a closed network with U job classes without class switching. Therefore, the computation of the normalization constant for networks with class switching is an extension of the technique for networks without class switching. More details are given in [BBS77] and [BrBa80] and Section 7.3.6. In [ReKo75] and [Saue83] the convolution method has been extended for analyzing open queueing networks with load-dependent service rates and for analyzing closed queueing networks with load-dependent routing probabilities. The methods can also be extended to networks with class specific service rates; these are networks in which the service rates depend not only on the overall number of jobs at the node, but also on the number of jobs of each class at the node. For this case, [LaLi83] modified the convolution algorithm so as to make the best use of storage space; their algorithm is known as tree-convolution. Because the computation of the normalization constant can cause numerical problems, other techniques were developed that allow the calculation of the performance measures without using the normalization constant. One of the key developments in this regard is the mean value analysis (MVA), which we discuss next.

8.2 THE MEAN VALUE ANALYSIS

The MVA was developed by Reiser and Lavenberg [ReLa80] for the analysis of closed queueing networks with product-form solution. The advantage of this method is that the performance measures can be computed without explicitly computing the normalization constant. The method is based on two fundamental equations and it allows us to compute the mean values of measures of interest such as the mean waiting time, throughput, and the mean number of jobs at each node. In these computations only mean values are computed (hence the name). For the case of multiserver nodes (m_i > 1), it is necessary, however, to compute the marginal probabilities.

The MVA method is based on two simple laws:

1. Little's theorem, which is introduced in Eq. (6.9) to express a relation between the mean number of jobs, the throughput, and the mean response time of a node or the overall system.

2. The theorem of the distribution at arrival time (in short, arrival theorem), proven by [LaRe80] and [SeMi81] for all networks that have a product-form solution. The arrival theorem says that in a closed product-form queueing network, the pmf of the number of jobs seen at the time of arrival to a node i when there are k jobs in the network is equal to the pmf of the number of jobs at this node with one less job in the network (= k - 1). This property has an intuitive justification [LZGS84]: at the moment a job arrives at a node, it is certain that this job itself is not already in the queue of this node. Thus, there are only k - 1 other jobs that could possibly interfere with the new arrival, and the number of these at the node is simply the number there when only those (k - 1) jobs are in the network.

First we introduce the MVA for single class closed queueing networks and explain it in more detail. This algorithm is then extended to multiclass networks, mixed networks, and networks with load-dependent service rates.

8.2.1 Single Class Closed Networks

The fundamental equation of the mean value analysis is based on the arrival theorem for closed product-form networks [ReLa80, SeMi81] and it relates the mean response time of a job at the ith node to the mean number of jobs at that node with one job less in the network, that is:

T̄_i(K) = (1/μ_i) · [1 + K̄_i(K - 1)],  i = 1, ..., N.   (8.31)

For single server stations (m_i = 1) with an FCFS strategy, it is easy to give an intuitive explanation for Eq. (8.31), because for each FCFS node i the mean response time T̄_i(K) of a job in a network with K jobs is given by the mean service time (1/μ_i) of that job plus the sum of the mean service times of all jobs that are ahead of this job in the queue. Equation (8.31) can also be derived without using the arrival theorem. For this purpose, we use the formulae for computing the utilization (Eq. (8.15)) and the mean number of jobs (Eq. (8.16)):

ρ_i(K) = (e_i/μ_i) · G(K - 1)/G(K),

and:

K̄_i(K) = Σ_{k=1}^{K} (e_i/μ_i)^k · G(K - k)/G(K).   (8.16)

From Eq. (8.16) it follows:

K̄_i(K - 1) = Σ_{k=1}^{K-1} (e_i/μ_i)^k · G(K - 1 - k)/G(K - 1).

If we solve Eq. (8.15) for G(K - 1) and insert the result into the preceding equation, then, after rearranging, we get:

ρ_i(K) · K̄_i(K - 1) = Σ_{k=2}^{K} (e_i/μ_i)^k · G(K - k)/G(K),

and substituting this into Eq. (8.16), we have:

K̄_i(K) = (e_i/μ_i) · G(K - 1)/G(K) + ρ_i(K) · K̄_i(K - 1)
       = ρ_i(K) + ρ_i(K) · K̄_i(K - 1)
       = ρ_i(K) · [1 + K̄_i(K - 1)].

If we assume constant service rates and m_i = 1, then we get the desired result, Eq. (8.31), by using ρ_i(K) = λ_i(K)/μ_i and Little's theorem. Little's theorem, applied to the network as a whole, also yields the second fundamental equation, which determines the overall throughput:

λ(K) = K / Σ_{i=1}^{N} e_i · T̄_i(K),   (8.32)

whereas the other one determines the mean number of jobs at the ith node:

K̄_i(K) = λ(K) · e_i · T̄_i(K),   (8.33)

where e_i is the visit ratio at the ith node.

The three equations, (8.31), (8.32), and (8.33), allow an iterative computation of the mean response time, mean number of jobs, and throughput of the closed product-form queueing network. The iteration is done over the number of jobs k in the network. Equation (8.31) is valid for FCFS single server nodes, PS nodes, and LCFS-PR nodes. The description of the MVA is complete if we extend Eq. (8.31) to the case of IS nodes and FCFS nodes with multiple servers. In the case of IS nodes we have:

T̄_i(K) = 1/μ_i.   (8.34)

For the latter case, consider a job that arrives at a -/M/m-FCFS node containing j - 1 jobs, given that the network population is k - 1. This event occurs with probability π_i(j - 1 | k - 1). Then we have:

T̄_i(k) = (1/(m_i μ_i)) · [1 + K̄_i(k - 1) + Σ_{j=0}^{m_i-2} (m_i - j - 1) · π_i(j | k - 1)].   (8.35)

To obtain an expression for the probability π_i(j | k), we use the formulae presented in the convolution algorithm for computing the performance measures with the help of the normalization constant, namely:

π_i(j | k) = F_i(j) · G_N^{(i)}(k - j) / G(k)   (see Eq. (8.7)).

From this, the following recursion for the conditional probabilities can be derived:

π_i(j | k) = (e_i · λ(k) / (j · μ_i)) · π_i(j - 1 | k - 1),  j = 1, ..., m_i - 1,   (8.36)

π_i(0 | k) = 1 - (1/m_i) · [ (e_i/μ_i) · λ(k) + Σ_{j=1}^{m_i-1} (m_i - j) · π_i(j | k) ].   (8.37)

Initialization: For i = 1, ..., N and j = 1, ..., (m_i - 1):

K̄_i(0) = 0,  π_i(0 | 0) = 1,  π_i(j | 0) = 0.

Iteration over the number of jobs k = 1, ..., K:

STEP 2.1 For i = 1, ..., N, compute the mean response time of a job at the ith node:

T̄_i(k) = (1/μ_i) · [1 + K̄_i(k - 1)]   (Type-1,2,4, m_i = 1),
T̄_i(k) = (1/(m_i μ_i)) · [1 + K̄_i(k - 1) + Σ_{j=0}^{m_i-2} (m_i - j - 1) · π_i(j | k - 1)]   (m_i > 1),
T̄_i(k) = 1/μ_i   (Type-3),   (8.38)

where the conditional probabilities are computed using Eqs. (8.37) and (8.36).

STEP 2.2 Compute the overall throughput:

λ(k) = k / Σ_{i=1}^{N} e_i · T̄_i(k),   (8.39)

where the e_i can be determined with Eq. (7.5). The throughputs of the individual nodes are then given by λ_i(k) = λ(k) · e_i (Eq. (8.40)).

STEP 2.3 For i = 1, ..., N, compute the mean number of jobs at the ith node:

K̄_i(k) = λ(k) · e_i · T̄_i(k).   (8.41)

The other performance measures, e.g., utilization, mean waiting time, mean queue length, etc., can be derived from the calculated measures using the well-known equations.
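The complete iteration can be sketched compactly. The code below implements Steps 2.1 to 2.3 together with the marginal-probability recursion for multiserver nodes and the normalization-constant extension of Eq. (8.42) discussed next; it assumes exponential, load-independent service and uses illustrative names ('queue' for Type-1,2,4 nodes, 'is' for Type-3 nodes):

```python
def mva_single_class(K, e, mu, m, kind):
    """Single class MVA.  K: number of jobs; e[i]: visit ratios; mu[i]: service
    rates; m[i]: number of servers; kind[i]: 'queue' (Type-1,2,4) or 'is' (Type-3)."""
    N = len(e)
    Kbar = [0.0] * N                                     # mean number of jobs for k = 0
    pi = [[1.0] + [0.0] * (m[i] - 1) for i in range(N)]  # pi_i(j | 0), j = 0..m_i-1
    T = [0.0] * N
    lam, G = 0.0, 1.0                                    # G(0) = 1
    for k in range(1, K + 1):
        for i in range(N):                               # Step 2.1: response times, Eq. (8.38)
            if kind[i] == 'is':
                T[i] = 1.0 / mu[i]
            elif m[i] == 1:
                T[i] = (1.0 + Kbar[i]) / mu[i]
            else:
                corr = sum((m[i] - j - 1) * pi[i][j] for j in range(m[i] - 1))
                T[i] = (1.0 + Kbar[i] + corr) / (m[i] * mu[i])
        lam = k / sum(e[i] * T[i] for i in range(N))     # Step 2.2: throughput, Eq. (8.39)
        G = G / lam                                      # normalization constant, Eq. (8.42)
        new_pi = []
        for i in range(N):                               # marginals, Eqs. (8.36)/(8.37)
            if kind[i] == 'is' or m[i] == 1:
                new_pi.append(pi[i])
                continue
            p = [0.0] * m[i]
            for j in range(1, m[i]):
                p[j] = (e[i] * lam / (j * mu[i])) * pi[i][j - 1]
            p[0] = 1.0 - (e[i] * lam / mu[i]
                          + sum((m[i] - j) * p[j] for j in range(1, m[i]))) / m[i]
            new_pi.append(p)
        pi = new_pi
        Kbar = [lam * e[i] * T[i] for i in range(N)]     # Step 2.3: mean jobs, Eq. (8.41)
    return {'T': T, 'lambda': lam, 'lambda_i': [lam * ei for ei in e],  # Eq. (8.40)
            'Kbar': Kbar, 'G': G}
```

Example 8.4 below exercises the multiserver branch (node 1 with m_1 = 2).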

The disadvantage of the MVA is its extremely high memory requirement. The memory requirement can be considerably reduced by using approximation techniques (e.g., SCAT, the self-correcting approximation technique), discussed in Chapter 9. Another disadvantage of the MVA is the fact that it is not possible to compute state probabilities. In [AkBo83], the MVA has been extended for computing the normalization constant and for computing the state probabilities. Only Step 2.2 needs to be extended to include the equation:

G(k) = G(k - 1) / λ(k),   (8.42)

with the initial condition G(0) = 1. This follows immediately from Eq. (8.14). When the iteration stops, we have the normalization constant G(K) that can be used to compute the state probabilities with the help of the BCMP theorem, Eq. (7.80). Two applications of the mean value analysis are now given.

Example 8.3 The central-server model shown in Fig. 8.5 has N = 4 nodes and K = 6 jobs. The service time of a job at the ith node, i = 1, 2, 3, 4, is exponentially distributed with the following mean values:

1/μ_1 = 0.02 sec,  1/μ_2 = 0.2 sec,  1/μ_3 = 0.4 sec,  1/μ_4 = 0.6 sec.

Fig. 8.5 The central-server model

The visit ratios are given as follows:

Iteration over the number of jobs in the network, starting with k = 1:

STEP 2.1 Mean response times, Eq. (8.38):

T̄_1(1) = (1/μ_1) · [1 + K̄_1(0)] = 0.02,  T̄_2(1) = (1/μ_2) · [1 + K̄_2(0)] = 0.2,
T̄_3(1) = (1/μ_3) · [1 + K̄_3(0)] = 0.4,  T̄_4(1) = (1/μ_4) · [1 + K̄_4(0)] = 0.6.

STEP 2.2 Throughput, Eq. (8.39), and normalization constant, Eq. (8.42):

λ(1) = 1 / Σ_{i=1}^{4} e_i · T̄_i(1) = 4.167,  G(1) = G(0)/λ(1) = 0.24.

Table 8.4 Performance measures after completing six iterations

STEP 2.3 Mean number of jobs, Eq. (8.41):

K̄_1(1) = λ(1) · T̄_1(1) · e_1 = 0.083,  K̄_2(1) = λ(1) · T̄_2(1) · e_2 = 0.333,
K̄_3(1) = λ(1) · T̄_3(1) · e_3 = 0.333,  K̄_4(1) = λ(1) · T̄_4(1) · e_4 = 0.25.

For the iteration step k = 2, STEP 2.3 gives the mean number of jobs:

K̄_1(2) = λ(2) · T̄_1(2) · e_1 = 0.140,  K̄_2(2) = λ(2) · T̄_2(2) · e_2 = 0.688,
K̄_3(2) = λ(2) · T̄_3(2) · e_3 = 0.688,  K̄_4(2) = λ(2) · T̄_4(2) · e_4 = 0.484.

After six iteration steps, the iteration stops and we get the final results as summarized in Table 8.4. The normalization constant of the network is G(K) = 5.756 · 10^-6. With Eq. (7.80) we can compute the steady-state probability that the network is, for example, in the state (3, 1, 1, 1):

π(3,1,1,1) = (1/G(K)) · F_1(3) · F_2(1) · F_3(1) · F_4(1) = 5.337 · 10^-4.
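As a check, the MVA sketch above can be run on this central-server model. The visit ratios are not visible in this excerpt, so the values below are assumptions chosen to be consistent with the iteration values reported above (e.g. λ(1) = 4.167 and G(1) = 0.24); with them, the run should reproduce the reported normalization constant G(K) of about 5.756 · 10^-6 up to rounding:

```python
# Assumed visit ratios (not shown in this excerpt of Example 8.3).
e    = [1.0, 0.4, 0.2, 0.1]
mu   = [1 / 0.02, 1 / 0.2, 1 / 0.4, 1 / 0.6]
m    = [1, 1, 1, 1]
kind = ['queue'] * 4

res = mva_single_class(6, e, mu, m, kind)
print(res['lambda'], res['G'], sum(res['Kbar']))   # G should come out near 5.756e-6
```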

Fig. 8.6 A closed queueing network

Example 8.4 As another example, consider the closed queueing network given in Fig. 8.6 with K = 3 jobs. At the first node we have m_1 = 2 identical processors with exponentially distributed service times with mean 1/μ_1 = 0.5 sec. Node 2 and node 3 have exponentially distributed service times with means 1/μ_2 = 0.6 sec and 1/μ_3 = 0.8 sec, respectively. At the fourth node (terminals), the mean service time is 1/μ_4 = 1 sec. The routing probabilities are as follows:

Initialization:

K̄_1(0) = K̄_2(0) = K̄_3(0) = 0,  π_1(0 | 0) = 1,  π_1(1 | 0) = 0.

Iteration over the number of jobs in the network, starting with k = 1:

STEP 2.1 Mean response times:

T̄_1(1) = (1/(2μ_1)) · [1 + K̄_1(0) + π_1(0 | 0)] = 0.5,  T̄_2(1) = (1/μ_2) · [1 + K̄_2(0)] = 0.6,
T̄_3(1) = (1/μ_3) · [1 + K̄_3(0)] = 0.8,  T̄_4(1) = 1/μ_4 = 1.

STEP 2.3 Mean number of jobs:

K̄_1(1) = λ(1) · T̄_1(1) · e_1 = 0.227,  K̄_2(1) = λ(1) · T̄_2(1) · e_2 = 0.136,
K̄_3(1) = λ(1) · T̄_3(1) · e_3 = 0.182,  K̄_4(1) = λ(1) · T̄_4(1) · e_4 = 0.454.

Iteration for k = 2: for the conditional probabilities of the multiserver node 1 we obtain, with Eqs. (8.36) and (8.37):

π_1(1 | 2) = (e_1 · λ(2)/μ_1) · π_1(0 | 1) = 0.334  and  π_1(0 | 2) = 0.617.

Iteration for k = 3:

STEP 2.1 Mean response times:

T̄_2(3) = (1/μ_2) · [1 + K̄_2(2)] = 0.776,  T̄_3(3) = (1/μ_3) · [1 + K̄_3(2)] = 1.127,  T̄_4(3) = 1/μ_4 = 1.

STEP 2.3 Mean number of jobs:

K̄_1(3) = λ(3) · T̄_1(3) · e_1 = 0.624,  K̄_2(3) = λ(3) · T̄_2(3) · e_2 = 0.473,
K̄_3(3) = λ(3) · T̄_3(3) · e_3 = 0.686,  K̄_4(3) = λ(3) · T̄_4(3) · e_4 = 1.217.

The throughput at each node can be computed with Eq. (8.40):

λ_1 = λ(3) · e_1 = 1.218,  λ_2 = λ(3) · e_2 = 0.609,  λ_3 = λ(3) · e_3 = 0.609,  λ_4 = λ(3) · e_4 = 1.218.

For determining the utilization of each node, we use Eq. (7.21):

ρ_1 = λ_1/(m_1 μ_1) = 0.305,  ρ_2 = λ_2/μ_2 = 0.365,  ρ_3 = λ_3/μ_3 = 0.487.

The algorithm for computing the performance measures of single class closed queueing networks can easily be extended to the multiclass case in the follow- ing way [ReLa80] :

Initialization: For i = 1, ..., N, r = 1, ..., R, j = 1, ..., (m_i - 1):

K̄_{ir}(0, 0, ..., 0) = 0,  π_i(0 | 0) = 1,  π_i(j | 0) = 0.

Iteration: k = 0, ..., K:

STEP 2.1 For i = 1, ..., N and r = 1, ..., R, compute the mean response time of class-r jobs at the ith node:

T̄_{ir}(k) = (1/μ_{ir}) · [1 + Σ_{s=1}^{R} K̄_{is}(k - 1_r)]   (Type-1,2,4, m_i = 1),
T̄_{ir}(k) = (1/(m_i μ_{ir})) · [1 + Σ_{s=1}^{R} K̄_{is}(k - 1_r) + Σ_{j=0}^{m_i-2} (m_i - j - 1) · π_i(j | k - 1_r)]   (m_i > 1),
T̄_{ir}(k) = 1/μ_{ir}   (Type-3).   (8.43)

The conditional probabilities π_i(j | k) of the multiserver nodes are given for j = 1, ..., m_i - 1 analogously to the single class case, and for j = 0 by:

π_i(0 | k) = 1 - (1/m_i) · [ Σ_{r=1}^{R} (e_{ir}/μ_{ir}) · λ_r(k) + Σ_{j=1}^{m_i-1} (m_i - j) · π_i(j | k) ],

where the e_{ir} can be computed by Eq. (7.72).

STEP 2.2 For r = 1, ..., R, compute the throughput of class-r jobs:

λ_r(k) = k_r / Σ_{i=1}^{N} e_{ir} · T̄_{ir}(k).

STEP 2.3 For i = 1, ..., N and r = 1, ..., R, compute the mean number of class-r jobs at the ith node:

K̄_{ir}(k) = λ_r(k) · e_{ir} · T̄_{ir}(k).   (8.47)

With these extensions the MVA algorithm for multiclass closed product-form queueing networks is completely specified.

Akyildiz and Bolch [AkBo83] extended the MVA for computing the normalization constant and the state probabilities. For this purpose, Step 2.2 in the preceding algorithm is expanded to include the formula:

G(k) = G(k - 1_r) / λ_r(k),

with the initial condition G(0) = 1. After the iteration stops we get the normalization constant G(K) that can be used to compute the steady-state probabilities with the help of the BCMP theorem, Eq. (7.80).

Fig. 8.7 Sequence of intermediate values

To explain iteration Step 2, we use Fig. 8.7, where the sequence of intermediate values is shown for a network with R = 2 job classes. The first job class has K_1 = 2 jobs and the second class has K_2 = 3 jobs. The mean value algorithm starts with the trivial case where no job is in the system, that is, the population vector (0,0). From this, solutions for all population vectors that consist of exactly one job are computed; in our example these are the population vectors (1,0) and (0,1). Then the solutions for all population vectors with exactly two jobs are computed, and so on, until the final result for K = (2,3) is reached. In general, to compute the solution for a population vector k we need R intermediate solutions as input, namely the solutions for all population vectors k - 1_r, r = 1, ..., R.
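The population-vector iteration just described can be sketched as follows, assuming only single server (Type-1,2,4) and IS (Type-3) nodes with exponential service and no class switching; multiserver nodes would additionally require the multiclass marginal-probability recursion, which is omitted here, and all names are illustrative:

```python
from itertools import product

def mva_multiclass(Kvec, e, mu, is_node):
    """Kvec: class populations (K_1, ..., K_R); e[i][r], mu[i][r]: visit ratio and
    service rate of class r at node i; is_node[i]: True for Type-3 nodes."""
    N, R = len(e), len(Kvec)
    pops = sorted(product(*(range(k + 1) for k in Kvec)), key=sum)
    Kbar = {pops[0]: [[0.0] * R for _ in range(N)]}    # population (0, ..., 0)
    T = [[0.0] * R for _ in range(N)]
    lam = [0.0] * R
    for k in pops[1:]:                                 # in order of total population
        T = [[0.0] * R for _ in range(N)]
        lam = [0.0] * R
        for r in range(R):
            if k[r] == 0:
                continue
            k1 = tuple(kr - (1 if s == r else 0) for s, kr in enumerate(k))
            for i in range(N):                         # Step 2.1, Eq. (8.43)
                if is_node[i]:
                    T[i][r] = 1.0 / mu[i][r]
                else:
                    T[i][r] = (1.0 + sum(Kbar[k1][i])) / mu[i][r]
            # Step 2.2: class-r throughput at population k
            lam[r] = k[r] / sum(e[i][r] * T[i][r] for i in range(N))
        # Step 2.3, Eq. (8.47): mean number of class-r jobs at node i
        Kbar[k] = [[lam[r] * e[i][r] * T[i][r] for r in range(R)] for i in range(N)]
    return {'T': T, 'lambda': lam, 'Kbar': Kbar[pops[-1]]}
```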

Now, the algorithm is illustrated by means of an example:

Example 8.5 Consider the queueing network shown in Fig. 8.8 with N = 3 nodes and R = 2 job classes. Class 1 contains K_1 = 2 jobs and class 2 contains K_2 = 1 job. Class switching of the jobs is not allowed. The service time at the ith node, i = 1, 2, 3, is exponentially distributed with mean values:

Fig. 8.8 Another closed queueing network

The visit ratios are given as follows:

e_11 = 1,  e_21 = 0.6,  e_31 = 0.4,  e_12 = 1,  e_22 = 0.3,  e_32 = 0.7.

The queueing discipline at node 1 is FCFS and at node 2, processor sharing. The terminal station is modeled as an IS node. We analyze the network using the MVA in the following three steps:

Initialization:

K̄_{ir}(0,0) = 0 for i = 1, 2, 3 and r = 1, 2,  π_1(0 | 0,0) = 1,  π_1(1 | 0,0) = 0.

Iterate over the number of jobs in the network, beginning with the population vector k = (1,0):

STEP 2.1 Mean response times, Eq. (8.43):

T̄_11(1,0) = (1/(2μ_11)) · [1 + K̄_11(0,0) + K̄_12(0,0) + π_1(0 | 0,0)] = 0.2,
T̄_21(1,0) = (1/μ_21) · [1 + K̄_21(0,0) + K̄_22(0,0)] = 0.4.

STEP 2.3 Mean number of jobs, Eq. (8.47):

K̄_11(1,0) = λ_1(1,0) · T̄_11(1,0) · e_11 = 0.238,
K̄_21(1,0) = λ_1(1,0) · T̄_21(1,0) · e_21 = 0.286,  K̄_31(1,0) = λ_1(1,0) · T̄_31(1,0) · e_31 = 0.476.

Iteration for k = (0,1):

T̄_12(0,1) = (1/(2μ_12)) · [1 + K̄_11(0,0) + K̄_12(0,0) + π_1(0 | 0,0)] = 0.2.

Note that the class 1 throughput for this population vector is 0.

K̄_12(0,1) = λ_2(0,1) · T̄_12(0,1) · e_12 = 0.112,
K̄_22(0,1) = λ_2(0,1) · T̄_22(0,1) · e_22 = 0.101,  K̄_32(0,1) = λ_2(0,1) · T̄_32(0,1) · e_32 = 0.787.

Iteration for k = (1,1): for the multiserver node 1 we need the conditional probability

π_1(1 | 0,1) = (e_12 · λ_2(0,1)/μ_12) · π_1(0 | 0,0) = 0.112,

and the mean numbers of jobs become:

K̄_22(1,1) = λ_2(1,1) · T̄_22(1,1) · e_22 = 0.126,  K̄_32(1,1) = λ_2(1,1) · T̄_32(1,1) · e_32 = 0.764,
K̄_31(1,1) = λ_1(1,1) · T̄_31(1,1) · e_31.

Iteration for k = (2,0):

T̄_11(2,0) = (1/(2μ_11)) · [1 + K̄_11(1,0) + π_1(0 | 1,0)] = 0.2.

STEP 2.3 Mean number of jobs:

K̄_11(2,0) = λ_1(2,0) · T̄_11(2,0) · e_11 = 0.440,  K̄_21(2,0) = λ_1(2,0) · T̄_21(2,0) · e_21 = 0.679,
K̄_31(2,0) = λ_1(2,0) · T̄_31(2,0) · e_31 = 0.881.

Iteration for k = (2,1): we also need

π_1(1 | 1,1) = (e_11 · λ_1(1,1)/μ_11) · π_1(0 | 0,1) + (e_12 · λ_2(1,1)/μ_12) · π_1(0 | 1,0) = 0.289.

The mean response times are, for class 1:

T̄_21(2,1) = (1/μ_21) · [1 + K̄_21(1,1) + K̄_22(1,1)],

and for class 2:

T̄_12(2,1) = (1/(2μ_12)) · [1 + K̄_11(2,0) + π_1(0 | 2,0)] = 0.205,

of roundoff errors, and large storage requirements. The storage requirement is O(N · ∏_{r=1}^{R} (K_r + 1)). For comparison, the storage requirement of the convolution algorithm is O(∏_{r=1}^{R} (K_r + 1)). The time requirement in both cases is approximately the same: 2R(N - 1) · ∏_{r=1}^{R} (K_r + 1) [CoGe86]. Recall

we consider the arrival theorem for open queueing networks. This theorem says that the probability that a job entering node i will find the network in state (k_1, ..., k_i, ..., k_N) is equal to the steady-state probability of this state. This theorem is also called the PASTA theorem. Therefore we have for the mean response time of a job in an open network:

If we index the open job classes with op = 1, ..., OP and the closed job classes with cl = 1, ..., CL, then we have for the mean number of jobs in an open class r, taking into account Eq. (8.50):

where k is the population vector of the closed job classes. Equation (8.51) is valid for Type-1,2,4 single server nodes, while for Type-3 nodes we have:

With these formulae the MVA for mixed product-form queueing networks can be described completely. We assume for now that all arrival and service rates are load independent.

The algorithm is as follows:

Initialization: For all nodes i = 1, ..., N, compute the utilization of the open class jobs op = 1, ..., OP in the mixed network:

ρ_{i,op} = (1/μ_{i,op}) · λ_{0,op} · e_{i,op},   (8.55)

and check the ergodicity condition (ρ_{i,op} ≤ 1). Set K̄_{i,cl}(0) = 0 for all i = 1, ..., N and all closed classes cl = 1, ..., CL.

Construct a closed queueing network that contains only the jobs of the closed job classes and solve the model with the extended version of the MVA, which also considers the influence of the jobs of the open job classes. The results are the performance measures for the closed job classes of the mixed queueing network.
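A minimal sketch of these two steps, assuming Type-1,2,4 single server nodes: the open classes influence the closed classes only through their utilizations of Eq. (8.55), which inflate the closed-class response times by the factor 1/(1 - ρ_open). This scaling is a common way to realize the step described above; the chapter's exact equations (8.50) to (8.57) are not fully visible in this excerpt, so treat the details as assumptions:

```python
def open_class_utilizations(lam0, e_open, mu_open):
    """Eq. (8.55): rho_{i,op} = lambda_{0,op} * e_{i,op} / mu_{i,op}.
    Returns the total open-class utilization per node."""
    N, OP = len(e_open), len(lam0)
    rho = [sum(lam0[op] * e_open[i][op] / mu_open[i][op] for op in range(OP))
           for i in range(N)]
    assert all(r < 1.0 for r in rho), "open classes alone must not saturate a node"
    return rho

# In the multiclass MVA sketch above, the only change for Type-1,2,4 nodes is the
# closed-class response-time line, which becomes (assumption, cf. Eq. (8.51)):
#     T[i][r] = (1.0 + sum(Kbar[k1][i])) / (mu[i][r] * (1.0 - rho_open[i]))
# IS nodes keep T[i][r] = 1.0 / mu[i][r].
```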
