Handbook of Mathematics for Engineers and Scientists, part 159


For the integral (20.4.1.10) to exist, it is necessary and sufficient that the following limit exist:

$$\lim_{\lambda\to 0}\sum_{k=1}^{n}\sum_{l=1}^{n} B_{\xi\xi}(s_k,s_l)\,(t_k-t_{k-1})(t_l-t_{l-1}),$$

where λ = max_k (t_k − t_{k−1}).

In particular, the integral (20.4.1.10) exists if the following repeated integral exists:

$$\int_a^b\!\!\int_a^b B_{\xi\xi}(s,t)\,dt\,ds.$$
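The existence criterion above is just the statement that the double Riemann sums converge. A minimal numerical sketch, with the covariance B(s, t) = min(s, t) of the Wiener process chosen purely for illustration (its exact integral over [0, 1]² is 1/3):

```python
# Double Riemann sum for the repeated integral of a covariance
# function B(s, t) over [a, b] x [a, b], mirroring the sum in the
# existence criterion.  Illustration: B(s, t) = min(s, t), the
# Wiener-process covariance; the exact value over [0, 1]^2 is 1/3.

def double_integral(B, a, b, n):
    h = (b - a) / n
    total = 0.0
    for k in range(n):
        s = a + (k + 0.5) * h          # midpoint of the k-th s-cell
        for l in range(n):
            t = a + (l + 0.5) * h      # midpoint of the l-th t-cell
            total += B(s, t) * h * h   # B(s_k, s_l) (t_k - t_{k-1}) (t_l - t_{l-1})
    return total

approx = double_integral(min, 0.0, 1.0, 200)
print(approx)  # close to 1/3
```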

20.4.2 Models of Stochastic Processes

20.4.2-1. Stationary stochastic process.

A stochastic process ξ(t) is said to be stationary if its probability characteristics remain the same in the course of time, i.e., are invariant under the time shifts t → t + a, ξ(t) → ξ(t + a) for any given a (real or integer for a stochastic process with continuous or discrete time, respectively).

For a stationary process, the mean value (the expectation)

$$\mathrm{E}\{\xi(t)\} = \mathrm{E}\{\xi(0)\} = m$$

is a constant, and the correlation function is determined by the relation

$$\mathrm{E}\{\xi(t)\,\bar\xi(t+\tau)\} = B_{\xi\xi}(\tau), \qquad (20.4.2.1)$$

where \bar ξ(t) is the complex conjugate of the function ξ(t). The correlation function is positive definite:

$$\sum_{k=1}^{n}\sum_{j=1}^{n} c_k \bar c_j\, B_{\xi\xi}(t_k - t_j) = \mathrm{E}\Big\{\Big|\sum_{k=1}^{n} c_k\,\xi(t_k)\Big|^2\Big\} \ge 0.$$

In this case, the following relations hold:

$$B_{\xi\xi}(\tau) = \bar B_{\xi\xi}(-\tau), \qquad B_{\xi\zeta}(\tau) = \bar B_{\zeta\xi}(-\tau),$$
$$|B_{\xi\xi}(\tau)| \le B_{\xi\xi}(0), \qquad |B_{\xi\zeta}(\tau)|^2 \le B_{\xi\xi}(0)\,B_{\zeta\zeta}(0), \qquad (20.4.2.2)$$

where \bar B_ξξ(τ) and \bar B_ζξ(τ) are the complex conjugates of the functions B_ξξ(τ) and B_ζξ(τ), respectively.

Stochastic processes for which E{ξ(t)} and E{ξ(t) \bar ξ(t + τ)} are independent of t are called stationary stochastic processes in the wide sense. Stochastic processes all of whose characteristics remain the same in the course of time are called stationary stochastic processes in the narrow sense.

KHINCHIN’S THEOREM. The correlation function B_ξξ(τ) of a stationary stochastic process with continuous time can always be represented in the form

$$B_{\xi\xi}(\tau) = \int_{-\infty}^{\infty} e^{i\lambda\tau}\,dF(\lambda), \qquad (20.4.2.3)$$

where F(λ) is a monotone nondecreasing function and i is the imaginary unit, i² = −1.


If B_ξξ(τ) decreases sufficiently rapidly as |τ| → ∞ (as happens most often in applications provided that ξ(t) is understood as the difference ξ(t) − E{ξ(t)}, i.e., it is assumed that E{ξ(t)} = 0), then the integral on the right-hand side in (20.4.2.3) becomes the Fourier integral

$$B_{\xi\xi}(\tau) = \int_{-\infty}^{\infty} e^{i\lambda\tau} f(\lambda)\,d\lambda, \qquad (20.4.2.4)$$

where f(λ) = F′_λ(λ). The function F(λ) is called the spectral function of the stationary stochastic process, and the function f(λ) is called its spectral density. The process ξ(t) itself admits the spectral resolution

$$\xi(t) = \int_{-\infty}^{\infty} e^{i\lambda t}\,dZ(\lambda), \qquad (20.4.2.5)$$

where Z(λ) is a random function with uncorrelated increments (i.e., a function such that E{dZ(λ₁) \overline{dZ(λ₂)}} = 0 for λ₁ ≠ λ₂) satisfying the condition E{|dZ(λ)|²} = dF(λ), and the integral is understood as the mean-square limit of the corresponding sequence of integral sums.
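The spectral representation can be checked numerically on a concrete pair: the correlation function B(τ) = e^{−|τ|} has spectral density f(λ) = 1/(π(1 + λ²)), a standard Fourier pair not taken from the text. A sketch that recovers B from f by a truncated trapezoidal integral (the truncation length L and step count n are assumptions of the sketch):

```python
import math

# Numerical check of the spectral representation (20.4.2.3)-(20.4.2.4):
# for B(tau) = exp(-|tau|) the spectral density is
# f(lambda) = 1 / (pi * (1 + lambda^2)), so Fourier-integrating f
# should reproduce B.  The imaginary part cancels for an even
# density, so only the cosine term survives.

def spectral_to_correlation(f, tau, L=60.0, n=40000):
    h = 2.0 * L / n
    total = 0.0
    for k in range(n + 1):
        lam = -L + k * h
        w = 0.5 if k in (0, n) else 1.0   # trapezoidal end weights
        total += w * math.cos(lam * tau) * f(lam)
    return total * h

f = lambda lam: 1.0 / (math.pi * (1.0 + lam * lam))
for tau in (0.0, 0.5, 1.0, 2.0):
    print(tau, spectral_to_correlation(f, tau), math.exp(-abs(tau)))
```

The remaining mismatch comes from truncating the infinite integral at ±L; widening L shrinks it.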

20.4.2-2. Markov processes.

A stochastic process ξ(t) is called a Markov process if, for two arbitrary times t₀ and t₁, t₀ < t₁, the conditional distribution of ξ(t₁) given all values of ξ(t) for t ≤ t₀ depends only on ξ(t₀). This property is called the Markov property, or the absence of aftereffect.

The probability of a transition from state i to state j in time t is called the transition probability p_ij(t) (t ≥ 0). The transition probability satisfies the relation

$$p_{ij}(t) = P[\xi(t) = j \mid \xi(0) = i]. \qquad (20.4.2.6)$$

Suppose that the initial probability distribution

$$p^0_i = P\{\xi(0) = i\}, \qquad i = 0, 1, 2, \dots,$$

is given. In this case, the joint probability distribution of the random variables ξ(t₁), …, ξ(t_n) for any 0 = t₀ < t₁ < ⋯ < t_n is given by

$$P\{\xi(t_1) = j_1, \dots, \xi(t_n) = j_n\} = \sum_i p^0_i\, p_{ij_1}(t_1 - t_0)\, p_{j_1 j_2}(t_2 - t_1) \cdots p_{j_{n-1} j_n}(t_n - t_{n-1}); \qquad (20.4.2.7)$$

and, in particular, the probability that the system at time t > 0 is in state j is

$$p_j(t) = \sum_i p^0_i\, p_{ij}(t), \qquad j = 0, 1, 2, \dots.$$

The dependence of the transition probabilities p_ij(t) on time t ≥ 0 is given by the formula

$$p_{ij}(s + t) = \sum_k p_{ik}(s)\, p_{kj}(t), \qquad i, j = 0, 1, 2, \dots. \qquad (20.4.2.8)$$
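Relation (20.4.2.8) is the Chapman–Kolmogorov identity. For a two-state chain with rates lam (0 → 1) and ups (1 → 0), the transition probabilities are known in closed form, so the identity can be verified directly; the rate values below are illustrative:

```python
import math

# Two-state Markov process with rates lam (0 -> 1) and ups (1 -> 0);
# its transition matrix P(t) is known in closed form, so the
# Chapman-Kolmogorov identity P(s + t) = P(s) P(t) can be checked
# numerically.

def P(t, lam=1.3, ups=0.7):
    s = lam + ups
    e = math.exp(-s * t)
    p00 = ups / s + (lam / s) * e      # stay in state 0
    p11 = lam / s + (ups / s) * e      # stay in state 1
    return [[p00, 1.0 - p00], [1.0 - p11, p11]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

lhs = P(0.4 + 1.1)
rhs = matmul(P(0.4), P(1.1))
err = max(abs(lhs[i][j] - rhs[i][j]) for i in range(2) for j in range(2))
print(err)  # tiny: the identity holds up to rounding
```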

Suppose that λ_ij = [p_ij(t)]′_t at t = 0, i, j = 0, 1, 2, …. The parameters λ_ij satisfy the conditions

$$\lambda_{ii} = \lim_{h\to 0} \frac{p_{ii}(h) - 1}{h} = -\sum_{j \ne i} \lambda_{ij}, \qquad \lambda_{ij} = \lim_{h\to 0} \frac{p_{ij}(h)}{h} \ge 0 \quad (i \ne j). \qquad (20.4.2.9)$$


THEOREM. Under condition (20.4.2.9), the transition probabilities satisfy the system of differential equations

$$[p_{ij}(t)]'_t = \sum_k \lambda_{ik}\, p_{kj}(t), \qquad i, j = 0, 1, 2, \dots. \qquad (20.4.2.10)$$

The system of differential equations (20.4.2.10) is called the system of backward Kolmogorov equations.

THEOREM. The transition probabilities p_ij(t) satisfy the system of differential equations

$$[p_{ij}(t)]'_t = \sum_k \lambda_{kj}\, p_{ik}(t), \qquad i, j = 0, 1, 2, \dots. \qquad (20.4.2.11)$$

The system of differential equations (20.4.2.11) is called the system of forward Kolmogorov equations.

20.4.2-3. Poisson processes.

For a flow of events, let Λ(t) be the expectation of the number of events on the interval [0, t). The number of events in the half-open interval [a, b) is a Poisson random variable with parameter Λ(b) − Λ(a). The probability structure of a Poisson process is completely determined by the function Λ(t).

The Poisson process is a stochastic process ξ(t), t ≥ 0, with independent increments having the Poisson distribution; i.e.,

$$P[\xi(t) - \xi(s) = k] = \frac{[\Lambda(t) - \Lambda(s)]^k}{k!}\, e^{-[\Lambda(t) - \Lambda(s)]}$$

for all 0 ≤ s ≤ t and k = 0, 1, 2, ….
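For a homogeneous flow with Λ(t) = μt this can be checked by simulation: generating exponential inter-arrival gaps, the number of events on [s, t) should average μ(t − s). The parameter values below are illustrative:

```python
import random

# Simulation of a homogeneous Poisson flow with Lambda(t) = mu * t:
# inter-arrival gaps are independent exponentials with rate mu, and
# the count on [s, t) should be Poisson with mean mu * (t - s).

random.seed(42)

def count_events(mu, s, t):
    clock, n = 0.0, 0
    while True:
        clock += random.expovariate(mu)   # next event time
        if clock >= t:
            return n
        if clock >= s:
            n += 1

mu, s, t = 2.0, 1.0, 3.0
trials = 20000
mean = sum(count_events(mu, s, t) for _ in range(trials)) / trials
print(mean)  # close to mu * (t - s) = 4
```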

A Poisson point process is a stochastic process for which the numbers of points (counting multiplicities) in any disjoint measurable sets of the phase space are independent random variables with the Poisson distribution.

In queueing theory, it is often assumed that the incoming traffic is a Poisson point process. The simplest point process is defined as the Poisson point process characterized by the following three properties:

1. Stationarity.
2. Memorylessness.
3. Orderliness.

Stationarity means that, for any finite group of disjoint time intervals, the probability that a given number of events occurs on each of these time intervals depends only on these numbers and on the durations of the time intervals, but is independent of any shift of all the time intervals by the same value. In particular, the probability that k events occur on the time interval from τ to τ + t is independent of τ and is a function only of the variables k and t.

Memorylessness means that the probability of the occurrence of k events on the time interval from τ to τ + t is independent of how many times and how the events occurred earlier. This means that the conditional probability of the occurrence of events on the time interval from τ to τ + t under any possible assumptions concerning the occurrence of the events before time τ coincides with the unconditional probability. In particular, memorylessness means that the occurrences of any number of events on disjoint time intervals are independent.

Orderliness expresses the requirement that the occurrence of two or more events on a small time interval is practically impossible.


20.4.2-4. Birth–death processes.

Suppose that a system can be in one of the states E₀, E₁, E₂, …, and the set of these states is finite or countable. In the course of time, the states of the system vary; on a time interval of length h, the system passes from the state E_n to the state E_{n+1} with probability λ_n h + o(h) and to the state E_{n−1} with probability υ_n h + o(h). The probability of staying in the same state E_n on a time interval of length h is equal to 1 − λ_n h − υ_n h + o(h). It is assumed that the constants λ_n and υ_n depend only on n and are independent of t and of how the system arrived at this state.

The stochastic process described above is called a birth–death process. If the relation υ_n = 0 holds for any n ≥ 1, then the process is called a pure birth process. If the relation λ_n = 0 holds for any n ≥ 1, then the process is called a death process.

Let p_k(t) be the probability that the system is in the state E_k at time t. Then the birth–death process is described by the system of differential equations

$$[p_0(t)]'_t = -\lambda_0 p_0(t) + \upsilon_1 p_1(t),$$
$$[p_k(t)]'_t = -(\lambda_k + \upsilon_k)\, p_k(t) + \lambda_{k-1} p_{k-1}(t) + \upsilon_{k+1} p_{k+1}(t), \qquad k \ge 1. \qquad (20.4.2.12)$$
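System (20.4.2.12) can be integrated numerically once it is truncated to finitely many states. The sketch below uses plain Euler steps; the truncation level N and the rates λ_k = 1, υ_k = 2 are assumptions of the sketch (an M/M/1-type chain whose stationary value p₀ = 1 − λ/υ = 0.5 is known, which gives a sanity check):

```python
# Euler integration of the truncated birth-death system (20.4.2.12).
# Truncation assumes negligible probability mass beyond state N - 1,
# which holds here because the death rate dominates the birth rate.

N = 40
lam = [1.0] * N            # birth rates lambda_k
ups = [2.0] * N            # death rates upsilon_k
p = [0.0] * N
p[0] = 1.0                 # start in E_0
h, T = 2e-3, 30.0
for _ in range(int(T / h)):
    dp = [0.0] * N
    dp[0] = -lam[0] * p[0] + ups[1] * p[1]
    for k in range(1, N - 1):
        dp[k] = (-(lam[k] + ups[k]) * p[k]
                 + lam[k - 1] * p[k - 1] + ups[k + 1] * p[k + 1])
    # reflecting upper boundary keeps total probability exactly conserved
    dp[N - 1] = -ups[N - 1] * p[N - 1] + lam[N - 2] * p[N - 2]
    p = [pi + h * di for pi, di in zip(p, dp)]

print(sum(p))  # stays ~1
print(p[0])    # approaches 0.5
```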

Example 1. Consider the system consisting of the states E₀ and E₁. The system of differential equations for the probabilities p₀(t) and p₁(t) has the form

$$[p_0(t)]'_t = -\lambda p_0(t) + \upsilon p_1(t), \qquad [p_1(t)]'_t = \lambda p_0(t) - \upsilon p_1(t).$$

The solution of this system with the initial conditions p₀(0) = 1, p₁(0) = 0 has the form

$$p_0(t) = \frac{\upsilon}{\upsilon + \lambda}\Big[1 + \frac{\lambda}{\upsilon}\, e^{-(\upsilon+\lambda)t}\Big], \qquad
p_1(t) = \frac{\lambda}{\upsilon + \lambda}\Big[1 - e^{-(\upsilon+\lambda)t}\Big].$$
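The closed-form solution of Example 1 can be cross-checked by integrating the two-state system directly; a sketch with illustrative rate values:

```python
import math

# Cross-check of the Example 1 formulas: Euler-integrate the
# two-state system and compare with the closed-form p_0(t), p_1(t).

lam, ups = 1.2, 0.8
s = lam + ups
p0, p1 = 1.0, 0.0                  # p_0(0) = 1, p_1(0) = 0
h, T = 1e-5, 2.0
for _ in range(int(T / h)):
    d0 = -lam * p0 + ups * p1      # right-hand sides of the system
    d1 = lam * p0 - ups * p1
    p0, p1 = p0 + h * d0, p1 + h * d1

exact0 = ups / s * (1.0 + (lam / ups) * math.exp(-s * T))
exact1 = lam / s * (1.0 - math.exp(-s * T))
print(p0, exact0)   # the two values agree closely
print(p1, exact1)
```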

FELLER THEOREM. For the solution p_k(t) of the pure birth equations to satisfy the relation

$$\sum_{k=0}^{\infty} p_k(t) = 1$$

for all t, it is necessary and sufficient that the following series diverge:

$$\sum_{k=0}^{\infty} \frac{1}{\lambda_k}.$$


In the case of a pure birth process, the system of equations (20.4.2.11) can be solved by simple successive integration, because the differential equations have the form of simple recursive relations. In the general case, it is already impossible to find the functions p_k(t) successively.

The relation

$$\sum_{k=0}^{\infty} p_k(t) = 1$$

holds for all t if the series

$$\sum_{k=1}^{\infty} \prod_{i=1}^{k} \frac{\upsilon_i}{\lambda_i}$$

diverges. If, in addition, the series

$$\sum_{k=1}^{\infty} \prod_{i=1}^{k} \frac{\lambda_{i-1}}{\upsilon_i}$$

converges, then there exist limits

$$p_k = \lim_{t\to\infty} p_k(t) \qquad (20.4.2.17)$$

for all k.

If relation (20.4.2.17) holds, then system (20.4.2.12) becomes

$$-\lambda_0 p_0 + \upsilon_1 p_1 = 0,$$
$$-(\lambda_k + \upsilon_k)\, p_k + \lambda_{k-1} p_{k-1} + \upsilon_{k+1} p_{k+1} = 0, \qquad k \ge 1. \qquad (20.4.2.18)$$

The solutions of system (20.4.2.18) have the form

$$p_k = \frac{\lambda_{k-1}}{\upsilon_k}\, p_{k-1} = \prod_{i=1}^{k} \frac{\lambda_{i-1}}{\upsilon_i}\, p_0. \qquad (20.4.2.19)$$

The constant p₀ is determined by the normalization condition Σ_{k=0}^∞ p_k = 1:

$$p_0 = \bigg[1 + \sum_{k=1}^{\infty} \prod_{i=1}^{k} \frac{\lambda_{i-1}}{\upsilon_i}\bigg]^{-1}. \qquad (20.4.2.20)$$
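Formulas (20.4.2.19) and (20.4.2.20) translate directly into a short recursion. In the sketch below the truncation level K and the M/M/1-type test rates are assumptions, chosen so that the answer (a geometric law) is known:

```python
# The stationary recursion (20.4.2.19) with normalization (20.4.2.20),
# truncated at K terms (the tail is assumed negligible).

def stationary(lam, ups, K=200):
    # lam(k): birth rate lambda_k;  ups(k): death rate upsilon_k
    prods = [1.0]                        # k = 0 term of the product
    for k in range(1, K):
        prods.append(prods[-1] * lam(k - 1) / ups(k))
    p0 = 1.0 / sum(prods)                # normalization (20.4.2.20)
    return [p0 * c for c in prods]

# lambda_k = 1, upsilon_k = 2  =>  geometric law p_k = 0.5^(k+1)
p = stationary(lambda k: 1.0, lambda k: 2.0)
print(p[0], p[1], p[2])  # ~0.5, 0.25, 0.125
```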

Example 2. Servicing with a queue.
A Poisson flow of jobs with parameter λ arrives at n identical servers. A server serves a job in a random time with the probability distribution

$$H(x) = 1 - e^{-\upsilon x}.$$

If there is at least one free server when a job arrives, then servicing starts immediately. But if all servers are occupied, then the new jobs enter a queue. The conditions of the problem satisfy the assumptions of the theory of birth–death processes. In this problem, λ_k = λ for any k, υ_k = kυ for k ≤ n, and υ_k = nυ for k ≥ n. By formulas (20.4.2.19) and (20.4.2.20), we have

$$p_k = \frac{\rho^k}{k!}\, p_0 \ \ \text{for } k \le n, \qquad p_k = \frac{\rho^k}{n!\, n^{k-n}}\, p_0 \ \ \text{for } k \ge n,$$

where ρ = λ/υ.


The constant p₀ is determined by the relation

$$p_0 = \bigg[\sum_{k=0}^{n} \frac{\rho^k}{k!} + \frac{\rho^n}{n!} \sum_{k=n+1}^{\infty} \Big(\frac{\rho}{n}\Big)^{k-n}\bigg]^{-1}.$$

If ρ < n, then

$$p_0 = \bigg[\sum_{k=0}^{n} \frac{\rho^k}{k!} + \frac{\rho^{n+1}}{n!\,(n-\rho)}\bigg]^{-1}.$$

But if ρ ≥ n, the series in the parentheses is divergent and p_k = 0 for all k; i.e., in this case the queue to be served increases in time without bound.
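Example 2 in code (assuming ρ = λ/υ < n so that the geometric tail converges); the parameter values are illustrative, and summing p_k over a long range confirms the normalization:

```python
import math

# Stationary probabilities of the n-server queue of Example 2,
# using the closed form for p_0 (valid for rho < n) and the
# two-branch formula for p_k.

def mmn_probs(lam, ups, n):
    rho = lam / ups
    assert rho < n, "unstable queue: rho >= n"
    p0 = 1.0 / (sum(rho**k / math.factorial(k) for k in range(n + 1))
                + rho**(n + 1) / (math.factorial(n) * (n - rho)))
    def p(k):
        if k <= n:
            return rho**k / math.factorial(k) * p0
        return rho**k / (math.factorial(n) * n**(k - n)) * p0
    return p0, p

p0, p = mmn_probs(lam=3.0, ups=1.0, n=5)
total = sum(p(k) for k in range(300))
print(p0)
print(total)  # ~1
```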

Example 3. Maintenance of machines by a team of workers.
A team of l workers maintains n identical machines. The machines fail independently; the probability of a failure in the time interval (t, t + h) is equal to λh + o(h). The probability that a machine will be repaired on the interval (t, t + h) is equal to υh + o(h). Each worker can repair only one machine; each machine can be repaired by only one worker. Find the probability of the event that a given number of machines is out of operation at a given time.

Let E_k be the event that exactly k machines are out of operation at a given time. Obviously, the system can be only in the states E₀, …, E_n. We deal with a birth–death process such that

$$\lambda_k = (n-k)\lambda \ \ \text{for } 0 \le k < n, \qquad
\upsilon_k = \begin{cases} k\upsilon & \text{for } 1 \le k < l, \\ l\upsilon & \text{for } l \le k \le n. \end{cases}$$

By formulas (20.4.2.19) and (20.4.2.20), we have

$$p_k = \frac{n!}{k!\,(n-k)!}\, \rho^k p_0 \ \ \text{for } 1 \le k \le l, \qquad
p_k = \frac{n!}{l^{\,k-l}\, l!\,(n-k)!}\, \rho^k p_0 \ \ \text{for } l \le k \le n,$$

where ρ = λ/υ. The constant p₀ is determined by the relation

$$p_0 = \bigg[\sum_{k=0}^{l} \frac{n!}{k!\,(n-k)!}\, \rho^k + \sum_{k=l+1}^{n} \frac{n!}{l^{\,k-l}\, l!\,(n-k)!}\, \rho^k\bigg]^{-1}.$$
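Example 3 in code. The exponent l**(k − l) in the second branch is the form obtained by substituting the stated rates into (20.4.2.19), an assumption of this sketch, and the parameter values are illustrative; normalization over the finite state set E₀, …, E_n is checked directly:

```python
import math

# Machine-repair probabilities of Example 3: n machines, l workers,
# rho = lam/ups.  The unnormalized weights follow the two-branch
# formula; dividing by their sum enforces normalization.

def repair_probs(n, l, rho):
    def weight(k):
        if k <= l:
            return (math.factorial(n)
                    / (math.factorial(k) * math.factorial(n - k))) * rho**k
        return (math.factorial(n)
                / (l**(k - l) * math.factorial(l) * math.factorial(n - k))) * rho**k
    total = sum(weight(k) for k in range(n + 1))
    return [weight(k) / total for k in range(n + 1)]

p = repair_probs(n=6, l=2, rho=0.3)
print(sum(p))  # exactly 1 up to rounding
print(p)
```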

