Using Random Numbers to Evaluate Integrals

One of the earliest applications of random numbers was in the computation of integrals. Let $g(x)$ be a function and suppose we wanted to compute $\theta$, where

$$\theta = \int_0^1 g(x)\,dx$$

To compute the value of $\theta$, note that if $U$ is uniformly distributed over (0, 1), then we can express $\theta$ as

$$\theta = E[g(U)]$$

If $U_1, \ldots, U_k$ are independent uniform (0, 1) random variables, it thus follows that the random variables $g(U_1), \ldots, g(U_k)$ are independent and identically distributed random variables having mean $\theta$. Therefore, by the strong law of large numbers, it follows that, with probability 1,

$$\frac{\sum_{i=1}^{k} g(U_i)}{k} \to E[g(U)] = \theta \quad \text{as } k \to \infty$$

Hence we can approximate $\theta$ by generating a large number of random numbers $u_i$ and taking as our approximation the average value of $g(u_i)$. This approach to approximating integrals is called the Monte Carlo approach.
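As a small code sketch of this recipe (the function and variable names are ours, chosen for illustration; the text itself gives no code):

```python
import random

def monte_carlo_01(g, k=100_000):
    """Estimate the integral of g over (0, 1) as the average of g(U_i)
    over k independent uniform (0, 1) random numbers U_i."""
    return sum(g(random.random()) for _ in range(k)) / k

# Example: g(x) = x**2, whose integral over (0, 1) is exactly 1/3.
random.seed(0)
estimate = monte_carlo_01(lambda x: x * x)
```

With $k = 100{,}000$ samples the estimate typically falls within a few thousandths of $1/3$.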

If we wanted to compute

$$\theta = \int_a^b g(x)\,dx$$

then, by making the substitution $y = (x - a)/(b - a)$, $dy = dx/(b - a)$, we see that

$$\theta = \int_0^1 g(a + [b - a]y)(b - a)\,dy = \int_0^1 h(y)\,dy$$

where $h(y) = (b - a)\,g(a + [b - a]y)$. Thus, we can approximate $\theta$ by continually generating random numbers and then taking the average value of $h$ evaluated at these random numbers.
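The change of variables above can be sketched as follows (names are ours, for illustration only):

```python
import random

def monte_carlo_ab(g, a, b, k=100_000):
    """Estimate the integral of g over (a, b) using the substitution
    y = (x - a)/(b - a): average h(y) = (b - a) * g(a + (b - a) * y)
    over uniform (0, 1) random numbers y."""
    total = 0.0
    for _ in range(k):
        y = random.random()
        total += (b - a) * g(a + (b - a) * y)
    return total / k

# Example: g(x) = x over (1, 3); the exact integral is 4.
random.seed(0)
estimate = monte_carlo_ab(lambda x: x, 1.0, 3.0)
```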

Similarly, if we wanted

$$\theta = \int_0^\infty g(x)\,dx$$

we could apply the substitution $y = 1/(x + 1)$, $dy = -dx/(x + 1)^2 = -y^2\,dx$, to obtain the identity

$$\theta = \int_0^1 h(y)\,dy$$

where

$$h(y) = \frac{g\left(\frac{1}{y} - 1\right)}{y^2}$$
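A sketch of this infinite-range transformation (function names are ours; we draw $y$ from $(0, 1]$ rather than $[0, 1)$ to avoid dividing by zero):

```python
import math
import random

def monte_carlo_0_inf(g, k=200_000):
    """Estimate the integral of g over (0, infinity) using the substitution
    y = 1/(x + 1): average h(y) = g(1/y - 1) / y**2 over uniform y."""
    total = 0.0
    for _ in range(k):
        # 1 - random.random() lies in (0, 1], so y is never exactly zero.
        y = 1.0 - random.random()
        total += g(1.0 / y - 1.0) / (y * y)
    return total / k

# Example: g(x) = exp(-x); the exact integral over (0, infinity) is 1.
random.seed(0)
estimate = monte_carlo_0_inf(lambda x: math.exp(-x))
```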

The utility of using random numbers to approximate integrals becomes more apparent in the case of multidimensional integrals. Suppose that $g$ is a function with an $n$-dimensional argument and that we are interested in computing

$$\theta = \int_0^1 \int_0^1 \cdots \int_0^1 g(x_1, \ldots, x_n)\,dx_1\,dx_2 \cdots dx_n$$

The key to the Monte Carlo approach to estimate $\theta$ lies in the fact that $\theta$ can be expressed as the following expectation:

$$\theta = E[g(U_1, \ldots, U_n)]$$

where $U_1, \ldots, U_n$ are independent uniform (0, 1) random variables. Hence, if we generate $k$ independent sets, each consisting of $n$ independent uniform (0, 1) random variables,

$$U_1^1, \ldots, U_n^1$$
$$U_1^2, \ldots, U_n^2$$
$$\vdots$$
$$U_1^k, \ldots, U_n^k$$

then, since the random variables $g(U_1^i, \ldots, U_n^i)$, $i = 1, \ldots, k$, are all independent and identically distributed random variables with mean $\theta$, we can estimate $\theta$ by $\sum_{i=1}^{k} g(U_1^i, \ldots, U_n^i)/k$.
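A sketch of the multidimensional estimator (names are ours, for illustration):

```python
import random

def monte_carlo_cube(g, n, k=100_000):
    """Estimate the integral of g over the n-dimensional unit cube as the
    average of g evaluated at k independent uniform random points."""
    total = 0.0
    for _ in range(k):
        total += g(*[random.random() for _ in range(n)])
    return total / k

# Example: g(x1, x2) = x1 * x2 over the unit square; the exact value is 1/4.
random.seed(0)
estimate = monte_carlo_cube(lambda x1, x2: x1 * x2, n=2)
```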

For an application of the above, consider the following approach to estimating $\pi$.

Example 3a  The Estimation of π  Suppose that the random vector $(X, Y)$ is uniformly distributed in the square of area 4 centered at the origin. That is, it is a random point in the region specified in Figure 3.1. Let us consider now the probability that this random point in the square is contained within the inscribed circle of radius 1 (see Figure 3.2). Note that since $(X, Y)$ is uniformly distributed in the square it follows that

$$P\{(X, Y) \text{ is in the circle}\} = P\{X^2 + Y^2 \leqslant 1\} = \frac{\text{Area of the circle}}{\text{Area of the square}} = \frac{\pi}{4}$$

Hence, if we generate a large number of random points in the square, the proportion of points that fall within the circle will be approximately $\pi/4$. Now if $X$ and $Y$ were independent and both were uniformly distributed over $(-1, 1)$, their joint density would be

$$f(x, y) = f(x)f(y) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}, \qquad -1 \leqslant x \leqslant 1,\ -1 \leqslant y \leqslant 1$$

[Figure 3.1. Square: the square with corners (−1, −1), (1, −1), (−1, 1), and (1, 1), centered at (0, 0).]

[Figure 3.2. Circle within Square: the circle of radius 1 inscribed in that square.]

Since the density function of $(X, Y)$ is constant in the square, it thus follows (by definition) that $(X, Y)$ is uniformly distributed in the square. Now if $U$ is uniform on (0, 1) then $2U$ is uniform on (0, 2), and so $2U - 1$ is uniform on $(-1, 1)$. Therefore, if we generate random numbers $U_1$ and $U_2$, set $X = 2U_1 - 1$ and $Y = 2U_2 - 1$, and define

$$I = \begin{cases} 1 & \text{if } X^2 + Y^2 \leqslant 1 \\ 0 & \text{otherwise} \end{cases}$$

then

$$E[I] = P\{X^2 + Y^2 \leqslant 1\} = \frac{\pi}{4}$$

Hence we can estimate $\pi/4$ by generating a large number of pairs of random numbers $u_1, u_2$ and estimating $\pi/4$ by the fraction of pairs for which $(2u_1 - 1)^2 + (2u_2 - 1)^2 \leqslant 1$.
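The scheme of Example 3a can be sketched directly in code (the function name is ours):

```python
import random

def estimate_pi(k=1_000_000):
    """Estimate pi as 4 times the fraction of random points in the
    square (-1, 1) x (-1, 1) that fall inside the inscribed unit circle."""
    inside = 0
    for _ in range(k):
        x = 2.0 * random.random() - 1.0
        y = 2.0 * random.random() - 1.0
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / k

random.seed(0)
pi_estimate = estimate_pi()
```

Because the indicator has variance $(\pi/4)(1 - \pi/4)$, a million points typically put the estimate within a few thousandths of $\pi$.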

Thus, random number generators can be used to generate the values of uniform (0, 1) random variables. Starting with these random numbers we show in Chapters 4 and 5 how we can generate the values of random variables from arbitrary distributions. With this ability to generate arbitrary random variables we will be able to simulate a probability system—that is, we will be able to generate, according to the specified probability laws of the system, all the random quantities of this system as it evolves over time.

Exercises

1. If $x_0 = 5$ and $x_n = 3x_{n-1} \bmod 150$, find $x_1, \ldots, x_{10}$.

2. If $x_0 = 3$ and $x_n = (5x_{n-1} + 7) \bmod 200$, find $x_1, \ldots, x_{10}$.
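Congruential recursions of this kind are easy to iterate mechanically; a small helper (ours, not the text's) that computes such sequences looks like:

```python
def congruential_sequence(x0, step, n=10):
    """Return x_1, ..., x_n for the recursion x_n = step(x_{n-1})."""
    values, x = [], x0
    for _ in range(n):
        x = step(x)
        values.append(x)
    return values

# Exercise 1's recursion: x_0 = 5, x_n = 3 * x_{n-1} mod 150
seq1 = congruential_sequence(5, lambda x: (3 * x) % 150)
# Exercise 2's recursion: x_0 = 3, x_n = (5 * x_{n-1} + 7) mod 200
seq2 = congruential_sequence(3, lambda x: (5 * x + 7) % 200)
```

Note how quickly such short-modulus recursions fall into a repeating cycle.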

In Exercises 3–9 use simulation to approximate the following integrals. Compare your estimate with the exact answer if known.

3. $\int_0^1 \exp\{e^x\}\,dx$

4. $\int_0^1 (1 - x^2)^{3/2}\,dx$

5. $\int_{-2}^{2} e^{x + x^2}\,dx$

6. $\int_0^\infty x(1 + x^2)^{-2}\,dx$

7. $\int_{-\infty}^{\infty} e^{-x^2}\,dx$

8. $\int_0^1 \int_0^1 e^{(x+y)^2}\,dy\,dx$

9. $\int_0^\infty \int_0^x e^{-(x+y)}\,dy\,dx$
   [Hint: Let $I_y(x) = \begin{cases} 1 & \text{if } y < x \\ 0 & \text{if } y \geqslant x \end{cases}$ and use this function to equate the integral to one in which both terms go from 0 to $\infty$.]
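As one illustration of the requested approach (not a worked solution key), the integral of exercise 3 can be estimated directly with the recipe of Section 3.2:

```python
import math
import random

# Estimate the integral of exp(e^x) over (0, 1) as the average of
# exp(e^u) over many uniform (0, 1) random numbers u.
random.seed(0)
k = 100_000
estimate = sum(math.exp(math.exp(random.random())) for _ in range(k)) / k
```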

10. Use simulation to approximate $\text{Cov}(U, e^U)$, where $U$ is uniform on (0, 1). Compare your approximation with the exact answer.

11. Let $U$ be uniform on (0, 1). Use simulation to approximate the following:

    (a) $\text{Corr}\left(U, \sqrt{1 - U^2}\right)$
    (b) $\text{Corr}\left(U^2, \sqrt{1 - U^2}\right)$

12. For uniform (0, 1) random variables $U_1, U_2, \ldots$ define

    $$N = \text{Minimum}\left\{n : \sum_{i=1}^{n} U_i > 1\right\}$$

    That is, $N$ is equal to the number of random numbers that must be summed to exceed 1.

    (a) Estimate $E[N]$ by generating 100 values of $N$.
    (b) Estimate $E[N]$ by generating 1000 values of $N$.
    (c) Estimate $E[N]$ by generating 10,000 values of $N$.
    (d) What do you think is the value of $E[N]$?
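The sampling step of this exercise can be sketched as follows (names are ours; the limiting value asked for in part (d) is left for the reader to conjecture from the output):

```python
import random

def sample_N():
    """One draw of N: how many uniform (0, 1) random numbers must be
    summed before the running total exceeds 1."""
    total, n = 0.0, 0
    while total <= 1.0:
        total += random.random()
        n += 1
    return n

random.seed(0)
k = 10_000
estimate = sum(sample_N() for _ in range(k)) / k
```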

13. Let $U_i$, $i \geqslant 1$, be random numbers. Define $N$ by

    $$N = \text{Maximum}\left\{n : \prod_{i=1}^{n} U_i \geqslant e^{-3}\right\}$$

    where $\prod_{i=1}^{0} U_i \equiv 1$.

    (a) Find $E[N]$ by simulation.
    (b) Find $P\{N = i\}$, for $i = 0, 1, 2, 3, 4, 5, 6$, by simulation.

14. With $x_1 = 23$, $x_2 = 66$, and

    $$x_n = (3x_{n-1} + 5x_{n-2}) \bmod 100, \qquad n \geqslant 3$$

    we will call the sequence $u_n = x_n/100$, $n \geqslant 1$, the text's random number sequence. Find its first 14 values.

