
Simulation and Monte Carlo: With Applications in Finance and MCMC (Part 10)




Sample 500 variates starting with x0 = 4 and a = 0.5. Note the long burn-in time and poor exploration of the state space (slow mixing of states) due to the small step length.
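The target density itself is not shown in this extract, so as an illustrative stand-in the Python sketch below runs a random-walk Metropolis sampler on a standard normal target, with the same starting value x0 = 4 and step length a = 0.5, which reproduces the slow mixing described above.

```python
import math
import random

def rw_metropolis(n, x0, a, logpdf, seed=1):
    """Random-walk Metropolis: propose x' = x + U(-a, a), accept with
    probability min(1, pi(x')/pi(x)), tested on the log scale."""
    rng = random.Random(seed)
    x, out = x0, []
    for _ in range(n):
        xp = x + rng.uniform(-a, a)
        if math.log(rng.random()) < logpdf(xp) - logpdf(x):
            x = xp
        out.append(x)
    return out

# Standard normal target (an assumption; the book's target is not shown here).
logpdf = lambda x: -0.5 * x * x
chain = rw_metropolis(500, x0=4.0, a=0.5, logpdf=logpdf)
# With so small a step, successive values are highly correlated (slow mixing).
```

Plotting `chain` against iteration number shows a long drift down from 4 before the sampler settles, exactly the burn-in behaviour noted in the text.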


Sample 100 000 variates, starting with initial value of 0, to produce a histogram showing their distribution, together with sample mean and standard deviation.

We are interested in determining survival probabilities for components. The age at failure of the ith component is X_i, i = 1, 2, .... These are i.i.d. Weibull random variables with parameters α and β, with a joint prior distribution placed on (α, β).

The burn-in is negligible and the mixing of states is quite reasonable. Bayes estimates are given for survival probabilities at ages 1000, 2000, and 3000 hours. Of course, if interval estimates are required it will be necessary to replicate these runs. It is instructive to see the effect of reducing the sample size (discard some of the original 43 ages at failure). The posterior density should then reflect the prior density more, with the result that the acceptance probability becomes higher and successive α and β values are less correlated than previously. This is good from the viewpoint of reducing the variance of the resulting estimates.

# n=number of data items in x;
# q=number of ages at which survivor probabilities are required;
# Perform k iterations;
for i from 1 to k do;
r1:=evalf(rand()/10^12); r2:=evalf(rand()/10^12); r3:=evalf(rand()/10^12); r4:=evalf(rand()/10^12);
# Sample candidate point (ap,bp) and compute likelihood (L2) for (ap,bp);
# Decide whether to accept or reject candidate point;
if ln(r4)<L2-L1 then a1:=ap; b1:=bp; L1:=L2; end if;
# Enter survivor probs and alpha and beta values into ith row of C;
for j from 1 to q do;
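The Maple fragments above sample a candidate (ap, bp) and accept it on the log scale. The full procedure is not recoverable from this extraction, so the Python sketch below is a stand-in: a random-walk Metropolis sampler for the Weibull (α, β) posterior under a flat prior on the restricted space α > 1, β > 0. The failure ages and proposal widths are hypothetical; the book's 43 observed ages are not reproduced here.

```python
import math
import random

def weibull_loglik(alpha, beta, data):
    # Log-likelihood for Weibull(shape=alpha, scale=beta) failure ages;
    # the flat prior on alpha > 1, beta > 0 is enforced by returning -inf.
    if alpha <= 1.0 or beta <= 0.0:
        return -math.inf
    return (len(data) * (math.log(alpha) - alpha * math.log(beta))
            + (alpha - 1.0) * sum(math.log(x) for x in data)
            - sum((x / beta) ** alpha for x in data))

rng = random.Random(7)
# Hypothetical failure ages in hours (43 of them, like the book's sample).
data = [rng.weibullvariate(2200.0, 1.1) for _ in range(43)]

a1, b1 = 1.2, 2000.0                  # initial (alpha, beta)
L1 = weibull_loglik(a1, b1, data)
draws = []
for _ in range(5000):
    # Sample candidate point (ap, bp); uniform random-walk proposal assumed.
    ap = a1 + rng.uniform(-0.1, 0.1)
    bp = b1 + rng.uniform(-200.0, 200.0)
    L2 = weibull_loglik(ap, bp, data)
    # Accept or reject, as in the Maple line: if ln(r4) < L2 - L1 then ...
    if math.log(rng.random()) < L2 - L1:
        a1, b1, L1 = ap, bp, L2
    draws.append((a1, b1))
```

Survivor probabilities at an age y would then be estimated by averaging exp(-(y/β)^α) over the retained (α, β) draws, as the comments in the Maple fragment indicate.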

y1 := 1000
y2 := 2000
y3 := 3000

> for i from 1 to k do:
for j from 1 to q+2 do:

> f2:=[seq(f[i,2],i=1..k)]: PLOT(CURVES(f2),TITLE("survival probability at 2000 hours against iteration number:"),AXESSTYLE(NORMAL));

dat1:=[seq(op(2,f1[i]),i=1..k)]: Estimate_Prob_survive_1000_hours=describe[mean](dat1);

(Plots: survival probability at 2000 hours against iteration number; beta values against iteration number.)

How do the Bayesian estimates compare with a classical analysis, using maximization of the likelihood function, subject to 1 < α? (The parameter space is restricted to the space used for the prior. This ensures that the asymptotic Bayes and likelihood estimates would be identical.) Below, the unrestricted likelihood function, L1, and its contour plot are computed. This has a global maximum at α̂ = 0.97. Using NLPSolve from the Optimization package, the constrained maximum is at α̂ = 1, β̂ = 2201. This represents a component with a constant failure rate (exponential life) and an expected time to failure of 2201 hours.
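The book performs this constrained fit with Maple's NLPSolve. As an illustration of the same idea, the sketch below maximises the Weibull profile log-likelihood over the restricted space α ≥ 1 by a simple grid search: for fixed shape α the scale MLE has the closed form β = (mean of x^α)^(1/α), and at α = 1 this reduces to the sample mean, the exponential-life case described above. The data are hypothetical exponential ages with mean 2200, not the book's.

```python
import math
import random

def weibull_profile_loglik(alpha, data):
    """Profile log-likelihood for a Weibull(alpha, beta) sample: for fixed
    shape alpha, the MLE of the scale is beta = (mean of x^alpha)^(1/alpha)."""
    n = len(data)
    beta = (sum(x ** alpha for x in data) / n) ** (1.0 / alpha)
    ll = (n * (math.log(alpha) - alpha * math.log(beta))
          + (alpha - 1.0) * sum(math.log(x) for x in data)
          - sum((x / beta) ** alpha for x in data))
    return ll, beta

rng = random.Random(11)
data = [rng.expovariate(1.0 / 2200.0) for _ in range(43)]  # hypothetical ages

# Constrained search over the restricted space alpha >= 1 (grid, for brevity).
grid = [1.0 + 0.01 * j for j in range(101)]
best = max(grid, key=lambda al: weibull_profile_loglik(al, data)[0])
ll, beta_hat = weibull_profile_loglik(best, data)
# If the constrained maximum sits at alpha = 1, the fitted model is
# exponential and beta_hat is just the sample mean failure age.
```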


[ImportMPS, Interactive, LPSolve, LSSolve, Maximize, Minimize, NLPSolve, QPSolve]

The data are from D.P. Gaver and I.G. O'Muircheartaigh (1987), Robust empirical Bayes analyses of event rates, Technometrics 29, 1–15. The number of failures for pump i in time t[i] (i = 1, ..., 10), x[i], are assumed to be i.i.d. Poisson(λ_i t_i), where the λ_i are i.i.d. gamma[α, β] and β is a realization from gamma[δ, gam]. The hyperparameters α, δ, and gam have been estimated as described in the text.

The procedure ‘pump’ returns, for each pump i, a list of simulated λ_i and β values, together with (α + x_i)/(β + t_i) values. The latter are individual Bayes estimates of λ_i.

The Gibbs sampling procedure ‘pump’ is as follows:

> pump:=proc(beta0,x,t,alpha,delta,gam,replic,burn) local bayeslam,bayeslambda,
i,k,n,beta,a1,g1,j,jj,lam,lamval,lambda,lambdaval,s,lam5,tot;
# beta0=initial beta value
# x[i]=number of failures of pump i in time t[i], i=1..10
# alpha, delta, gam are hyperparameters
# replic= number of (equilibrium) observations used
# burn= number of iterations for burn in
# n=number of pumps=10
# lambdaval[k]= a list of lambda values for pump k
# lam[k]= a list of [iteration number, lambda value] for pump k
if j>=1 then lambda[k,j]:=[j,g1]; lambdaval[k,j]:=g1; bayeslam[k,j]:=(alpha+x[k])/(beta+t[k]); end if
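Under this hierarchy the full conditionals are standard gamma distributions: λ_i | β ~ Gamma(α + x_i, β + t_i) and β | λ ~ Gamma(δ + nα, gam + Σλ_i). The Python sketch below runs that Gibbs sampler on the Gaver and O'Muircheartaigh pump data, with the hyperparameter values echoed later in the output (α = 0.54, δ = 1.11, gam = 2.20).

```python
import random

def pump_gibbs(beta0, x, t, alpha, delta, gam, replic, burn, seed=2):
    """Gibbs sampler for the pump-failure model:
    x[i] ~ Poisson(lambda_i * t[i]), lambda_i ~ Gamma(alpha, rate beta),
    beta ~ Gamma(delta, rate gam).  Full conditionals:
    lambda_i | beta ~ Gamma(alpha + x[i], rate beta + t[i]);
    beta | lambda ~ Gamma(delta + n*alpha, rate gam + sum(lambda))."""
    rng = random.Random(seed)
    n = len(x)
    beta = beta0
    keep = [[] for _ in range(n)]
    for it in range(burn + replic):
        # gammavariate takes a SCALE second argument, hence the 1/rate.
        lam = [rng.gammavariate(alpha + x[i], 1.0 / (beta + t[i]))
               for i in range(n)]
        beta = rng.gammavariate(delta + n * alpha, 1.0 / (gam + sum(lam)))
        if it >= burn:
            for i in range(n):
                keep[i].append(lam[i])
    return keep

# Gaver and O'Muircheartaigh's pump data: failures x[i] in operating time t[i].
x = [5, 1, 5, 14, 3, 19, 1, 1, 4, 22]
t = [94.32, 15.72, 62.88, 125.76, 5.24, 31.44, 1.048, 1.048, 2.096, 10.48]
lams = pump_gibbs(0.25, x, t, alpha=0.54, delta=1.11, gam=2.20,
                  replic=2000, burn=200)
for i, ls in enumerate(lams, 1):
    print('Bayes estimate failure rate pump', i, 'is', sum(ls) / len(ls))
```

The printed means should be close to the Bayes estimates listed further on (about 0.058 for pump 1 up to about 1.96 for pump 10), up to Monte Carlo error.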

beta0 := 0.25
replic := 2000
burn := 200

Print the Bayes failure rates and plot their posterior and prior densities. Note that the prior becomes unbounded as the failure rate approaches zero, so the failure rate axis does not extend to zero.


end do:

display(A);

‘Bayes estimate failure rate pump’, 1, ‘is’, 0.05809446755

‘Bayes estimate failure rate pump’, 2, ‘is’, 0.09192159100

‘Bayes estimate failure rate pump’, 3, ‘is’, 0.08666946370

‘Bayes estimate failure rate pump’, 4, ‘is’, 0.1146666174

‘Bayes estimate failure rate pump’, 5, ‘is’, 0.5657757990

‘Bayes estimate failure rate pump’, 6, ‘is’, 0.6016311355

‘Bayes estimate failure rate pump’, 7, ‘is’, 0.7643500150

‘Bayes estimate failure rate pump’, 8, ‘is’, 0.7643500150

‘Bayes estimate failure rate pump’, 9, ‘is’, 1.470380706

‘Bayes estimate failure rate pump’, 10, ‘is’, 1.958497332


Let n_{1,0}, n_{0,1}, n_{1,1} be the numbers of animals captured in the first episode only, the second episode only, and both episodes, respectively. Then the number of distinct animals captured is nprime = n_{1,0} + n_{0,1} + n_{1,1} and the total number of animals captured in both episodes is ncap = n_{1,0} + n_{0,1} + 2n_{1,1}. Let p = the probability that an animal has of being captured on an episode. The prior distribution of p is assumed to be U(0,1).

Let notcap = N - nprime. This is the number of animals in the population that were not captured in either of the episodes. We will use Gibbs sampling to estimate the posterior distributions of notcap and p and the Bayes estimate of the number not captured.

# burn=number of iterations for burn in

# iter=number of (equilibrium) iterations used
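The Maple body of this sampler is not reproduced in the extraction. As a sketch under stated assumptions: with a flat prior on N, the conditional of p given notcap is Beta(ncap + 1, 2(nprime + notcap) - ncap + 1) (ncap successes in two capture opportunities for each of N animals), and notcap given p is negative binomial with nprime + 1 successes and success probability 1 - (1 - p)^2. The capture counts used below are hypothetical, since the book's data are not shown in this extract.

```python
import math
import random

def negbin(rng, r, s):
    # Number of failures before the r-th success, success probability s,
    # as a sum of r inverse-transform geometric draws.
    return sum(int(math.log(rng.random()) / math.log(1.0 - s))
               for _ in range(r))

def capture_gibbs(n10, n01, n11, burn, iter_, seed=3):
    nprime = n10 + n01 + n11          # distinct animals captured
    ncap = n10 + n01 + 2 * n11        # total captures over both episodes
    rng = random.Random(seed)
    notcap, out = nprime, []          # arbitrary starting value (assumption)
    for i in range(burn + iter_):
        # p | notcap: U(0,1) prior, ncap successes in 2*(nprime+notcap) trials
        p = rng.betavariate(ncap + 1, 2 * (nprime + notcap) - ncap + 1)
        # notcap | p: negative binomial under a flat prior on N (assumption)
        notcap = negbin(rng, nprime + 1, 1.0 - (1.0 - p) ** 2)
        if i >= burn:
            out.append((p, notcap))
    return out

# Hypothetical capture counts (the book's data are not shown in this extract).
samples = capture_gibbs(n10=60, n01=50, n11=26, burn=100, iter_=500)
```

Averaging the sampled notcap values over the 500 equilibrium iterations gives the Bayes estimate of the number not captured, as in the Maple output below.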

l1:=[seq(op(2,pa[i]),i=1..500)]: # a list of p-values
l2:=[seq(op(2,notcapa[i]),i=1..500)]: # a list of notcap values
l3:=[seq((1-l1[i])^2,i=1..500)]: # a list of (1-p)^2 values

histogram(l1,title="distribution of sampled p values",labels=["p","density"]);
Mean=evalf(describe[mean](l1));
StdDev=evalf(describe[standarddeviation](l1));
histogram(l2,title="distribution of sampled notcap values",labels=["notcap","density"]);
Mean=evalf(describe[mean](l2));
StdDev=evalf(describe[standarddeviation](l2));
Bayes_estimate_of_number_not_captured=250*evalf(describe[mean](l3));

(Plot: p-value against iteration number.)

(Histogram: distribution of sampled p values.)

Mean = 0.3277568452
StdDev = 0.02987484843

(Histogram: distribution of sampled notcap values.)

Mean = 114.5420000
StdDev = 14.69966789
Bayes_estimate_of_number_not_captured = 113.2008414

Here, slice sampling is explored as applied to the generation of variates from a truncated gamma distribution with density proportional to x^(α−1) e^(−x) on support [a, ∞), where a > 0.


> restart: with(stats): with(stats[statplots]):

> truncatedgamma:=proc(alpha,a,x0,iter,burn) local x,r1,r2,r3,u1,u2,u3,lower,upper,pa,i,ii,tot;

Mean:=evalf(describe[mean](l1)); StdDev:=evalf(describe[standarddeviation[1]](l1));

burn := 100
iter := 500
alpha := 3
a := 9
x0 := 10
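The slice sampler factorises the density into x^(α−1) and e^(−x) and introduces one uniform auxiliary variable for each factor: given the current x, draw u1 ~ U(0, x^(α−1)) and u2 ~ U(0, e^(−x)); the new x is then uniform on the intersection of the two slices with [a, ∞). A Python sketch, assuming α > 1 (here α = 3, a = 9, x0 = 10 as above):

```python
import math
import random

def truncatedgamma(alpha, a, x0, iter_, burn, seed=4):
    """Slice sampler for density proportional to x^(alpha-1) * exp(-x)
    on [a, infinity), assuming alpha > 1."""
    rng = random.Random(seed)
    x, out = x0, []
    for i in range(burn + iter_):
        u1 = rng.uniform(0.0, x ** (alpha - 1.0))  # slice under x^(alpha-1)
        u2 = rng.uniform(0.0, math.exp(-x))        # slice under exp(-x)
        lower = max(a, u1 ** (1.0 / (alpha - 1.0)))
        upper = -math.log(u2)                      # exp(-x) > u2  =>  x < -ln u2
        x = rng.uniform(lower, upper)
        if i >= burn:
            out.append(x)
    return out

xs = truncatedgamma(alpha=3, a=9, x0=10, iter_=2000, burn=100)
print(sum(xs) / len(xs))   # close to the Mean of about 10.16 reported below
```

Because u1 < x^(α−1) and u2 < e^(−x) by construction, the interval [lower, upper] always contains the current x, so the update is always well defined.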

(Plots: truncated gamma variate against iteration number; distribution of sampled x values.)

Mean := 10.16067140
StdDev := 1.147220533

In order to obtain an estimated standard error for the mean of the truncated gamma, the sample standard deviation is calculated of 100 independent replications (different seeds and starting states), each comprising 200 observations after a burn-in of 100 observations. In the code below the seed is changed for each new replication. It is not sufficient merely to use the last random number from one replication as the seed for the next replication. This would induce positive correlation between the two sample means.


for j from 1 to replic do:
seed:=seed+113751:
randomize(seed):
x0:=a-ln(evalf(rand()/10^12))/1.04; # an initial state is a random observation from an approximation to the truncated gamma estimated from the histogram above
pa:=truncatedgamma(alpha,a,x0,iter,burn):
time1:=time()-t1;

t1 := 3.266
replic := 100
seed := 6318540
iter := 200
alpha := 3
a := 9
burn := 100
Mean := 10.205022061
Std error1 := 0.0178901125
time1 := 7.077

alpha := 3
a := 9
iter := 100
Mean := 10.27322534
Std error2 := 0.1155931003
time2 := 5.750

(Plot: 100 independent truncated gamma variates against iteration number.)
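Independent variates can also be produced without MCMC, by rejection from a shifted exponential envelope in the spirit of Dagpunar (1978), cited in the references. The envelope rate below, λ = 1 − (α − 1)/a, is an assumption that is valid only when a > α − 1, as it is here (a = 9, α = 3); it makes the acceptance ratio maximal at x = a, so the acceptance probability never exceeds 1.

```python
import math
import random

def trunc_gamma_reject(alpha, a, n, seed=5):
    """Independent variates from density proportional to x^(alpha-1) e^(-x)
    on [a, inf), by rejection from a shifted exponential envelope.
    Valid for a > alpha - 1."""
    rng = random.Random(seed)
    lam = 1.0 - (alpha - 1.0) / a     # envelope rate; ratio peaks at x = a
    out = []
    while len(out) < n:
        x = a - math.log(rng.random()) / lam       # shifted Exp(lam) proposal
        # Accept with probability h(x)/h(a), h(x) = x^(alpha-1) e^(-(1-lam)x),
        # which is decreasing on [a, inf) for this choice of lam.
        if rng.random() <= (x / a) ** (alpha - 1.0) * math.exp(-(1.0 - lam) * (x - a)):
            out.append(x)
    return out

xs = trunc_gamma_reject(alpha=3, a=9, n=100)
```

Plotting `xs` against index gives the uncorrelated scatter seen in the independent-variates plot above, in contrast to the Gibbs output below.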

For comparison of mixing, 100 successive truncated gamma variates from Gibbs sampling are displayed below.


Mean:=evalf(describe[mean](l1)): StdDev:=evalf(describe[standarddeviation[1]](l1)):

burn := 100
iter := 100
alpha := 3
a := 9
x0 := 10

(Plot: truncated gamma variate against iteration number.)


alpha := 0.54
delta := 1.11
gam := 2.20
beta0 := 0.25
replic := 2000
burn := 200

xa[6]:="lambda[6]": xa[7]:="lambda[7]": xa[8]:="lambda[8]": xa[9]:="lambda[9]": xa[10]:="lambda[10]":

A:=array(1..5,1..2):

for i from 1 to 5 do:

A[i,1]:=display({
statplots[histogram](v2[2*i-1]),plot(u/k,lambda=ms[2*i-1]*0.9..m[2*i-1]*1.1,labels=[xa[2*i-1],"prior and posterior densities"],labeldirections=[HORIZONTAL,VERTICAL])},tickmarks=[4,2]):

A[i,2]:=display({
statplots[histogram](v2[2*i]),plot(u/k,lambda=ms[2*i]*0.9..m[2*i]*1.1,labels=[xa[2*i],"prior and posterior densities"],labeldirections=[HORIZONTAL,VERTICAL])},tickmarks=[4,2]):

end do:

display(A);

‘Bayes estimate failure rate pump’, 1, ‘is’, 0.05815043965

‘Bayes estimate failure rate pump’, 2, ‘is’, 0.09241283960

‘Bayes estimate failure rate pump’, 3, ‘is’, 0.08679369040

‘Bayes estimate failure rate pump’, 4, ‘is’, 0.1147498208

‘Bayes estimate failure rate pump’, 5, ‘is’, 0.5734949755


‘Bayes estimate failure rate pump’, 6, ‘is’, 0.6033146695

‘Bayes estimate failure rate pump’, 7, ‘is’, 0.7929892845

‘Bayes estimate failure rate pump’, 8, ‘is’, 0.7929892845

‘Bayes estimate failure rate pump’, 9, ‘is’, 1.508423622

‘Bayes estimate failure rate pump’, 10, ‘is’, 1.973528342

(Plots: prior and posterior densities of the failure rates for the ten pumps.)


References

Ahrens, J.H. and Dieter, U. (1980) Sampling from binomial and Poisson distributions: a method with bounded computation times. Computing, 25: 193–208.

Allen, L.J.S. (2003) An Introduction to Stochastic Processes with Applications to Biology. London: Pearson.

Anderson, S.L. (1990) Random number generators on vector supercomputers and other advanced architectures. SIAM Review, 32: 221–251.

Atkinson, A.C. (1979) The computer generation of Poisson random variables. Applied Statistics, 28: 29–35.

Atkinson, A.C. (1982) The simulation of generalised inverse Gaussian and hyperbolic random variables. SIAM Journal of Scientific and Statistical Computing, 3: 502–517.

Banks, J., Carson, J.S., Nelson, B.L. and Nicol, D.M. (2005) Discrete Event System Simulation, 4th edn. Upper Saddle River, New Jersey: Prentice Hall.

Barndorff-Nielson, O. (1977) Exponentially decreasing distributions for the logarithm of particle size. Proceedings of the Royal Society, A353: 401–419.

Beasley, J.D. and Springer, S.G. (1977) The percentage points of the normal distribution. Applied Statistics, 26: 118–121.

Besag, J. (1974) Spatial interaction and the statistical analysis of lattice systems (with discussion). Journal of the Royal Statistical Society, Series B, 36: 192–326.

Black, F. and Scholes, M. (1973) The pricing of options and corporate liabilities. Journal of Political Economy, 81: 637–654.

Borosh, I. and Niederreiter, H. (1983) Optimal multipliers for pseudo-random number generation by the linear congruential method. BIT, 23: 65–74.

Butcher, J.C. (1961) Random sampling from the normal distribution. Computer Journal, 3.

Cheng, R.C.H. (1978) Generating beta variates with nonintegral shape parameters. Communications of the Association of Computing Machinery, 21: 317–322.

Cochran, W.G. (1977) Sampling Techniques. Chichester: John Wiley & Sons, Ltd.

Cox, D.R. and Miller, H.D. (1965) The Theory of Stochastic Processes. London: Methuen.

Dagpunar, J.S. (1978) Sampling of variates from a truncated gamma distribution. Journal of Statistical Computation and Simulation, 8: 59–64.

Dagpunar, J.S. (1988a) Principles of Random Variate Generation. Oxford: Oxford University Press.

Dagpunar, J.S. (1988b) Computer generation of random variates from the tail of t and normal distributions. Communications in Statistics, Simulation and Computing, 17: 653–661.

Dagpunar, J.S. (1989a) A compact and portable Poisson random variate generator. Journal of Applied Statistics, 16: 391–393.

Dagpunar, J.S. (1989b) An easily implemented generalised inverse Gaussian generator. Communications in Statistics, Simulation and Computing, 18: 703–710.

Devroye, L. (1986) Non-uniform Random Variate Generation. Berlin: Springer-Verlag.

Devroye, L. (1996) Random variate generation in one line of code. In J.M. Charnes, D.J. Morrice, D.T. Brunner and J.J. Swain (eds), Proceedings of the 1996 Winter Simulation Conference, pp. 265–272. ACM Press.

Eberlein, E. (2001) Application of generalized hyperbolic Lévy motions to finance. In O.E. Barndorff-Nielson, T. Mikosch and S. Resnick (eds), Lévy Processes: Theory and Applications.

Fishman, G.S. (1979) Sampling from the binomial distribution on a computer. Journal of the American Statistical Association, 74: 418–423.

Fishman, G.S. and Moore, L.R. (1986) An exhaustive analysis of multiplicative congruential random number generators with modulus 2^31–1. SIAM Journal on Scientific and Statistical Computing, 7: 24–45.

Fouque, J.-P. and Tullie, T.A. (2002) Variance reduction for Monte Carlo simulation in a stochastic volatility environment. Quantitative Finance, 2: 24–30.

Gamerman, D. (1997) Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference. London: Chapman and Hall.

Gaver, D.P. and O'Muircheartaigh, I.G. (1987) Robust empirical Bayes analyses of event rates. Technometrics, 29: 1–15.

Gelfand, A.E. and Smith, A.F.M. (1990) Sampling-based approaches to calculating marginal densities. Journal of the American Statistical Association, 85: 398–409.

Gentle, J.E. (2003) Random Number Generation and Monte Carlo Methods, 2nd edn. Berlin: Springer.

Gilks, W.R., Best, N.G. and Tan, K.K.C. (1995) Adaptive rejection Metropolis sampling within Gibbs sampling. Applied Statistics, 44: 455–472.

Gilks, W.R., Richardson, S. and Spiegelhalter, D.J. (1996) Markov Chain Monte Carlo in Practice. London: Chapman and Hall.

Glasserman, P. (2004) Monte Carlo Methods in Financial Engineering. Berlin: Springer.

Glasserman, P., Heidelberger, P. and Shahabuddin, P. (1999) Asymptotically optimal importance sampling and stratification for pricing path-dependent options. Mathematical Finance, 2: 117–152.

Hammersley, J.M. and Handscombe, D.C. (1964) Monte Carlo Methods. London: Methuen.

Hastings, W.K. (1970) Monte Carlo sampling methods using Markov chains and their applications. Biometrika, 57: 97–109.

Hobson, D.G. (1998) Stochastic volatility. In D.J. Hand and S.D. Jacka (eds), Statistics and Finance, Applications of Statistics Series, pp. 283–306. London: Arnold.

Hull, J.C. (2006) Options, Futures and Other Derivatives, 6th edn. Upper Saddle River, New Jersey: Prentice Hall.

Hull, T.E. and Dobell, A.R. (1962) Random number generators. SIAM Review, 4: 230–254.

Jöhnk, M.D. (1964) Erzeugung von Betaverteilten und Gammaverteilten Zufallszahlen. Metrika, 8: 5–15.

Johnson, N.L., Kotz, S. and Balakrishnan, N. (1995) Continuous Univariate Distributions, Volume 2, 2nd edn. Chichester: John Wiley & Sons, Ltd.

© 2007 John Wiley & Sons, Ltd
