Introduction to Probability, Part 7




[Figure 7.5: Rolling a fair die, 60 rolls per experiment.]



The density of the sum Sn = X1 + X2 + · · · + Xn of n independent random variables is given by

fSn(x) = (fX1 ∗ fX2 ∗ · · · ∗ fXn)(x),

where the right-hand side is an n-fold convolution. It is possible to calculate this density for general values of n in certain simple cases.
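As an illustrative aside (not from the text), such an n-fold convolution can be approximated numerically by repeated discrete convolution. A minimal sketch in Python, where the grid step dx and the uniform example are our own choices:

```python
import numpy as np

def nfold_convolution(f, n, dx):
    """Approximate the n-fold convolution of a density sampled on a grid of step dx."""
    g = f
    for _ in range(n - 1):
        # np.convolve computes the discrete convolution; multiplying by dx
        # turns it into an approximation of the convolution integral.
        g = np.convolve(g, f) * dx
    return g

# Example: density of S_3 for X_i uniform on [0, 1].
dx = 0.001
x = np.arange(0.0, 1.0, dx)
f = np.ones_like(x)                  # f_X = 1 on [0, 1]
g = nfold_convolution(f, 3, dx)
print("total mass ~", g.sum() * dx)  # should be close to 1
```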

Example 7.9 Suppose the Xi are uniformly distributed on the interval [0, 1]. Then

fXi(x) =

1, if 0 ≤ x ≤ 1,

0, otherwise,

and fSn(x) is given by the formula

fSn(x) =

(1/(n − 1)!) Σ0≤j≤x (−1)^j C(n, j) (x − j)^(n−1), if 0 < x < n,

0, otherwise.

The density fSn(x) for n = 2, 4, 6, 8, 10 is shown in Figure 7.6.
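A quick Monte Carlo spot check of this closed form (a sketch; the sample size, evaluation point, and bin width are arbitrary choices of ours):

```python
import math
import random

def uniform_sum_density(x, n):
    """Density of the sum of n independent Uniform(0, 1) variables (formula above)."""
    if not 0 < x < n:
        return 0.0
    s = sum((-1) ** j * math.comb(n, j) * (x - j) ** (n - 1)
            for j in range(int(x) + 1))
    return s / math.factorial(n - 1)

n, trials, width, target = 4, 100_000, 0.1, 2.0
# Estimate the density near x = target by counting samples in a small window.
hits = sum(1 for _ in range(trials)
           if abs(sum(random.random() for _ in range(n)) - target) < width / 2)
print(hits / (trials * width), "vs", uniform_sum_density(target, n))
```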

If the Xi are distributed normally, with mean 0 and variance 1, then (cf. Example 7.5)

fSn(x) = (1/√(2πn)) e^(−x²/2n),

and the density fSn is shown in Figure 7.7.

[Figure 7.7: Densities fSn for sums of standard normal random variables.]

If the Xi are all exponentially distributed, with mean 1/λ, then

fXi(x) = λe^(−λx),

and

fSn(x) = λe^(−λx) (λx)^(n−1) / (n − 1)!.

In this case the density fSn for n = 2, 4, 6, 8, 10 is shown in Figure 7.8. □
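The same kind of spot check works here (a sketch; λ = 1 and the evaluation point are our own choices):

```python
import math
import random

def erlang_density(x, n, lam):
    """Density of the sum of n independent Exponential(lam) variables (formula above)."""
    return lam * math.exp(-lam * x) * (lam * x) ** (n - 1) / math.factorial(n - 1)

n, lam, trials, width, target = 5, 1.0, 100_000, 0.1, 4.0
hits = sum(1 for _ in range(trials)
           if abs(sum(random.expovariate(lam) for _ in range(n)) - target) < width / 2)
print(hits / (trials * width), "vs", erlang_density(target, n, lam))
```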

Exercises

1 Let X and Y be independent real-valued random variables with density functions fX(x) and fY(y), respectively. Show that the density function of the sum X + Y is the convolution of the functions fX(x) and fY(y). Hint: Let X̄ be the joint random variable (X, Y). Then the joint density function of X̄ is fX(x)fY(y), since X and Y are independent. Now compute the probability that X + Y ≤ z, by integrating the joint density function over the appropriate region in the plane. This gives the cumulative distribution function of Z. Now differentiate this function with respect to z to obtain the density function of Z.
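The two steps in the hint, written out as the standard identities (not specific to this text):

```latex
F_Z(z) = P(X + Y \le z)
       = \int_{-\infty}^{\infty} \int_{-\infty}^{z-x} f_X(x)\, f_Y(y)\, dy\, dx,
\qquad
f_Z(z) = \frac{d}{dz} F_Z(z)
       = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx .
```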

2 Let X and Y be independent random variables defined on the space Ω, with density functions fX and fY, respectively. Suppose that Z = X + Y. Find the density fZ of Z if

(a)

fX(x) = fY(x) =

1/2, if −1 ≤ x ≤ 1,

0, otherwise

(b)

fX(x) = fY(x) =

1/2, if 3 ≤ x ≤ 5,

0, otherwise

(c)

fX(x) =

1/2, if −1 ≤ x ≤ 1,

0, otherwise

fY(x) =

1/2, if 3 ≤ x ≤ 5,

0, otherwise

(d) What can you say about the set E = { z : fZ(z) > 0 } in each case?

3 Suppose again that Z = X + Y. Find fZ if

(a)

fX(x) = fY(x) =

x/2, if 0 < x < 2,

0, otherwise

(b)

fX(x) = fY(x) =

(1/2)(x − 3), if 3 < x < 5,

0, otherwise

(c)

fX(x) =

1/2, if 0 < x < 2,

0, otherwise,

fY(x) =

x/2, if 0 < x < 2,

0, otherwise

(d) What can you say about the set E = { z : fZ(z) > 0 } in each case?

4 Let X, Y, and Z be independent random variables with

fX(x) = fY(x) = fZ(x) =

1, if 0 < x < 1,

0, otherwise.

Find the density of X + Y + Z.

9 Assume that the service time for a customer at a bank is exponentially distributed with mean service time 2 minutes. Let X be the total service time for 10 customers. Estimate the probability that X > 22 minutes.

10 Let X1, X2, ..., Xn be n independent random variables each of which has an exponential density with mean µ. Let M be the minimum value of the Xj. Show that the density for M is exponential with mean µ/n. Hint: Use cumulative distribution functions.

11 A company buys 100 lightbulbs, each of which has an exponential lifetime of 1000 hours. What is the expected time for the first of these bulbs to burn out? (See Exercise 10.)

12 An insurance company assumes that the time between claims from each of its homeowners' policies is exponentially distributed with mean µ. It would like to estimate µ by averaging the times for a number of policies, but this is not very practical since the time between claims is about 30 years. At Galambos'⁵ suggestion the company puts its customers in groups of 50 and observes the time of the first claim within each group. Show that this provides a practical way to estimate the value of µ.

13 Particles are subject to collisions that cause them to split into two parts, with each part a fraction of the parent. Suppose that this fraction is uniformly distributed between 0 and 1. Following a single particle through several splittings we obtain a fraction of the original particle Zn = X1 · X2 · ... · Xn, where each Xj is uniformly distributed between 0 and 1. Show that the density for the random variable Zn is

fn(z) = (−log z)^(n−1) / (n − 1)!.

14 Assume that X1 and X2 are independent random variables, each having an exponential density with parameter λ. Show that Z = X1 − X2 has density

fZ(z) = (1/2)λe^(−λ|z|).

15 … Then for a fair coin Z has approximately a chi-squared distribution with 2 − 1 = 1 degree of freedom. Verify this by computer simulation, first for a fair coin (p = 1/2) and then for a biased coin (p = 1/3).

⁵J. Galambos, Introductory Probability Theory (New York: Marcel Dekker, 1984), p. 159.


16 Verify your answers in Exercise 2(a) by computer simulation: Choose X and Y from [−1, 1] with uniform density and calculate Z = X + Y. Repeat this experiment 500 times, recording the outcomes in a bar graph on [−2, 2] with 40 bars. Does the density fZ calculated in Exercise 2(a) describe the shape of your bar graph? Try this for Exercises 2(b) and 2(c), too.
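A minimal sketch of such a simulation (matplotlib is our choice of plotting tool; the rest follows the exercise):

```python
import random
import matplotlib.pyplot as plt

trials = 500
# Exercise 2(a): X and Y uniform on [-1, 1], Z = X + Y.
z = [random.uniform(-1, 1) + random.uniform(-1, 1) for _ in range(trials)]

# Bar graph on [-2, 2] with 40 bars; density=True normalizes the bar areas
# so the histogram is directly comparable with the density f_Z.
plt.hist(z, bins=40, range=(-2, 2), density=True)
plt.xlabel("z")
plt.ylabel("estimated density")
plt.show()
```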

17 Verify your answers to Exercise 3 by computer simulation.

18 Verify your answer to Exercise 4 by computer simulation.

19 The support of a function f(x) is defined to be the set

{x : f(x) > 0}.

Suppose that X and Y are two continuous random variables with density functions fX(x) and fY(y), respectively, and suppose that the supports of these density functions are the intervals [a, b] and [c, d], respectively. Find the support of the density function of the random variable X + Y.

20 Let X1, X2, ..., Xn be a sequence of independent random variables, all having a common density function fX with support [a, b] (see Exercise 19). Let Sn = X1 + X2 + · · · + Xn, with density function fSn. Show that the support of fSn is the interval [na, nb]. Hint: Write fSn = fSn−1 ∗ fX. Now use Exercise 19 to establish the desired result by induction.

21 Let X1, X2, ..., Xn be a sequence of independent random variables, all having a common density function fX. Let A = Sn/n be their average. Find fA if

(a) fX(x) = (1/√(2π)) e^(−x²/2) (normal density)

(b) fX(x) = e^(−x) (exponential density)

Hint: Write fA(x) in terms of fSn(x).


Chapter 8: Law of Large Numbers

8.1 Law of Large Numbers for Discrete Random Variables

We are now in a position to prove our first fundamental theorem of probability. We have seen that an intuitive way to view the probability of a certain outcome is as the frequency with which that outcome occurs in the long run, when the experiment is repeated a large number of times. We have also defined probability mathematically as a value of a distribution function for the random variable representing the experiment. The Law of Large Numbers, which is a theorem proved about the mathematical model of probability, shows that this model is consistent with the frequency interpretation of probability. This theorem is sometimes called the law of averages. To find out what would happen if this law were not true, see the article by Robert M. Coates.¹

Theorem 8.1 (Chebyshev Inequality) Let X be a discrete random variable with expected value µ = E(X), and let ε > 0 be any positive real number. Then

P(|X − µ| ≥ ε) ≤ V(X)/ε².

Proof. Let m(x) denote the distribution function of X. Then the probability that X differs from µ by at least ε is given by

P(|X − µ| ≥ ε) = Σ|x−µ|≥ε m(x).

We know that

V(X) = Σx (x − µ)² m(x) ≥ Σ|x−µ|≥ε (x − µ)² m(x),

since all the summands are positive and we have restricted the range of summation in the second sum. But this last sum is at least

Σ|x−µ|≥ε ε² m(x) = ε² Σ|x−µ|≥ε m(x) = ε² P(|X − µ| ≥ ε).

So,

P(|X − µ| ≥ ε) ≤ V(X)/ε². □

Example 8.1 Let X be any random variable with E(X) = µ and V(X) = σ². Then, if ε = kσ, Chebyshev's Inequality states that

P(|X − µ| ≥ kσ) ≤ σ²/(k²σ²) = 1/k².

Thus, for any random variable, the probability of a deviation from the mean of more than k standard deviations is ≤ 1/k². If, for example, k = 5, 1/k² = .04. □

Chebyshev's Inequality is the best possible inequality in the sense that, for any ε > 0, it is possible to give an example of a random variable for which Chebyshev's Inequality is in fact an equality. To see this, given ε > 0, choose X with distribution

pX = ( −ε   +ε )
     ( 1/2  1/2 )

Then E(X) = 0, V(X) = ε², and

P(|X − µ| ≥ ε) = V(X)/ε² = 1.
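An empirical look at this extreme example (a sketch; ε = 0.5 and the sample size are arbitrary choices of ours):

```python
import random

eps = 0.5            # an arbitrary choice of epsilon
trials = 100_000
# X takes the values -eps and +eps, each with probability 1/2.
xs = [random.choice([-eps, eps]) for _ in range(trials)]

var = sum(x * x for x in xs) / trials                # E(X) = 0, so V(X) = E(X^2)
frac = sum(1 for x in xs if abs(x) >= eps) / trials  # |X - mu| >= eps, with mu = 0

print("V(X) ~", var, "  P(|X - mu| >= eps) =", frac, "  bound:", var / eps**2)
```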

We are now prepared to state and prove the Law of Large Numbers.


Law of Large Numbers

Theorem 8.2 (Law of Large Numbers) Let X1, X2, ..., Xn be an independent trials process, with finite expected value µ = E(Xj) and finite variance σ² = V(Xj). Let Sn = X1 + X2 + · · · + Xn. Then for any ε > 0,

P(|Sn/n − µ| ≥ ε) → 0

as n → ∞. Equivalently,

P(|Sn/n − µ| < ε) → 1

as n → ∞.

Proof. Since X1, X2, ..., Xn are independent and have the same distributions, we have

V(Sn) = nσ²,

and

V(Sn/n) = σ²/n.

Also we know that

E(Sn/n) = µ.

By Chebyshev's Inequality, for any ε > 0,

P(|Sn/n − µ| ≥ ε) ≤ σ²/(nε²).

Thus, for fixed ε,

P(|Sn/n − µ| ≥ ε) → 0

as n → ∞, or equivalently,

P(|Sn/n − µ| < ε) → 1

as n → ∞. □


8 A fair coin is tossed a large number of times. Does the Law of Large Numbers assure us that, if n is large enough, with probability > .99 the number of heads that turn up will not deviate from n/2 by more than 100?

9 In Exercise 6.2.15, you showed that, for the hat check problem, the number Sn of people who get their own hats back has E(Sn) = V(Sn) = 1. Using Chebyshev's Inequality, show that P(Sn ≥ 11) ≤ .01 for any n ≥ 11.

10 Let X be any random variable which takes on values 0, 1, 2, ..., n and has E(X) = V(X) = 1. Show that, for any positive integer k,

P(X ≥ k + 1) ≤ 1/k².

11 We have two coins: one is a fair coin and the other is a coin that produces heads with probability 3/4. One of the two coins is picked at random, and this coin is tossed n times. Let Sn be the number of heads that turns up in these n tosses. Does the Law of Large Numbers allow us to predict the proportion of heads that will turn up in the long run? After we have observed a large number of tosses, can we tell which coin was chosen? How many tosses suffice to make us 95 percent sure?

12 (Chebyshev⁸) Assume that X1, X2, ..., Xn are independent random variables with possibly different distributions and let Sn be their sum. Let mk = E(Xk), σk² = V(Xk), and Mn = m1 + m2 + · · · + mn. Assume that σk² < R for all k. Prove that, for any ε > 0,

P(|Sn/n − Mn/n| < ε) → 1

as n → ∞.

*14 Prove the following analogue of Chebyshev's Inequality:

P(|X − E(X)| ≥ ε) ≤ (1/ε) E(|X − E(X)|).

⁸P. L. Chebyshev, "On Mean Values," J. Math. Pure Appl., vol. 12 (1867), pp. 177–184.


*15 We have proved a theorem often called the "Weak Law of Large Numbers." Most people's intuition and our computer simulations suggest that, if we toss a coin a sequence of times, the proportion of heads will really approach 1/2; that is, if Sn is the number of heads in n times, then we will have

An = Sn/n → 1/2

as n → ∞. Of course, we cannot be sure of this since we are not able to toss the coin an infinite number of times, and, if we could, the coin could come up heads every time. However, the "Strong Law of Large Numbers," proved in more advanced courses, states that

P(Sn/n → 1/2) = 1.

Could we assign the equiprobable measure to this space? (See Example 2.18.)

*16 In this exercise, we shall construct an example of a sequence of random variables that satisfies the weak law of large numbers, but not the strong law. The distribution of Xi will have to depend on i, because otherwise both laws would be satisfied. (This problem was communicated to us by David Maslen.) Suppose we have an infinite sequence of mutually independent events A1, A2, .... Let ai = P(Ai), and let r be a positive integer.

(a) Find an expression for the probability that none of the Ai with i ≥ r occur.

(b) Use the fact that 1 − x ≤ e^(−x) to show that

P(no Ai with i ≥ r occurs) ≤ e^(−Σ_{i=r}^{∞} ai).

(c) (The second Borel-Cantelli lemma) Prove that if Σ_{i=1}^{∞} ai diverges, then

P(infinitely many Ai occur) = 1.

Now, let Xi be a sequence of mutually independent random variables such that for each positive integer i ≥ 2,

P(Xi = i) = 1/(2i log i),   P(Xi = −i) = 1/(2i log i),   P(Xi = 0) = 1 − 1/(i log i).

When i = 1 we let Xi = 0 with probability 1. As usual we let Sn = X1 + · · · + Xn. Note that the mean of each Xi is 0.


(d) Find the variance of Sn.

(e) Show that the sequence ⟨Xi⟩ satisfies the Weak Law of Large Numbers, i.e., prove that for any ε > 0,

P(|Sn/n| ≥ ε) → 0

as n → ∞.

Suppose that Sn/n → 0. Then, since

Xn/n = Sn/n − ((n − 1)/n)(Sn−1/(n − 1)),

we know that Xn/n → 0. From the definition of limits, we conclude that the inequality |Xi| ≥ (1/2)i can only be true for finitely many i.

(f) Let Ai be the event |Xi| ≥ (1/2)i. Find P(Ai). Show that Σ_{i=1}^{∞} P(Ai) diverges (use the Integral Test).

(g) Prove that Ai occurs for infinitely many i.

*17 Let us toss a biased coin that comes up heads with probability p and assume the validity of the Strong Law of Large Numbers as described in Exercise 15. Then, with probability 1,

Sn/n → p

as n → ∞.


8.2 Law of Large Numbers for Continuous Random Variables

In the previous section we discussed in some detail the Law of Large Numbers for discrete probability distributions. This law has a natural analogue for continuous probability distributions, which we consider somewhat more briefly here.

Chebyshev Inequality

Just as in the discrete case, we begin our discussion with the Chebyshev Inequality.

Theorem 8.3 (Chebyshev Inequality) Let X be a continuous random variable with density function f(x). Suppose X has a finite expected value µ = E(X) and finite variance σ² = V(X). Then for any positive number ε > 0 we have

P(|X − µ| ≥ ε) ≤ σ²/ε². □

The proof is completely analogous to the proof in the discrete case, and we omit it.

Note that this theorem says nothing if σ² = V(X) is infinite.

Example 8.4 Let X be any continuous random variable with E(X) = µ and V(X) = σ². Then, if ε = kσ = k standard deviations for some integer k, then

P(|X − µ| ≥ kσ) ≤ σ²/(k²σ²) = 1/k²,

just as in the discrete case. □

Law of Large Numbers

With the Chebyshev Inequality we can now state and prove the Law of Large Numbers for the continuous case.

Theorem 8.4 (Law of Large Numbers) Let X1, X2, ..., Xn be an independent trials process with a continuous density function f, finite expected value µ, and finite variance σ². Let Sn = X1 + X2 + · · · + Xn be the sum of the Xi. Then for any real number ε > 0 we have

lim_{n→∞} P(|Sn/n − µ| < ε) = 1.
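A brief simulation of the continuous case (a sketch; Exponential(1) variables, with µ = 1, are our own choice):

```python
import random

# Averages of n Exponential(1) variables; by the Law of Large Numbers
# the average should approach mu = 1 as n grows.
for n in (10, 100, 1_000, 10_000):
    avg = sum(random.expovariate(1.0) for _ in range(n)) / n
    print(f"n = {n:>5}:  S_n/n = {avg:.4f}")
```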

… finally, this one thing seems to follow: that if observations of all events were to be continued throughout all eternity, (and hence the ultimate probability would tend toward perfect certainty), … so to speak, a certain fate. I now know whether Plato wished to aim at this in his doctrine of the universal return of things, according to which he predicted that all things will return to …
