
From *Mathematical Statistics for Economics and Business* (second edition), part 1 (pages 186–191).


3.11 Means and Variances of Linear Combinations of Random Variables

Determining the mean and variance of random variables that are defined as linear combinations of other random variables is a problem that often arises in practice. While this determination can be accomplished from first principles by applying the basic definitions of mean and variance to the linear function of random variables, there are certain general results that facilitate and expedite the process. In particular, we will see that the mean and variance of a linear combination of random variables can be expressed as simple functions of the means, variances, and covariances of the random variables involved in the linear combination. Our first result concerns the determination of the mean.

Theorem 3.38 Mean of a Linear Combination

Let $Y = \sum_{i=1}^n a_i X_i = \mathbf{a}'\mathbf{X}$, where the $a_i$'s are real constants. Then

$$E(Y) = \sum_{i=1}^n a_i E(X_i) = \mathbf{a}' E(\mathbf{X}),$$

where

$$\mathbf{a} = \begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix} \quad \text{and} \quad \mathbf{X} = \begin{bmatrix} X_1 \\ \vdots \\ X_n \end{bmatrix}.$$

Proof

$$E(Y) = E\left(\sum_{i=1}^n a_i X_i\right) = \sum_{i=1}^n E(a_i X_i) = \sum_{i=1}^n a_i E(X_i) \quad \text{(Theorems 3.9 and 3.6)} \quad ∎$$

Regarding the variance of a linear combination of random variables, we have the following result.
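As a quick numerical illustration of Theorem 3.38 (the weights and means below are invented for the sketch, not taken from the text), the expectation of $\mathbf{a}'\mathbf{X}$ can be computed either term by term or as a single inner product:

```python
import numpy as np

# Hypothetical constants and means, for illustration only.
a = np.array([2.0, -1.0, 0.5])    # a_1, ..., a_n
mu = np.array([1.0, 4.0, 10.0])   # E(X_1), ..., E(X_n)

# Theorem 3.38: E(sum_i a_i X_i) = sum_i a_i E(X_i) = a'E(X).
lhs = sum(a_i * m_i for a_i, m_i in zip(a, mu))  # term-by-term sum
rhs = a @ mu                                     # inner product a'E(X)
print(lhs, rhs)  # both equal 2*1 - 1*4 + 0.5*10 = 3.0
```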

Theorem 3.39 Variance of a Linear Combination

Let $Y = \sum_{i=1}^n a_i X_i$, where the $a_i$'s are real constants. Then

$$\sigma_Y^2 = \sum_{i=1}^n a_i^2 \sigma_{X_i}^2 + 2\sum_{i<j}\sum a_i a_j \sigma_{X_i X_j} = \mathbf{a}'\,\mathrm{Cov}(\mathbf{X})\,\mathbf{a},$$

where $\mathrm{Cov}(\mathbf{X}) = E\left[(\mathbf{X} - E(\mathbf{X}))(\mathbf{X} - E(\mathbf{X}))'\right]$ is the covariance matrix of $\mathbf{X}$.

Proof

$$\begin{aligned}
\sigma_Y^2 &= E\left[(Y - E(Y))^2\right] \\
&= E\left[\left(\sum_{i=1}^n a_i (X_i - E(X_i))\right)^{2}\right] \\
&= E\left[\sum_{i=1}^n a_i^2 (X_i - E(X_i))^2 + 2\sum_{i<j}\sum a_i a_j (X_i - E(X_i))(X_j - E(X_j))\right] \\
&= \sum_{i=1}^n a_i^2 \sigma_{X_i}^2 + 2\sum_{i<j}\sum a_i a_j \sigma_{X_i X_j} = \mathbf{a}'\,E\left[(\mathbf{X} - E(\mathbf{X}))(\mathbf{X} - E(\mathbf{X}))'\right]\mathbf{a},
\end{aligned}$$

where the penultimate equality follows from Theorems 3.9 and 3.6. ∎

We formally define the notion of covariance matrix below and further motivate its content and meaning.

Definition 3.26

Covariance Matrix  The covariance matrix of an $n$-variate random variable $\mathbf{X}$ is the $(n \times n)$ symmetric matrix whose $(i,j)$th element is the covariance between $X_i$ and $X_j$, defined as $\mathrm{Cov}(\mathbf{X}) = E\left[(\mathbf{X} - E(\mathbf{X}))(\mathbf{X} - E(\mathbf{X}))'\right]$.

Note that because the covariance between $X_i$ and $X_i$ is, by definition, the variance of $X_i$, the covariance matrix has the variances of the $X_i$'s along its diagonal. In order to appreciate the full informational content of the covariance matrix, note that, by definition,

158 Chapter 3 Expectations and Moments of Random Variables

$$\mathrm{Cov}_{(n \times n)}(\mathbf{X}) = E\left(\begin{bmatrix} X_1 - E(X_1) \\ \vdots \\ X_n - E(X_n) \end{bmatrix} \left[\, X_1 - E(X_1) \;\;\cdots\;\; X_n - E(X_n) \,\right]\right)$$

$$= E\begin{bmatrix}
(X_1 - E(X_1))^2 & (X_1 - E(X_1))(X_2 - E(X_2)) & \cdots & (X_1 - E(X_1))(X_n - E(X_n)) \\
(X_2 - E(X_2))(X_1 - E(X_1)) & (X_2 - E(X_2))^2 & & \vdots \\
\vdots & & \ddots & \vdots \\
(X_n - E(X_n))(X_1 - E(X_1)) & \cdots & & (X_n - E(X_n))^2
\end{bmatrix}$$

$$= \begin{bmatrix}
\sigma_{X_1}^2 & \sigma_{X_1 X_2} & \cdots & \sigma_{X_1 X_n} \\
\sigma_{X_2 X_1} & \sigma_{X_2}^2 & & \vdots \\
\vdots & & \ddots & \vdots \\
\sigma_{X_n X_1} & \cdots & & \sigma_{X_n}^2
\end{bmatrix}.$$

Thus, the covariance matrix has the variance of the $i$th random variable displayed in the $(i,i)$th position (diagonal entry) in the matrix, and the covariance between the $i$th and $j$th random variables displayed in the $(i,j)$th position (off-diagonal entry) in the matrix. Since $\sigma_{X_i X_j} = \sigma_{X_j X_i}$, the covariance matrix is symmetric, i.e., the $(i,j)$th entry is exactly equal to the $(j,i)$th entry $\forall\, i \neq j$.
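The definition can be mirrored directly in code. A minimal sketch (with simulated data invented for illustration) that builds $E[(\mathbf{X} - E(\mathbf{X}))(\mathbf{X} - E(\mathbf{X}))']$ by averaging outer products of deviations, then confirms the diagonal and symmetry structure just described:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 3))        # 5000 draws of a 3-variate random variable
Xc = X - X.mean(axis=0)               # deviations X - E(X), using the sample mean

# Cov(X) = E[(X - E(X))(X - E(X))'], estimated by averaging outer products.
cov = (Xc.T @ Xc) / len(X)

# Diagonal entries are the variances of the X_i's.
assert np.allclose(np.diag(cov), Xc.var(axis=0))
# Symmetry: sigma_{X_i X_j} = sigma_{X_j X_i}.
assert np.allclose(cov, cov.T)
```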

Note that it is necessarily the case that the covariance matrix is a positive semidefinite matrix because $\sigma_Y^2 = \mathbf{a}'\,\mathrm{Cov}(\mathbf{X})\,\mathbf{a} \geq 0 \;\forall\, \mathbf{a}$, which necessarily follows from the fact that variances cannot be negative. (Recall that a matrix $\mathbf{Z}$ is positive semidefinite iff $\mathbf{a}'\mathbf{Z}\mathbf{a} \geq 0 \;\forall\, \mathbf{a}$.)
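Positive semidefiniteness is easy to exercise numerically: for any covariance matrix $\Sigma$ and any vector $\mathbf{a}$, $\mathbf{a}'\Sigma\mathbf{a} \geq 0$, and equivalently every eigenvalue of $\Sigma$ is nonnegative. A small sketch (the matrix values are illustrative only):

```python
import numpy as np

# A valid 2x2 covariance matrix (determinant 2*5 - 9 = 1 > 0).
sigma = np.array([[2.0, -3.0],
                  [-3.0, 5.0]])

# a' sigma a >= 0 for every a (checked here on many random vectors).
rng = np.random.default_rng(1)
for _ in range(1000):
    a = rng.normal(size=2)
    assert a @ sigma @ a >= 0.0

# Equivalent check: all eigenvalues of a symmetric PSD matrix are >= 0.
assert np.all(np.linalg.eigvalsh(sigma) >= 0)
```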

The preceding results can be extended to the case where $\mathbf{Y}$ is a vector defined by linear combinations of the $n$-variate random variable $\mathbf{X}$. We first extend the results corresponding to the mean of $\mathbf{Y}$.

Theorem 3.40 Mean of a Vector of Linear Combinations (Premultiplication)

Let $\mathbf{Y} = \mathbf{A}\mathbf{X}$, where $\mathbf{A}$ is a $k \times n$ matrix of real constants and $\mathbf{X}$ is an $n \times 1$ vector of random variables. Then $E(\mathbf{Y}) = E(\mathbf{A}\mathbf{X}) = \mathbf{A}E(\mathbf{X})$.

Proof This follows straightforwardly from Theorem 3.38 and the fact that an expectation of a vector is the vector of expectations. ∎

A useful corollary to Theorem 3.40 concerns the generalization where $\mathbf{X}$ is an $n \times p$ matrix of random variables.

Corollary 3.6 Mean of a Matrix of Linear Combinations (Premultiplication)

Let $\mathbf{Y} = \mathbf{A}\mathbf{X}$, where $\mathbf{A}$ is a $k \times n$ matrix of real constants and $\mathbf{X}$ is an $n \times p$ matrix of random variables. Then $E(\mathbf{Y}) = E(\mathbf{A}\mathbf{X}) = \mathbf{A}E(\mathbf{X})$.

Proof This follows directly from Theorem 3.40 applied columnwise to the matrix $\mathbf{A}\mathbf{X}$. ∎

If we postmultiply rather than premultiply a random matrix $\mathbf{X}$ by a conformable matrix of constants, we obtain a result on expectation qualitatively similar to the preceding result.

Corollary 3.7 Mean of a Matrix of Linear Combinations (Postmultiplication)

Let $\mathbf{Y} = \mathbf{X}\mathbf{B}$, where $\mathbf{X}$ is an $n \times p$ matrix of random variables and $\mathbf{B}$ is a $p \times m$ matrix of real constants. Then $E(\mathbf{Y}) = E(\mathbf{X}\mathbf{B}) = E(\mathbf{X})\mathbf{B}$.

Proof

$$\begin{aligned}
E(\mathbf{X}\mathbf{B}) &= E\left((\mathbf{B}'\mathbf{X}')'\right) && \text{(property of matrix transpose)} \\
&= \left(E(\mathbf{B}'\mathbf{X}')\right)' && \text{(expectation is an elementwise operator)} \\
&= \left(\mathbf{B}'E(\mathbf{X}')\right)' && \text{(Corollary 3.6)} \\
&= E(\mathbf{X})\mathbf{B} && \text{(property of matrix transpose)} \quad ∎
\end{aligned}$$

If a random matrix $\mathbf{X}$ is both premultiplied and postmultiplied by conformable matrices of real constants, then the previous two corollaries can be combined into the following result:

Corollary 3.8 Mean of a Matrix of Linear Combinations (Pre- and Postmultiplication)

Let $\mathbf{A}$ be a $k \times n$ matrix of real constants, let $\mathbf{X}$ be an $n \times p$ matrix of random variables, and let $\mathbf{B}$ be a $p \times m$ matrix of real constants. Then $E(\mathbf{A}\mathbf{X}\mathbf{B}) = \mathbf{A}E(\mathbf{X})\mathbf{B}$.

Proof Let $\mathbf{Z} = \mathbf{X}\mathbf{B}$. Then by Corollary 3.6, $E(\mathbf{A}\mathbf{X}\mathbf{B}) = E(\mathbf{A}\mathbf{Z}) = \mathbf{A}E(\mathbf{Z}) = \mathbf{A}E(\mathbf{X}\mathbf{B})$, which equals $\mathbf{A}E(\mathbf{X})\mathbf{B}$ by Corollary 3.7. ∎
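Because expectation operates elementwise, Corollaries 3.6–3.8 amount to the statement that constant matrices factor out of expectations. A quick sketch checking $E(\mathbf{A}\mathbf{X}\mathbf{B}) = \mathbf{A}E(\mathbf{X})\mathbf{B}$ exactly, using a toy random matrix with two equally likely outcomes (all matrix values invented):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(2, 3))      # k x n matrix of constants
B = rng.normal(size=(4, 2))      # p x m matrix of constants

# A toy n x p random matrix X: two equally likely outcomes X1, X2.
X1 = rng.normal(size=(3, 4))
X2 = rng.normal(size=(3, 4))
EX = 0.5 * X1 + 0.5 * X2         # E(X), taken elementwise

# E(AXB): average the transformed outcomes over the distribution of X.
E_AXB = 0.5 * (A @ X1 @ B) + 0.5 * (A @ X2 @ B)

# Corollary 3.8: E(AXB) = A E(X) B.
assert np.allclose(E_AXB, A @ EX @ B)
```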

When $\mathbf{Y} = \mathbf{A}\mathbf{X}$ is a vector of two or more random variables, we can define a variance for each $Y_i$, as well as a covariance for each pair $(Y_i, Y_j)$. We are led to a generalization of Theorem 3.39 that involves the definition of the covariance matrix of the $(k \times 1)$ random vector $\mathbf{Y} = \mathbf{A}\mathbf{X}$.

Theorem 3.41 Covariance Matrix of Linear Combination

Let $\mathbf{Y} = \mathbf{A}\mathbf{X}$, where $\mathbf{A}$ is a $k \times n$ matrix of real constants and $\mathbf{X}$ is an $n \times 1$ vector of random variables. Then $\mathrm{Cov}(\mathbf{Y}) = \mathrm{Cov}(\mathbf{A}\mathbf{X}) = \mathbf{A}\,\mathrm{Cov}(\mathbf{X})\,\mathbf{A}'$.

Proof By definition,

$$\begin{aligned}
\mathrm{Cov}(\mathbf{Y}) &= E\left[(\mathbf{Y} - E(\mathbf{Y}))(\mathbf{Y} - E(\mathbf{Y}))'\right] \\
&= E\left[\mathbf{A}(\mathbf{X} - E(\mathbf{X}))(\mathbf{X} - E(\mathbf{X}))'\mathbf{A}'\right] && \text{(substitution and Theorem 3.40)} \\
&= \mathbf{A}\,E\left[(\mathbf{X} - E(\mathbf{X}))(\mathbf{X} - E(\mathbf{X}))'\right]\mathbf{A}' && \text{(Corollary 3.8)} \\
&= \mathbf{A}\,\mathrm{Cov}(\mathbf{X})\,\mathbf{A}' && \text{(by definition)} \quad ∎
\end{aligned}$$
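Theorem 3.41 is straightforward to verify numerically. A sketch (with $\mathbf{A}$ and $\mathrm{Cov}(\mathbf{X})$ invented for illustration) that compares $\mathbf{A}\,\mathrm{Cov}(\mathbf{X})\,\mathbf{A}'$ against the sample covariance of a large number of draws of $\mathbf{Y} = \mathbf{A}\mathbf{X}$:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, -1.0]])           # k x n matrix of constants
cov_x = np.array([[2.0, 0.5],
                  [0.5, 1.0]])        # Cov(X), chosen positive definite

# Theorem 3.41: Cov(AX) = A Cov(X) A'.
cov_y = A @ cov_x @ A.T

# Sanity check against a large simulated sample of Y = AX.
rng = np.random.default_rng(3)
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov_x, size=200_000)
Y = X @ A.T                           # each row is one draw of Y
assert np.allclose(np.cov(Y, rowvar=False), cov_y, atol=0.15)
```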


We illustrate the use of some of the above theorems in the following example, where we also introduce the notion of a correlation matrix (see Example 3.28 part (g)).

Example 3.28 Calculating Means, Covariances, and Correlations of Linear Combinations

Your company sells two brands of blank recordable DVDs: Blu-ray (BR) and standard (S). The price of a package of BR disks is \$4, while the standard disks sell for \$3 a package. The quantities of the disk packages sold on any given day are represented by the bivariate random variable $\mathbf{Q} = (Q_{BR}, Q_S)'$, where

$$E(\mathbf{Q}) = \begin{bmatrix} 10 \\ 30 \end{bmatrix} \quad \text{and} \quad \mathrm{Cov}(\mathbf{Q}) = \begin{bmatrix} 2 & -3 \\ -3 & 5 \end{bmatrix}.$$

a. What is the expected value of the revenue obtained from the sale of DVDs on any given day?

Answer: Revenue (in dollars) is defined as $R = 4Q_{BR} + 3Q_S = [4 \;\; 3]\begin{bmatrix} Q_{BR} \\ Q_S \end{bmatrix}$, and Theorem 3.38 applies. Therefore,

$$E(R) = [4 \;\; 3]\begin{bmatrix} 10 \\ 30 \end{bmatrix} = 130.$$

b. What is the variance associated with daily revenue?

Answer: Theorem 3.39 applies here. We have that

$$\sigma_R^2 = [4 \;\; 3]\begin{bmatrix} 2 & -3 \\ -3 & 5 \end{bmatrix}\begin{bmatrix} 4 \\ 3 \end{bmatrix} = 5.$$
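Parts (a) and (b) reduce to two matrix products; a brief check in code, using the example's $E(\mathbf{Q})$ and $\mathrm{Cov}(\mathbf{Q}) = \begin{bmatrix} 2 & -3 \\ -3 & 5 \end{bmatrix}$ (note that the negative covariance between the two quantities is what keeps the revenue variance so small):

```python
import numpy as np

a = np.array([4.0, 3.0])          # price weights: R = 4 Q_BR + 3 Q_S
mean_q = np.array([10.0, 30.0])   # E(Q)
cov_q = np.array([[2.0, -3.0],
                  [-3.0, 5.0]])   # Cov(Q)

expected_revenue = a @ mean_q     # Theorem 3.38: a'E(Q)
revenue_variance = a @ cov_q @ a  # Theorem 3.39: a'Cov(Q)a
print(expected_revenue)  # 130.0
print(revenue_variance)  # 5.0
```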

c. Production costs per disk package are \$2.50 and \$2 for the Blu-ray and standard DVDs, respectively. Define the expected value of the vector $\begin{bmatrix} R \\ C \end{bmatrix}$, where $C = 2.5Q_{BR} + 2Q_S$ represents the total cost of DVDs sold on any given day.

Answer: Theorem 3.40 can be used here (we could also apply Theorem 3.38 to obtain $E(C)$, since we already know $E(R)$ from above):

$$E\begin{bmatrix} R \\ C \end{bmatrix} = \begin{bmatrix} 4 & 3 \\ 2.5 & 2 \end{bmatrix}\begin{bmatrix} 10 \\ 30 \end{bmatrix} = \begin{bmatrix} 130 \\ 85 \end{bmatrix}.$$

d. What is the covariance matrix of $\begin{bmatrix} R \\ C \end{bmatrix}$?

Answer: Using Theorem 3.41,

$$\mathrm{Cov}\begin{bmatrix} R \\ C \end{bmatrix} = \begin{bmatrix} 4 & 3 \\ 2.5 & 2 \end{bmatrix}\begin{bmatrix} 2 & -3 \\ -3 & 5 \end{bmatrix}\begin{bmatrix} 4 & 2.5 \\ 3 & 2 \end{bmatrix} = \begin{bmatrix} 5 & 3.5 \\ 3.5 & 2.5 \end{bmatrix}.$$

e. What is the expected level of profit on any given day?

Answer: Profit is defined as $P = R - C$, and Theorem 3.38 implies that

$$E(P) = [1 \;\; -1]\begin{bmatrix} 130 \\ 85 \end{bmatrix} = 45.$$

f. What is the variance of daily profit?

Answer: Applying Theorem 3.39 results in

$$\sigma_P^2 = [1 \;\; -1]\begin{bmatrix} 5 & 3.5 \\ 3.5 & 2.5 \end{bmatrix}\begin{bmatrix} 1 \\ -1 \end{bmatrix} = 0.5.$$
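Parts (c) through (f) chain Theorems 3.38–3.41 together; a compact numerical check of the whole sequence, using the example's price/cost weights and moments of $\mathbf{Q}$:

```python
import numpy as np

A = np.array([[4.0, 3.0],          # revenue weights: R = 4 Q_BR + 3 Q_S
              [2.5, 2.0]])         # cost weights:    C = 2.5 Q_BR + 2 Q_S
mean_q = np.array([10.0, 30.0])    # E(Q)
cov_q = np.array([[2.0, -3.0],
                  [-3.0, 5.0]])    # Cov(Q)

mean_rc = A @ mean_q               # Theorem 3.40: E([R, C]') = A E(Q)
cov_rc = A @ cov_q @ A.T           # Theorem 3.41: Cov([R, C]') = A Cov(Q) A'

w = np.array([1.0, -1.0])          # profit weights: P = R - C
expected_profit = w @ mean_rc      # Theorem 3.38
profit_variance = w @ cov_rc @ w   # Theorem 3.39

assert np.allclose(mean_rc, [130.0, 85.0])                  # part (c)
assert np.allclose(cov_rc, [[5.0, 3.5], [3.5, 2.5]])        # part (d)
assert abs(expected_profit - 45.0) < 1e-12                  # part (e)
assert abs(profit_variance - 0.5) < 1e-12                   # part (f)
```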

g. A matrix of correlations (or correlation matrix) for $\mathbf{X} = (X_1, \ldots, X_n)'$ can be defined by pre- and postmultiplication of the covariance matrix by the inverse of the diagonal matrix of standard deviations, i.e. (reader, please verify):

$$\mathrm{Corr}(\mathbf{X}) = \begin{bmatrix} \sigma_{X_1} & & \\ & \ddots & \\ & & \sigma_{X_n} \end{bmatrix}^{-1} \mathrm{Cov}(\mathbf{X}) \begin{bmatrix} \sigma_{X_1} & & \\ & \ddots & \\ & & \sigma_{X_n} \end{bmatrix}^{-1}.$$

The $(i,j)$th entry of the correlation matrix is the correlation between $X_i$ and $X_j$. Define the correlation matrix for $\mathbf{Q}$.

Answer:

$$\mathrm{Corr}(\mathbf{Q}) = \begin{bmatrix} \sqrt{2} & 0 \\ 0 & \sqrt{5} \end{bmatrix}^{-1} \begin{bmatrix} 2 & -3 \\ -3 & 5 \end{bmatrix} \begin{bmatrix} \sqrt{2} & 0 \\ 0 & \sqrt{5} \end{bmatrix}^{-1} = \begin{bmatrix} 1 & -.949 \\ -.949 & 1 \end{bmatrix}.$$

Note that $\rho_{Q_{BR}, Q_S} = -.949$, which is given by the off-diagonal elements in this $(2 \times 2)$ case, while the diagonal elements are ones because these values represent the correlation of a random variable with itself. □
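The correlation matrix in part (g) is $D^{-1}\,\mathrm{Cov}(\mathbf{Q})\,D^{-1}$ with $D = \mathrm{diag}(\sigma_{Q_{BR}}, \sigma_{Q_S})$; in code:

```python
import numpy as np

cov_q = np.array([[2.0, -3.0],
                  [-3.0, 5.0]])   # Cov(Q) from the example

# Inverse of the diagonal matrix of standard deviations.
D_inv = np.diag(1.0 / np.sqrt(np.diag(cov_q)))
corr_q = D_inv @ cov_q @ D_inv

# Unit diagonal; off-diagonal is -3 / (sqrt(2) * sqrt(5)) = -3/sqrt(10) ~ -0.949.
assert np.allclose(np.diag(corr_q), 1.0)
assert abs(corr_q[0, 1] - (-3.0 / np.sqrt(10.0))) < 1e-9
```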

