The three-cavity filter is described by the sequence
$$G\,(HL)^5 H\,LL\,(HL)^{11} H\,LL\,(HL)^{11} H\,LL\,(HL)^5 H\,G$$
Again, the values $n_G = 1.52$, $n_L = 1.46$, and $n_H = 2.3$ were used.
Appendix H Random Variables and Processes

In many places in the book, we use random variables and random processes to model noise, polarization, and network traffic. Understanding the statistical nature of these parameters is essential in predicting the performance of communication systems.

H.1 Random Variables
A random variable $X$ is characterized by a probability distribution function
$$F_X(x) = P\{X \le x\}.$$
The derivative of $F_X(x)$ is the probability density function
$$f_X(x) = \frac{dF_X(x)}{dx}.$$
Note that
$$\int_{-\infty}^{\infty} f_X(x)\,dx = 1.$$
In many cases, we will be interested in obtaining the expectation, or ensemble average, associated with this probability function. The expectation of a function $g(X)$ is defined as
$$E[g(X)] = \int_{-\infty}^{\infty} f_X(x)\,g(x)\,dx.$$
The mean of $X$ is defined to be
$$E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx,$$
and the mean square (second moment) of $X$ is
$$E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx.$$
The variance of $X$ is defined as
$$\sigma^2 = E[X^2] - (E[X])^2.$$
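As a quick numerical illustration of these definitions (a minimal sketch, not from the original text), the mean, second moment, and variance can be approximated by integrating a density on a grid. The exponential density with rate 2 used here is an arbitrary example choice.

```python
import numpy as np

# Example density: exponential with rate lam = 2 (chosen only for illustration).
lam = 2.0
dx = 1e-4
x = np.arange(0.0, 50.0, dx)      # fine grid covering essentially all the mass
f = lam * np.exp(-lam * x)        # f_X(x)

mean = np.sum(x * f) * dx              # E[X]      -> about 1/lam = 0.5
second_moment = np.sum(x**2 * f) * dx  # E[X^2]    -> about 2/lam^2 = 0.5
variance = second_moment - mean**2     # sigma^2   -> about 1/lam^2 = 0.25

print(mean, second_moment, variance)
```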
In many cases, we are interested in determining the statistical properties of two or more random variables that are not independent of each other. The joint probability distribution function of two random variables $X$ and $Y$ is defined as
$$F_{X,Y}(x, y) = P\{X \le x,\, Y \le y\}.$$
Sometimes we are given some information about one of the random variables and must estimate the distribution of the other. The conditional distribution of $X$ given $Y$ is denoted as
$$F_{X|Y}(x|y) = P\{X \le x \mid Y \le y\}.$$
An important relation between these distributions is given by Bayes' theorem:
$$F_{X|Y}(x|y) = \frac{F_{X,Y}(x, y)}{F_Y(y)}.$$
H.1.1 Gaussian Distribution
A random variable $X$ is said to follow a Gaussian distribution if its probability density function is
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/2\sigma^2}, \qquad -\infty < x < \infty.$$
Here, $\mu$ is the mean and $\sigma^2$ the variance of $X$. In order to compute bit error rates, we will need to compute the probability that $X \ge v$, which is defined as the function
$$Q(v) = \int_v^{\infty} f_X(x)\,dx.$$
This function can be numerically evaluated. For example, $Q(v) \approx 10^{-9}$ for $v = 6$, and $Q(v) \approx 10^{-15}$ for $v = 8$.
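The values quoted above can be reproduced with the complementary error function: for a zero-mean, unit-variance Gaussian, $Q(v) = \tfrac{1}{2}\,\mathrm{erfc}(v/\sqrt{2})$. A small sketch (the printed numbers are order-of-magnitude approximations):

```python
import math

def q_function(v: float) -> float:
    """Tail probability P(X >= v) for a zero-mean, unit-variance Gaussian."""
    return 0.5 * math.erfc(v / math.sqrt(2.0))

print(q_function(6.0))   # roughly 1e-9
print(q_function(8.0))   # roughly 6e-16, i.e. of order 1e-15
```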
Also, if $X$ and $Y$ are jointly distributed Gaussian random variables, then it can be proved that
$$E[X^2 Y^2] = E[X^2]\,E[Y^2] + 2\,(E[XY])^2. \tag{H.1}$$
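A quick Monte Carlo check of (H.1) for zero-mean jointly Gaussian variables, the case in which the identity is exact; the correlation coefficient 0.6 is an arbitrary choice for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
rho = 0.6                                    # arbitrary correlation for the test
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

lhs = np.mean(x**2 * y**2)
rhs = np.mean(x**2) * np.mean(y**2) + 2.0 * np.mean(x * y) ** 2
print(lhs, rhs)   # both close to 1 + 2*rho**2 = 1.72, up to Monte Carlo error
```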
H.1.2 Maxwell Distribution
The Maxwellian probability density function is useful to calculate penalties due to polarization-mode dispersion. A random variable $X$ is said to follow a Maxwellian distribution if its probability density function is
$$f_X(x) = \sqrt{\frac{2}{\pi}}\,\frac{x^2}{\alpha^3}\, e^{-x^2/2\alpha^2}, \qquad x > 0,$$
where $\alpha$ is a parameter associated with the distribution. The mean and mean-square value of $X$ can be computed as
$$E[X] = 2\alpha\sqrt{\frac{2}{\pi}}$$
and
$$E[X^2] = 3\alpha^2 = \frac{3\pi}{8}\,(E[X])^2.$$
Therefore, the variance is
$$\sigma_X^2 = E[X^2] - (E[X])^2 = \alpha^2\left(3 - \frac{8}{\pi}\right).$$
It can also be shown that
$$P(X > 3E[X]) \approx 4 \times 10^{-5}.$$
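These moments and the tail probability can be checked numerically. This sketch assumes SciPy is available; its `scipy.stats.maxwell` distribution uses the same parameterization, with `scale` playing the role of $\alpha$:

```python
from scipy.stats import maxwell

alpha = 1.0                                      # distribution parameter
mean = maxwell.mean(scale=alpha)                 # 2*alpha*sqrt(2/pi), about 1.596
second_moment = maxwell.moment(2, scale=alpha)   # 3*alpha**2
variance = maxwell.var(scale=alpha)              # alpha**2 * (3 - 8/pi), about 0.454
tail = maxwell.sf(3.0 * mean, scale=alpha)       # P(X > 3 E[X]), about 4e-5

print(mean, second_moment, variance, tail)
```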
H.1.3 Poisson Distribution
A discrete random variable $X$ takes on values from a discrete but possibly infinite set $S = \{x_1, x_2, x_3, \ldots\}$. It is characterized by a probability mass function $P(x)$, which is the probability that $X$ takes on a value $x$. The expectation of a function $g(X)$ is defined as
$$E[g(X)] = \sum_{i\,:\, x_i \in S} g(x_i)\,P(x_i).$$
$X$ is a Poisson random variable if
$$P(i) = \frac{e^{-r} r^i}{i!}, \qquad i = 0, 1, 2, \ldots,$$
where $r$ is a parameter associated with the distribution. It is easily verified that $E[X] = r$ and $\sigma_X^2 = E[X^2] - (E[X])^2 = r$.
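A short numerical verification of the Poisson mean and variance (a sketch; the parameter value 3.5 is arbitrary):

```python
import math

r = 3.5                       # Poisson parameter (arbitrary example value)
pmf = [math.exp(-r)]          # P(0) = e^{-r}
for i in range(1, 100):       # probabilities beyond i ~ 100 are negligible here
    pmf.append(pmf[-1] * r / i)   # P(i) = P(i-1) * r / i

mean = sum(i * p for i, p in enumerate(pmf))                     # -> r
variance = sum(i * i * p for i, p in enumerate(pmf)) - mean**2   # -> r

print(sum(pmf), mean, variance)   # ~1.0, ~3.5, ~3.5
```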
H.2 Random Processes

Random processes are useful to model time-varying stochastic events. A random process $X(t)$ is simply a sequence of random variables $X(t_1), X(t_2), \ldots$, one for each instant of time. The first-order probability distribution function is given by
$$F(x, t) = P\{X(t) \le x\},$$
and the first-order density function by
$$f(x, t) = \frac{\partial F(x, t)}{\partial x}.$$
The second-order distribution function is the joint distribution function
$$F(x_1, x_2, t_1, t_2) = P\{X(t_1) \le x_1,\, X(t_2) \le x_2\},$$
and the corresponding second-order density function is defined as
$$f(x_1, x_2, t_1, t_2) = \frac{\partial^2 F(x_1, x_2, t_1, t_2)}{\partial x_1\, \partial x_2}.$$
The mean of the process is
$$\mu(t) = E[X(t)] = \int_{-\infty}^{\infty} x\, f(x, t)\,dx.$$
The autocorrelation of the process is
$$R_X(t_1, t_2) = E[X(t_1) X(t_2)] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x_1 x_2\, f(x_1, x_2, t_1, t_2)\,dx_1\,dx_2.$$
The autocovariance of the process is defined as
$$L_X(t_1, t_2) = R_X(t_1, t_2) - E[X(t_1)]\,E[X(t_2)].$$
The random process is wide-sense stationary if it has a constant mean
$$E[X(t)] = \mu,$$
and its autocorrelation (and autocovariance) depends only on $\tau = t_1 - t_2$; that is, $R_X(\tau) = E[X(t)X(t+\tau)]$ and $L_X(\tau) = R_X(\tau) - \mu^2$. For a wide-sense stationary random process, the power spectral density is the Fourier transform of the autocovariance and is given by
$$S_X(f) = \int_{-\infty}^{\infty} L_X(\tau)\, e^{-i2\pi f\tau}\,d\tau.$$
Note that the variance of the random process is given by
$$\sigma_X^2 = L_X(0) = \int_{-\infty}^{\infty} S_X(f)\,df.$$
In many cases, we will represent noise introduced in the system as a stationary random process. In this case, the spectral density is useful to represent the spectral distribution of the noise. For example, in a receiver, the noise $X(t)$ and signal are sent through a lowpass filter with impulse response $h(t)$. The transfer function of the filter, $H(f)$, is the Fourier transform of its impulse response. In this case, the spectral density of the output noise process $Y(t)$ can be expressed as
$$S_Y(f) = S_X(f)\,|H(f)|^2.$$
Suppose the filter is an ideal lowpass filter with bandwidth $B_e$; that is, $H(f) = 1$ for $-B_e \le f \le B_e$, and $H(f) = 0$ otherwise. The variance of the noise process at its output is simply
$$\sigma_Y^2 = L_Y(0) = \int_{-B_e}^{B_e} S_X(f)\,df.$$
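As a numerical illustration under assumed values (the flat noise density $S_0$ and bandwidth below are example numbers, not from the original), white noise with two-sided spectral density $S_X(f) = S_0$ passed through the ideal lowpass filter has output variance $2 S_0 B_e$; the integration below simply restates this:

```python
import numpy as np

S0 = 1e-22       # assumed flat two-sided noise PSD, A^2/Hz (example value)
Be = 10e9        # assumed receiver bandwidth, 10 GHz

f = np.linspace(-Be, Be, 100_001)
Sx = np.full_like(f, S0)              # S_X(f) = S0 (white noise)
H2 = np.ones_like(f)                  # |H(f)|^2 = 1 inside the passband

df = f[1] - f[0]
sigma2 = np.sum(Sx * H2) * df         # sigma_Y^2 = integral of S_X |H|^2 df
print(sigma2, 2 * S0 * Be)            # both about 2e-12
```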
H.2.1 Poisson Random Process
Poisson random processes are used to model the arrival of photons in an optical communication system. They are also used widely to model the arrival of traffic in a communication network. The model is accurate primarily for voice calls, but it is used for other applications as well, without much real justification.
A Poisson process $X(t)$ is characterized by a rate parameter $\lambda$. For any two time instants $t_1$ and $t_2 > t_1$, $X(t_2) - X(t_1)$ is the number of arrivals during the time interval $(t_1, t_2]$. The number of arrivals during this interval follows a Poisson distribution; that is,
$$P\left(X(t_2) - X(t_1) = n\right) = \frac{e^{-\lambda(t_2 - t_1)}\,\bigl(\lambda(t_2 - t_1)\bigr)^n}{n!},$$
where $n$ is a nonnegative integer. Therefore, the mean number of arrivals during this time interval is
$$E[X(t_2) - X(t_1)] = \lambda(t_2 - t_1).$$
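A small simulation sketch of a Poisson arrival process with an assumed rate (the rate, horizon, and interval below are arbitrary example values): inter-arrival times are exponential, and the count in an interval can be compared against the mean $\lambda(t_2 - t_1)$.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 100.0                      # assumed arrival rate, arrivals per second
T = 1000.0                       # total simulated time, seconds

# Inter-arrival times of a Poisson process are independent Exp(lam) variables.
arrivals = np.cumsum(rng.exponential(1.0 / lam, size=int(2 * lam * T)))
arrivals = arrivals[arrivals < T]

t1, t2 = 100.0, 101.0            # count arrivals in the interval (t1, t2]
count = np.sum((arrivals > t1) & (arrivals <= t2))
print(count, lam * (t2 - t1))    # one random draw versus the mean, 100
```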
A Poisson process has many important properties that make it easier to analyze systems with Poisson traffic than with other forms of traffic. See [BG92] for a good summary.
H.2.2 Gaussian Random Process
In many cases, we model noise as a wide-sense stationary Gaussian random process $X(t)$. It is also common to assume that at any two instants of time $t_1 \ne t_2$, the random variables $X(t_1)$ and $X(t_2)$ are independent Gaussian variables with mean $\mu$. For such a process, we can use (H.1) and write
$$E[X^2(t_1)\,X^2(t_2)] = E[X^2(t_1)]\,E[X^2(t_2)] + 2\left(E[X(t_1)X(t_2)]\right)^2;$$
that is, the fourth-order moment of the process is determined entirely by its second-order statistics.
Further Reading

There are several good books on probability and random processes. See, for example, [Pap91, Gal99].
References

[BG92] D. Bertsekas and R. G. Gallager. Data Networks. Prentice Hall, Englewood Cliffs, NJ, 1992.

[Gal99] R. G. Gallager. Discrete Stochastic Processes. Kluwer, Boston, 1999.

[Pap91] A. Papoulis. Probability, Random Variables, and Stochastic Processes, 3rd edition. McGraw-Hill, New York, 1991.
Appendix I Receiver Noise Statistics

We start out by deriving an expression for the statistics of the photocurrent in the pin receiver, along the lines of [BL90, RH90]. It is useful to think of the photodetection process in the following way. Each time a photon hits the receiver, the receiver generates a small current pulse. Let $t_k$ denote the arrival times of photons at the receiver. Then the photocurrent generated can be expressed as
$$I(t) = \sum_{k=-\infty}^{\infty} e\,h(t - t_k), \tag{I.1}$$
where $e$ is the electronic charge and $e h(t - t_k)$ denotes the current impulse due to a photon arriving at time $t_k$. Note that since $e h(t - t_k)$ is the current due to a single electron, we must have
$$\int_{-\infty}^{\infty} e\,h(t - t_k)\,dt = e.$$
The arrival of photons may be described by a Poisson process, whose rate is given by $P(t)/hf_c$. Here, $P(t)$ is the instantaneous optical power, and $hf_c$ is the photon energy. The rate of generation of electrons may then also be considered to be a Poisson process, with rate
$$\lambda(t) = \frac{\mathcal{R}}{e}\, P(t),$$
where $\mathcal{R} = \eta e / hf_c$ is the responsivity of the photodetector, $\eta$ being the quantum efficiency.
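As a numerical illustration (the wavelength and quantum efficiency below are assumed example values, not from the original), the responsivity $\mathcal{R} = \eta e / hf_c$ and the electron generation rate at a given optical power can be computed directly:

```python
# Physical constants
e = 1.602e-19         # electronic charge, C
h = 6.626e-34         # Planck's constant, J s
c = 3.0e8             # speed of light, m/s

eta = 0.9             # assumed quantum efficiency
wavelength = 1.55e-6  # assumed operating wavelength, m

fc = c / wavelength           # optical carrier frequency, about 193 THz
R = eta * e / (h * fc)        # responsivity R = eta*e/(h*fc), about 1.12 A/W
rate = R * 1e-6 / e           # electron generation rate at P = 1 microwatt

print(R, rate)                # ~1.12 A/W and ~7e12 electrons per second
```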
To evaluate (I.1), let us break up the time axis into small intervals of length $\delta t$, with the $k$th interval being $[(k - 1/2)\delta t,\, (k + 1/2)\delta t)$. Let $N_k$ denote the number of electrons generated during the $k$th interval. Using these notations, we can rewrite (I.1) as
$$I(t) = \sum_{k=-\infty}^{\infty} e N_k\, h(t - k\,\delta t).$$
Note that since the intervals are nonoverlapping, the $N_k$ are independent Poisson random variables, with rates $\lambda(k\,\delta t)\,\delta t$.
We will first compute the mean value and autocorrelation functions of the photocurrent for a given optical power $P(\cdot)$. The mean value of the photocurrent is
$$E[I(t) \mid P(\cdot)] = \sum_{k=-\infty}^{\infty} e\, E[N_k]\, h(t - k\,\delta t) = \sum_{k=-\infty}^{\infty} e\, \lambda(k\,\delta t)\,\delta t\, h(t - k\,\delta t).$$
In the limit when $\delta t \to 0$, this can be rewritten as
$$E[I(t) \mid P(\cdot)] = e\int_{-\infty}^{\infty} \lambda(\tau)\, h(t - \tau)\,d\tau = \mathcal{R}\int_{-\infty}^{\infty} P(\tau)\, h(t - \tau)\,d\tau.$$
Likewise, the autocorrelation of the photocurrent can be written as
$$\begin{aligned}
E[I(t_1) I(t_2) \mid P(\cdot)] &= \int_{-\infty}^{\infty} e^2 \lambda(\tau)\, h(t_1 - \tau)\, h(t_2 - \tau)\,d\tau + E[I(t_1) \mid P(\cdot)]\, E[I(t_2) \mid P(\cdot)] \\
&= e\mathcal{R} \int_{-\infty}^{\infty} P(\tau)\, h(t_1 - \tau)\, h(t_2 - \tau)\,d\tau \\
&\quad + \mathcal{R}^2 \int_{-\infty}^{\infty} P(\tau)\, h(t_1 - \tau)\,d\tau \int_{-\infty}^{\infty} P(\tau)\, h(t_2 - \tau)\,d\tau.
\end{aligned}$$
An ideal photodetector generates pure current impulses for each received photon. For such a detector, $h(t) = \delta(t)$, where $\delta(t)$ is the impulse function with the properties that $\delta(t) = 0$ for $t \ne 0$ and $\int_{-\infty}^{\infty} \delta(t)\,dt = 1$. For this case, the mean photocurrent becomes
$$E[I(t) \mid P(\cdot)] = \mathcal{R}P(t), \tag{I.2}$$
and its autocorrelation is
$$E[I(t_1) I(t_2) \mid P(\cdot)] = e\mathcal{R}P(t_1)\,\delta(t_2 - t_1) + \mathcal{R}^2 P(t_1) P(t_2).$$
The autocovariance of $I(t)$ is then given as
$$L_I(t_1, t_2) = e\mathcal{R}\, E[P(t_1)]\,\delta(t_2 - t_1) + \mathcal{R}^2 L_P(t_1, t_2), \tag{I.3}$$
where $L_P$ denotes the autocovariance of $P(t)$.
I.1 Shot Noise

First let us consider the simple case when there is a constant power $P$ incident on the receiver. For this case, $E[P(\tau)] = P$ and $L_P(s) = 0$, and (I.2) and (I.3) can be written as
$$E[I(t)] = \mathcal{R}P$$
and
$$L_I(s) = e\mathcal{R}P\,\delta(s),$$
where $s = t_2 - t_1$. The power spectral density of the photocurrent is the Fourier transform of the autocovariance and is given by
$$S_I(f) = \int_{-\infty}^{\infty} L_I(s)\, e^{-i2\pi f s}\,ds = e\mathcal{R}P.$$
Thus the shot noise current can be thought of as being a white noise process with a flat spectral density as given here. Within a receiver bandwidth of $B_e$, the shot noise power is given by
$$\sigma_{\text{shot}}^2 = \int_{-B_e}^{B_e} S_I(f)\,df = 2e\mathcal{R}P B_e.$$
Therefore, the photocurrent can be written as
$$I = \bar{I} + i_s,$$
where $\bar{I} = \mathcal{R}P$ and $i_s$ is the shot noise current with zero mean and variance $2e\mathcal{R}P B_e$.
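A closing numerical sketch of the shot noise expressions above, using assumed example values (1 A/W responsivity, 1 μW received power, 10 GHz receiver bandwidth; these numbers are illustrative, not from the original):

```python
import math

e = 1.602e-19     # electronic charge, C
R = 1.0           # assumed responsivity, A/W
P = 1e-6          # assumed received optical power, W
Be = 10e9         # assumed receiver bandwidth, Hz

I_mean = R * P                      # mean photocurrent, 1 microampere
var_shot = 2 * e * R * P * Be       # shot noise variance, about 3.2e-15 A^2
sigma_shot = math.sqrt(var_shot)    # rms shot noise current, about 57 nA

print(I_mean, var_shot, sigma_shot)
```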