repetitive sequence. Note that the "flip" of x(n) does not occur as it did in Eq. (5-4) for convolution. We only want to compare the sequence with an exact time-shifted replica. Note also the division by N: C_A(τ) is by definition a time-averaged value for each τ, and convolution is not. As such, it measures the average power commonality of the two sequences as a function of their separation in time. When the shift τ = 0, C_A(τ) = C_A(0), and Eq. (6-12) reduces to Eq. (6-5), which is by definition the average power for (x + εx)(n).
Figure 6-4 is an example of the autocorrelation of a sequence in part (a) (no noise) and the identical shifted (τ = 13) sequence in part (b). There are three overlaps, and the values of the autocorrelation vs. overlap, which is the sum of partial products (polynomial multiplication), are shown in part (c). The correlation value for τ = 13 is

C_A(13) = [(1)(0.1875) + (0.9375)(0.125) + (0.875)(0.0625)]/16 = 0.0225

This value is indicated in part (c), third from the left and also third from the right. This procedure is repeated for each value of τ. At τ = 0, parts (a) and (b) are fully overlapping, and the value shown in part (c) is 0.365. For these two identical sequences, the maximum autocorrelation occurs at τ = 0, and the value 0.365 is the average power in the sequence. Compare Fig. 6-4 with Fig. 5-4 to see how circular autocorrelation is performed. We can also see that x1(n) and x2(n) have 16 positions and the autocorrelation sequence has 33 (= 16 + 16 + 1) positions, which demonstrates the same smoothing and stretching effect in autocorrelation that we saw in convolution. As we decided in Chapter 5, the extra effort in circular correlation is not usually necessary, and we can work around it.
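As a check on the arithmetic above, here is a minimal Python sketch of the time-averaged autocorrelation, assuming the ramp sequence of Fig. 6-4 is x(n) = 1 − n/16 for n = 0 to 15 and zero elsewhere (an inference from the plotted values 1.0, 0.9375, 0.875, ..., not a definition taken from the worksheet):

```python
# Time-averaged autocorrelation, as in Eq. (6-12): no flip of x(n),
# just a shift, and a division by N for the time average.
# Assumed sequence from Fig. 6-4: x(n) = 1 - n/16, n = 0..15.
N = 16
x = [1.0 - n / 16 for n in range(N)]

def autocorr(x, tau, N):
    # Sum of partial products over the overlap region, then average.
    return sum(x[n] * x[n + tau] for n in range(N - tau)) / N

print(round(autocorr(x, 13, N), 4))  # 0.0225, matching the text
print(round(autocorr(x, 0, N), 3))   # 0.365, the average power
```

The three partial products at τ = 13 are exactly the ones written out above; at τ = 0 the result reduces to the average power, as Eq. (6-5) requires.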
Cross-Correlation
Two different waveforms can be completely or partially dependent, or completely independent. In each of these cases the two noise-contaminated waveforms are time-shifted with respect to each other in increments of τ.
[Figure 6-4 worksheet (Mathcad), reconstructed: N := 16; n := −N, −N + 1 .. N; τ := −N, −N + 1 .. N; x1(n) := 1 − n/16 for 0 ≤ n ≤ N, 0 otherwise; x2(n) defined identically; z(τ) := (1/N)·Σ_{n=0}^{N} x1(n)·x2(τ + n). Panels: (a) x1(n), (b) x2(n + 13), (c) z(τ).]

Figure 6-4 Example of autocorrelation.
Eq. (6-13) gives the cross-correlation of the two different waves x(n) and y(n):

C_C(τ) = (1/N) Σ_{n=0}^{N−1} (x + εx)(n) · (y + εy)(n + τ)    (6-13)
We have pointed out one major difference between the correlation and convolution equations: in correlation there is no "flip" of one of the waves, as explained in Chapter 7. This is in agreement with the desire to compare a wave with a time-shifted replica of itself, or a replica of two different waves, one of which is time-shifted with respect to the other. In the case of convolution we derived a useful relationship for the Fourier transform of convolution. In Chapter 7, correlation leads to another useful idea in linear analysis, called the Wiener-Khintchine (see Google, e.g.) principle. Figure 6-5 (with no noise) is an example of cross-correlation. The two time-domain sequences can have different lengths, different shapes, and different amplitude scale factors. The maximum value of cross-correlation occurs at τ = −3 and −4, which is quite a bit different from Fig. 6-4.
At τ = 0 the correlation is 0.096, and at τ = − 3 and − 4 the correla-tion is about 0.149, so the correlacorrela-tion in the overlap area increases 10 log(0.149/0.096)= 1.90 dB Recall that for each value of τ the area of overlap (sum of products as in Fig 6-4) of the two sequences represents
a value of common power This value is the power that the two different
waves deliver in combination
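The behavior described above can be sketched in a few lines of Python. The exponential shapes below are illustrative stand-ins loosely modeled on Fig. 6-5, not the book's exact worksheet definitions, so only the qualitative result (a peak away from τ = 0) and the decibel arithmetic are checked:

```python
import math

# Sketch of Eq. (6-13): time-averaged cross-correlation of two
# different waves. The sequences are assumptions, not the book's.
N = 16
x = [1.0 - math.exp(-0.25 * n) for n in range(N)]  # rising wave
y = [math.exp(-0.25 * n) for n in range(N)]        # decaying wave

def crosscorr(x, y, tau, N):
    # No flip: y is only shifted by tau; out-of-range samples are zero.
    total = 0.0
    for n in range(N):
        if 0 <= n + tau < N:
            total += x[n] * y[n + tau]
    return total / N

# Unlike the autocorrelation of identical waves, the peak need not
# fall at tau = 0 when the two waves are different:
z = {t: crosscorr(x, y, t, N) for t in range(-N, N + 1)}
print(max(z, key=z.get) != 0)  # True

# The text's comparison of the peak with tau = 0, in decibels:
print(round(10 * math.log10(0.149 / 0.096), 1))  # 1.9 dB
```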
The correlation sequences in Figs. 6-4 and 6-5 are τ-domain power sequences. These power sequences can also have complex (real watts and imaginary vars) frequency-domain components, just like any other time-domain sequence. The result is a power spectrum (Chapter 7) of the correlation parameter τ.
AUTOCOVARIANCE
The calculation of autocorrelation can produce an average term, perhaps dc, which may not be useful or desired for statistical analysis reasons and should be eliminated. To accomplish this, the autocovariance equation (6-14) subtracts the average value from each sequence.
[Figure 6-5 worksheet (Mathcad), reconstructed: x1(n) := 1 − exp(−n·0.25) for n ≥ 0, 0 otherwise; x2(n) := exp(−n·0.25) for n > 0, 0 otherwise; z(τ) := (1/N)·Σ_{n=0}^{N−1} x1(n)·x2(τ + n). Panels: (a) x1(n), (b) x2(n + 15), (c) z(τ).]

Figure 6-5 Example of cross-correlation.
This reduces the result to the ac value expected. Many repetitions of Eq. (6-14), followed by averaging, can greatly improve the accuracy. This equation also leads to an ac energy or power result as a function of τ. If τ = 0, the result is the average ac signal plus noise power in the x(n) signal.
C_acv(τ) = (1/N) Σ_{n=0}^{N−1} [(x + ε)(n) − x̄] · [(x + ε)(n + τ) − x̄]    (6-14)

where x̄ is the average value of the x(n) sequence.
An example of autocovariance is the same as Fig. 6-4, modified to remove the dc component.
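A short sketch of Eq. (6-14), again using the assumed noise-free ramp of Fig. 6-4 (x(n) = 1 − n/16 for n = 0..15, an inference from the plotted values):

```python
# Sketch of autocovariance, Eq. (6-14): subtract the mean (dc) value
# before correlating, so only the ac part remains.
N = 16
x = [1.0 - n / 16 for n in range(N)]
x_bar = sum(x) / N  # the dc term to be eliminated

def autocov(x, tau, N, x_bar):
    total = 0.0
    for n in range(N - tau):
        total += (x[n] - x_bar) * (x[n + tau] - x_bar)
    return total / N

# At tau = 0 this is the average ac power: the autocorrelation value
# 0.365 minus the dc power x_bar**2.
print(round(autocov(x, 0, N, x_bar), 3))  # 0.083
```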
Cross-Covariance
The same modification of the cross-correlation of two separate waves, x(n) and y(n), eliminating x̄ and ȳ, produces the cross-covariance:
C_ccv(τ) = (1/N) Σ_{n=0}^{N−1} [(x + εx)(n) − x̄] · [(y + εy)(n + τ) − ȳ]    (6-15)
The cross-covariance is the ac signal power plus noise power that is common to x(n) and y(n) as a function of the shift τ. The result is the relatedness of the two ac signal powers. At any value of τ the result is the total common ac power for that τ. Again, the result should be averaged over many repetitions.
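Eq. (6-15) can be sketched the same way. The sequences below are the same illustrative stand-ins used earlier (assumptions, not the book's worksheet); the point shown is only the sign of the result:

```python
import math

# Sketch of cross-covariance, Eq. (6-15): remove each wave's own
# mean, then cross-correlate the remaining ac parts.
N = 16
x = [1.0 - math.exp(-0.25 * n) for n in range(N)]  # rising
y = [math.exp(-0.25 * n) for n in range(N)]        # decaying
x_bar = sum(x) / N
y_bar = sum(y) / N

def crosscov(x, y, tau, N, x_bar, y_bar):
    total = 0.0
    for n in range(N):
        if 0 <= n + tau < N:
            total += (x[n] - x_bar) * (y[n + tau] - y_bar)
    return total / N

# One wave rises while the other decays, so at tau = 0 their
# ac parts are negatively related:
print(crosscov(x, y, 0, N, x_bar, y_bar) < 0)  # True
```

This sign behavior is exactly what the correlation coefficient, discussed next, normalizes into the range −1.0 to +1.0.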
Correlation Coefficient
This is an important dimensionless number in statistics, in addition to those just considered. Its value lies between −1.0 and +1.0, and it is a measure of the "relatedness," in some sense (to be decided by the user), between two possibly noise-contaminated sequences x(n) and y(n). The value −1.0 means "negatively" related, +1.0 means "positively" related, |ρxy| = 1 means completely related one way or the other, and 0 means that x(n) and y(n) are completely unrelated (independent). The basic equation