Giannakis, G.B. "Cyclostationary Signal Analysis." In Digital Signal Processing Handbook, ed. Vijay K. Madisetti and Douglas B. Williams. Boca Raton: CRC Press LLC, 1999.
17 Cyclostationary Signal Analysis

Georgios B. Giannakis
University of Virginia

17.1 Introduction
17.2 Definitions, Properties, Representations
17.3 Estimation, Time-Frequency Links, Testing
    Estimating Cyclic Statistics • Links with Time-Frequency Representations • Testing for Cyclostationarity
17.4 CS Signals and CS-Inducing Operations
    Amplitude Modulation • Time Index Modulation • Fractional Sampling and Multivariate/Multirate Processing • Periodically Varying Systems
17.5 Application Areas
    CS Signal Extraction • Identification and Modeling
17.6 Concluding Remarks
Acknowledgments
References
17.1 Introduction
Processes encountered in statistical signal processing, communications, and time series analysis applications are often assumed stationary. The plethora of available algorithms testifies to the need for processing and spectral analysis of stationary signals (see, e.g., [42]). Due to the varying nature of physical phenomena and certain man-made operations, however, time-invariance and the related notion of stationarity are often violated in practice. Hence, the study of time-varying systems and nonstationary processes is well motivated.
Research in nonstationary signals and time-varying systems has led both to the development of adaptive algorithms and to several elegant tools, including short-time (or running) Fourier transforms, time-frequency representations such as the Wigner-Ville (a member of Cohen's class of distributions), Loeve's and Karhunen's expansions (leading to the notion of evolutionary spectra), and time-scale representations based on wavelet expansions (see [37, 45] and references therein). Adaptive algorithms derived from stationary models assume slow variations in the underlying system. On the other hand, time-frequency and time-scale representations promise applicability to general nonstationarities and provide useful visual cues for preprocessing. When it comes to nonstationary signal analysis and estimation in the presence of noise, however, they assume availability of multiple independent realizations.
In fact, it is impossible to perform spectral analysis, detection, and estimation tasks on signals involving generally unknown nonstationarities when only a single data record is available. For instance, consider extracting a deterministic signal s(n) observed in stationary noise v(n), using regression techniques based on nonstationary data x(n) = s(n) + v(n), n = 0, 1, ..., N − 1. Unless s(n) is finitely parameterized by a d_θs × 1 vector θ_s (with d_θs < N), the problem is ill-posed because adding a new datum, say x(n_0), adds a new unknown, s(n_0), to be determined. Thus, only structured nonstationarities can be handled when rapid variations are present; and only for classes of finitely parameterized nonstationary processes can reliable statistical descriptors be computed using a single time series. One such class is that of (wide-sense) cyclostationary processes, which are characterized by the periodicity they exhibit in their mean, correlation, or spectral descriptors.
An overview of cyclostationary signal analysis and applications are the main goals of this section. Periodicity is omnipresent in physical as well as man-made processes, and cyclostationary signals occur in various real-life problems entailing phenomena and operations of a repetitive nature: communications [15], geophysical and atmospheric sciences (hydrology [66], oceanography [14], meteorology [35], and climatology [4]), rotating machinery [43], econometrics [50], and biological systems [48].

In 1961 Gladysev [34] introduced key representations of cyclostationary time series, while in 1969 Hurd's thesis [38] offered an excellent introduction to continuous-time cyclostationary processes. Since 1975 [22], Gardner and co-workers have contributed to the theory of continuous-time cyclostationary signals, and especially their applications to communications engineering. Gardner [15] adopts a "non-probabilistic" viewpoint of cyclostationarity (see [19] for an overview and also [36] and [18] for comments on this approach). Responding to a recent interest in digital periodically varying systems and cyclostationary time series, the exposition here is probabilistic and focuses on discrete-time signals and systems, with emphasis on their second-order statistical characterization and their applications to signal processing and communications.
The material in the remaining sections is organized as follows: Section 17.2 provides definitions, properties, and representations of cyclostationary processes, along with their relations with stationary and general classes of nonstationary processes. Testing a time series for cyclostationarity and retrieval of possibly hidden cycles, along with single record estimation of cyclic statistics, are the subjects of Section 17.3. Typical signal classes and operations inducing cyclostationarity are delineated in Section 17.4 to motivate the key uses and selected applications described in Section 17.5. Finally, Section 17.6 concludes and presents trade-offs, topics not covered, and future directions.
17.2 Definitions, Properties, Representations
Let x(n) be a discrete-index random process (i.e., a time series) with mean µ_x(n) := E{x(n)} and covariance c_xx(n; τ) := E{[x(n) − µ_x(n)][x(n + τ) − µ_x(n + τ)]}. For x(n) complex valued, let also c̄_xx(n; τ) := c_{xx*}(n; τ), where * denotes complex conjugation, and n, τ are in the set of integers Z.
DEFINITION 17.1 Process x(n) is (wide-sense) cyclostationary (CS) iff there exists an integer P such that µ_x(n) = µ_x(n + lP), c_xx(n; τ) = c_xx(n + lP; τ), or c̄_xx(n; τ) = c̄_xx(n + lP; τ), ∀ n, l ∈ Z. The smallest of all such P's is called the period. Being periodic, they all accept Fourier Series expansions over complex harmonic cycles; e.g.,

c_xx(n; τ) = Σ_{k=0}^{P−1} C_xx(2πk/P; τ) e^{j(2πk/P)n},  C_xx(2πk/P; τ) = (1/P) Σ_{n=0}^{P−1} c_xx(n; τ) e^{−j(2πk/P)n},  (17.1)

with the set of cycles defined as A_xx^c(τ) := {α_k = 2πk/P : C_xx(α_k; τ) ≠ 0, k = 0, 1, ..., P − 1}.
The focus in engineering is on periodically and almost periodically correlated time series, since real data are often zero-mean, correlated, and with unknown distributions. Almost periodicity is very common in discrete time because sampling a continuous-time periodic process will rarely yield a discrete-time periodic signal; e.g., sampling cos(ω_c t + θ) every T_s seconds results in cos(ω_c n T_s + θ), for which an integer period exists only if ω_c T_s = 2π/P. Because 2π/(ω_c T_s) is in general only "almost an integer" period, such signals accept generalized (or limiting) Fourier expansions (see also Eq. (17.2) and [9] for rigorous definitions of almost periodic functions).
DEFINITION 17.2 Process x(n) is (wide-sense) almost cyclostationary (ACS) iff its mean and correlation(s) are almost periodic sequences. For x(n) zero-mean and real, the time-varying and cyclic correlations are defined as the generalized Fourier Series pair:

c_xx(n; τ) = Σ_{α_k ∈ A_xx^c(τ)} C_xx(α_k; τ) e^{jα_k n},  C_xx(α_k; τ) = lim_{N→∞} (1/N) Σ_{n=0}^{N−1} c_xx(n; τ) e^{−jα_k n}.  (17.2)

The set of cycles, A_xx^c(τ) := {α_k : C_xx(α_k; τ) ≠ 0, −π < α_k ≤ π}, must be countable, and the limit is assumed to exist at least in the mean-square sense [9, Thm. 1.15].
Definition 17.2 and Eq. (17.2) for ACS subsume the CS Definition 17.1 and Eq. (17.1); note that the latter require an integer period and a finite set of cycles. In the α-domain, ACS signals exhibit lines, but not necessarily at harmonically related cycles. The following example will illustrate the cyclic quantities defined thus far:
EXAMPLE 17.1: Harmonic in multiplicative and additive noise
Let

x(n) = s(n) cos(ω_0 n) + v(n),  (17.3)

where s(n), v(n) are assumed real, stationary, and mutually independent. Such signals appear when communicating through flat-fading channels, and with weather radar or sonar returns when, in addition to sensor noise v(n), backscattering, target scintillation, or fluctuating propagation media give rise to random amplitude variations modeled by s(n) [33]. We will consider two cases:

Case 1: µ_s ≠ 0. The mean in (17.3) is µ_x(n) = µ_s cos(ω_0 n) + µ_v, and the cyclic mean is

C_x(α) = (µ_s/2)[δ(α − ω_0) + δ(α + ω_0)] + µ_v δ(α).  (17.4)

Signal x(n) in (17.3) is thus (first-order) cyclostationary with set of cycles A_x^c = {±ω_0, 0}. If X_N(ω) := Σ_{n=0}^{N−1} x(n) exp(−jωn), then from (17.4) we find C_x(α) = lim_{N→∞} N^{−1} E{X_N(α)}; thus, the cyclic mean can be interpreted as an averaged DFT, and ω_0 can be retrieved by picking the peak of |X_N(ω)| for ω ≠ 0.
Case 2: µ_s = 0. From (17.3) we find the correlation c_xx(n; τ) = c_ss(τ)[cos(2ω_0 n + ω_0 τ) + cos(ω_0 τ)]/2 + c_vv(τ). Because c_xx(n; τ) is periodic in n, x(n) is (second-order) CS with cyclic correlation

C_xx(α; τ) = [(1/2)c_ss(τ) cos(ω_0 τ) + c_vv(τ)] δ(α) + (1/4)c_ss(τ)[e^{jω_0 τ} δ(α − 2ω_0) + e^{−jω_0 τ} δ(α + 2ω_0)].  (17.6)

The set of cycles is A_xx^c(τ) = {±2ω_0, 0}, provided that c_ss(τ) ≠ 0 and c_vv(τ) ≠ 0. The set A_xx^c(τ) is lag-dependent in the sense that some cycles may disappear while others may appear for different τ's. To illustrate the τ-dependence, let s(n) be an MA process of order q. Clearly, c_ss(τ) = 0 for |τ| > q, and thus A_xx^c(τ) = {0} for |τ| > q.
The CS process in (17.3) is just one example of signals involving products and sums of stationary processes such as s(n) with (almost) periodic deterministic sequences d(n), or CS processes x(n). For such signals, the following properties are useful:

Property 1 Finite sums and products of ACS signals are ACS. If x_i(n) is CS with period P_i, then for λ_i constants, y_1(n) := Σ_{i=1}^{I_1} λ_i x_i(n) and y_2(n) := Π_{i=1}^{I_2} λ_i x_i(n) are also CS. Unless cycle cancellations occur among the x_i(n) components, the period of y_1(n) and y_2(n) equals the least common multiple of the P_i's. Similarly, finite sums and products of stationary processes with deterministic (almost) periodic signals are also ACS processes.
As examples of random-deterministic mixtures, consider

x_1(n) = s(n) + d(n) and x_2(n) = s(n) d(n),  (17.7)

where s(n) is zero-mean, stationary, and d(n) is deterministic (almost) periodic with Fourier Series coefficients D(α). Time-varying correlations are, respectively,

c_{x1x1}(n; τ) = c_ss(τ) + d(n)d(n + τ) and c_{x2x2}(n; τ) = c_ss(τ) d(n)d(n + τ).  (17.8)

Both are (almost) periodic in n, with cyclic correlations

C_{x1x1}(α; τ) = c_ss(τ)δ(α) + D_2(α; τ) and C_{x2x2}(α; τ) = c_ss(τ) D_2(α; τ),  (17.9)

where D_2(α; τ) := Σ_β D(β)D(α − β) exp[j(α − β)τ], since the Fourier Series coefficients of the product d(n)d(n + τ) are given by the convolution of each component's coefficients in the α-domain. To reiterate the dependence on τ, notice that if d(n) is a periodic ±1 sequence, then c_{x2x2}(n; 0) = c_ss(0)d²(n) = c_ss(0), and hence periodicity disappears at τ = 0.
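The α-domain convolution behind D_2(α; τ) can be verified numerically. The sketch below (P, d(n), and τ are our illustrative choices) compares the Fourier Series coefficients of d(n)d(n + τ), computed directly, against the convolution-with-phase formula above:

```python
import numpy as np

rng = np.random.default_rng(1)
P, tau = 8, 3                                     # period and lag (our choices)
d = rng.standard_normal(P)                        # one period of d(n)
D = np.fft.fft(d) / P                             # FS coefficients D(2*pi*k/P)

E = np.fft.fft(d * np.roll(d, -tau)) / P          # FS coefficients of d(n)d(n+tau), direct
D2 = np.zeros(P, dtype=complex)
for k in range(P):
    for b in range(P):                            # convolution with tau-dependent phase
        D2[k] += D[b] * D[(k - b) % P] * np.exp(1j * 2 * np.pi * (k - b) * tau / P)
print(np.max(np.abs(E - D2)))                     # numerically zero: the two agree
```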
ACS signals appear often in nature with the underlying periodicity hidden, unknown, or inaccessible. In contrast, CS signals are often man-made and arise as a result of, e.g., oversampling (by a known integer factor P) digital communication signals, or of sampling a spatial waveform with P antennas (see also Section 17.4).
Both CS and ACS definitions could also be given in terms of the Fourier Transforms (τ → ω) of c_xx(n; τ) and C_xx(α; τ), namely the time-varying and the cyclic spectra, which we denote by S_xx(n; ω) and S_xx(α; ω). Suppose c_xx(n; τ) and C_xx(α; τ) are absolutely summable w.r.t. τ for all n and α; the spectra are then well defined as

S_xx(n; ω) := Σ_{τ=−∞}^{∞} c_xx(n; τ) e^{−jωτ} and S_xx(α; ω) := Σ_{τ=−∞}^{∞} C_xx(α; τ) e^{−jωτ}.  (17.10)

Absolute summability w.r.t. τ implies vanishing memory as the lag separation increases, and many real-life signals satisfy these so-called mixing conditions [5, Ch. 2]. Power signals are not absolutely summable, but it is possible to define cyclic spectra equivalently [for real-valued x(n)] as

S_xx(α; ω) = lim_{N→∞} N^{−1} E{X_N(ω) X_N(α − ω)},  X_N(ω) := Σ_{n=0}^{N−1} x(n) e^{−jωn};  (17.12)

for stationary x(n), the normalized correlation E{X_N(ω_1)X_N(ω_2)}/N vanishes asymptotically unless |ω_1 ± ω_2| = 0 (mod 2π) [5, Ch. 4]. Specifically, we have from (17.12) that:
Property 2 If x(n) is ACS or CS, the N-point Fourier transform X_N(ω_1) is correlated with X_N(ω_2) for |ω_1 ± ω_2| = α_k (mod 2π), with α_k ∈ A_xx^s.
Before dwelling further on the spectral characterization of ACS processes, it is useful to note the diversity of tools available for processing. Stationary signals are analyzed with time-invariant correlations (lag-domain analysis) or with power spectral densities (frequency-domain analysis). However, CS, ACS, and generally nonstationary signals entail four variables: (n, τ, α, ω) := (time, lag, cycle, frequency). Grouping two variables at a time, four domains of analysis become available, and their relationship is summarized in Fig. 17.1. Note that the pairs (n; τ) ↔ (α; τ) and (n; ω) ↔ (α; ω) have τ or ω fixed and are Fourier Series pairs, whereas (n; τ) ↔ (n; ω) and (α; τ) ↔ (α; ω) have n or α fixed and are related by Fourier Transforms.

FIGURE 17.1: Four domains for analyzing cyclostationary signals.

Further insight on the links between stationary and cyclostationary processes is gained through the uniform shift (or phase) randomization concept. Let x(n) be CS with period P, and define y(n) := x(n + θ), where θ is uniformly distributed in [0, P) and independent of x(n). With c_yy(n; τ) := E_θ{E_x[x(n + θ)x(n + τ + θ)]}, we find:

c_yy(n; τ) = (1/P) Σ_{i=0}^{P−1} c_xx(i; τ) = C_xx(0; τ),  (17.13)

which no longer depends on n; i.e., y(n) is stationary.
Such a mapping is often used with harmonic signals; e.g., x(n) = A exp[j(2πn/P + θ)] + v(n) is, according to Property 2, a CS signal, but it can be stationarized by uniform phase randomization. An alternative trick for stationarizing signals which involve complex harmonics is conjugation. Indeed, c_{xx*}(n; τ) = A² exp(−j2πτ/P) + c_vv(τ) is not a function of n. But why deal with CS or ACS processes if conjugation or phase randomization can render them stationary?
Revisiting Case 2 of Example 17.1 offers a partial answer when the goal is to estimate the frequency ω_0. Phase randomization of x(n) in (17.3) leads to a stationary y(n) with correlation found by substituting α = 0 in (17.6): c_yy(τ) = (1/2)c_ss(τ) cos(ω_0 τ) + c_vv(τ). This shows that if s(n) has multiple spectral peaks, or if s(n) is broadband, then multiple peaks or smearing of the spectral peak hamper estimation of ω_0 (in fact, it is impossible to estimate ω_0 from the spectrum of y(n) if s(n) is white). In contrast, picking the peak of C_xx(α; τ) in (17.6) yields ω_0, provided that ω_0 ∈ (0, π) so that spectral folding is prevented [33]. Equation (17.13) provides a more general answer: phase randomization restricts a CS process to only one cycle, namely α = 0. In other words, the cyclic correlation C_xx(α; τ) contains the "stationarized correlation" C_xx(0; τ) together with the additional cycles α_k ≠ 0; in the 2-D spectral domain, the support consists of lines parallel to the diagonal, crossing the axes at equidistant points 2π/P far apart from each other. More specifically, we have [34]:
2. Nonstationary processes with Fourier-transformable 2-D correlations are called harmonizable processes.
FIGURE 17.2: Support of the 2-D spectrum S_xx(ω_1, ω_2) for CS processes.
Property 5 A CS process with period P is a special case of a nonstationary (harmonizable²) process with 2-D spectral density given by

S_xx(ω_1, ω_2) = Σ_{k=0}^{P−1} S_xx(2πk/P; ω_1) δ_D(ω_2 − ω_1 + 2πk/P).  (17.15)

For stationary processes, only the k = 0 term survives in (17.15) and we obtain S_xx(ω_1, ω_2) = S_xx(0; ω_1) δ_D(ω_2 − ω_1); i.e., the spectral mass is concentrated on the diagonal of Fig. 17.2. The well-structured spectral support of CS processes will be used to test for the presence of cyclostationarity and to estimate the period P. Furthermore, the superposition of lines parallel to the diagonal hints towards representing CS processes as a superposition of stationary processes. Next we will examine two such representations introduced by Gladysev [34] (see also [22, 38, 49], and [56]).
We can uniquely write n_0 = nP + i and express x(n_0) = x(nP + i), where the remainder i takes values 0, 1, ..., P − 1. For each i, define the subprocess

x_i(n) := x(nP + i),  i = 0, 1, ..., P − 1.  (17.16)

In multirate processing, the P × 1 vector x(n) := [x_0(n) ... x_{P−1}(n)]′ constitutes the so-called polyphase decomposition of x(n) [51, Ch. 12]. As shown in Fig. 17.3, each x_i(n) is formed by downsampling an advanced copy of x(n). We maintain that the subprocesses {x_i(n)}_{i=0}^{P−1} are (jointly) stationary, and thus x(n) is vector stationary. Suppose for simplicity that E{x(n)} = 0, and start with E{x_{i1}(n) x_{i2}(n + τ)} = E{x(nP + i_1) x(nP + τP + i_2)} := c_xx(i_1 + nP; i_2 − i_1 + τP). Because x(n) is CS, we can drop nP, and c_xx becomes independent of n, establishing that x_{i1}(n), x_{i2}(n) are (jointly) stationary with correlation:

c_{x_{i1} x_{i2}}(τ) = c_xx(i_1; i_2 − i_1 + τP),  i_1, i_2 ∈ [0, P − 1].  (17.17)
FIGURE 17.3: Representation 1: (a) analysis, (b) synthesis.
Using (17.17), it can be shown that the auto- and cross-spectra of x_{i1}(n), x_{i2}(n) can be expressed in terms of the cyclic spectra of x(n), and conversely [56]; in particular,

S_xx(2πk/P; ω) = (1/P) Σ_{i1=0}^{P−1} Σ_{i2=0}^{P−1} S_{x_{i1} x_{i2}}(Pω) e^{jω(i_2 − i_1)} e^{−j(2π/P)k i_2}.  (17.19)

Based on (17.16) through (17.19), we infer that cyclostationary signals with period P can be analyzed as stationary P × 1 multichannel processes and vice versa. In summary, we have:
Representation 1 (Decimated Components) CS process x(n) can be represented as a P-variate stationary multichannel process x(n) with components x_i(n) = x(nP + i), i = 0, 1, ..., P − 1. Cyclic spectra and stationary auto- and cross-spectra are related as in (17.18) and (17.19).
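Representation 1 is straightforward to exercise in code. The toy model below (our own choice: x(n) = p(n)w(n) with p periodic and w white, so that c_xx(n; 0) = p(n)²) forms the P decimated components by reshaping and recovers the time-varying correlation at lag zero from subprocess sample averages:

```python
import numpy as np

rng = np.random.default_rng(2)
P, M = 4, 50_000                                  # period and samples per subprocess
p = np.array([2.0, 1.0, 1.0, 1.0])                # deterministic periodic amplitude
x = np.tile(p, M) * rng.standard_normal(M * P)    # toy CS process, c_xx(n; 0) = p(n)**2

X = x.reshape(M, P)                               # row n holds x(nP+0), ..., x(nP+P-1)
c_hat = (X * X).mean(axis=0)                      # sample c_xx(i; 0) via stationary x_i(n)
print(c_hat)                                      # ≈ p**2 = [4, 1, 1, 1]
```

Each column of X is one stationary subprocess x_i(n), so ordinary sample averaging applies even though x(n) itself is nonstationary.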
An alternative means of decomposing a CS process into stationary components is to split the (−π, π] spectral support of X_N(ω) into bands, each of width 2π/P [22]. As shown in Fig. 17.4, this can be accomplished by passing modulated copies of x(n) through an ideal low-pass filter H_0(ω) with spectral support (−π/P, π/P]. The resulting subprocesses x̄_m(n) can be shifted up in frequency and recombined to synthesize the CS process as x(n) = Σ_{m=0}^{P−1} x̄_m(n) exp(−j2πmn/P). Within each band, frequencies are separated by less than 2π/P and, according to Property 2, there is no correlation between the spectral components X̄_{m,N}(ω_1) and X̄_{m,N}(ω_2); hence, the x̄_m(n) components are stationary, with auto- and cross-spectra having nonzero support over −π/P < ω < π/P. They are related with the cyclic spectra of x(n) as in (17.20). In summary, we have:
FIGURE 17.4: Representation 2: (a) analysis, (b) synthesis.
Representation 2 (Subband Components) CS process x(n) can be represented as a superposition of P stationary subband components x̄_m(n), m = 0, 1, ..., P − 1; the auto- and cross-spectra of x̄_m(n) can be found from the cyclic spectra of x(n) as in (17.20).
Because ideal low-pass filters cannot be designed, the subband decomposition seems less practical. However, using Representation 1 and exploiting results from uniform DFT filter banks, it is possible, with FIR low-pass filters, to obtain stationary subband components (see, e.g., [51, Ch. 12]). We will not pursue this approach further, but Representation 1 will be used next for estimating time-varying correlations of CS processes based on a single data record.
17.3 Estimation, Time-Frequency Links, Testing
The time-varying and cyclic quantities introduced in (17.1), (17.2), and (17.10) through (17.12) entail ideal expectations (i.e., ensemble averages), and unless reliable estimators can be devised from finite (and often noisy) data records, their usefulness in practice is questionable. For stationary processes with (at least asymptotically) vanishing memory,³ sample correlations and spectral density estimators converge to their ensembles as the record length N → ∞. Constructing reliable (i.e., consistent) estimators for nonstationary processes, however, is challenging and generally impossible. Indeed, capturing time-variations calls for short observation windows, whereas variance reduction demands long records for sample averages to converge to their ensembles.
Fortunately, ACS and CS signals belong to the class of processes with "well-structured" time-variations that, under suitable mixing conditions, allow consistent single record estimators. The key is to note that although c_xx(n; τ) and S_xx(n; ω) are time-varying, they are expressed in terms of cyclic quantities, C_xx(α_k; τ) and S_xx(α_k; ω), which are time-invariant. Indeed, in (17.2) and (17.10) the time-variation is assigned to the Fourier basis.
3. Well-separated samples of such processes are asymptotically independent. Sufficient (so-called mixing) conditions include absolute summability of cumulants and are satisfied by many real-life signals (see [5, 12, Ch. 2]).
17.3.1 Estimating Cyclic Statistics

First we will consider ACS processes with known cycles α_k. Simpler estimators for CS processes and cycle estimation methods will be discussed later in the section. If x(n) has nonzero mean, we estimate the cyclic mean as in Example 17.1 using the normalized DFT: Ĉ_x(α_k) = N^{−1} Σ_{n=0}^{N−1} x(n) exp(−jα_k n). If the set of cycles is finite, we estimate the time-varying mean as ĉ_x(n) = Σ_{α_k} Ĉ_x(α_k) exp(jα_k n). Similarly, for zero-mean ACS processes we estimate first cyclic and then time-varying correlations using:

Ĉ_xx(α_k; τ) = N^{−1} Σ_{n=0}^{N−1} x(n) x(n + τ) exp(−jα_k n),  ĉ_xx(n; τ) = Σ_{α_k} Ĉ_xx(α_k; τ) exp(jα_k n).  (17.21)

Note that Ĉ_xx can be computed efficiently using the FFT of the product x(n)x(n + τ).
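The FFT shortcut can be sketched as follows (assuming NumPy; the amplitude-modulated test signal reuses Example 17.1, and the parameters are ours): a single FFT of the lag product evaluates Ĉ_xx(α_k; τ) on the whole FFT cycle grid at once, and peak-picking away from α = 0 recovers the cycle 2ω_0:

```python
import numpy as np

rng = np.random.default_rng(3)
N, w0, tau = 16384, 2 * np.pi * 0.1, 0
n = np.arange(N)
x = rng.standard_normal(N) * np.cos(w0 * n) + 0.3 * rng.standard_normal(N)

prod = x[: N - tau] * x[tau:]                     # lag product x(n) x(n+tau)
C = np.fft.fft(prod) / len(prod)                  # C_hat(2*pi*k/N; tau) for all k at once
alphas = 2 * np.pi * np.arange(len(prod)) / len(prod)

half = (alphas > 0.1) & (alphas < np.pi)          # search away from the alpha = 0 cycle
alpha_hat = alphas[half][np.argmax(np.abs(C[half]))]
print(alpha_hat, 2 * w0)                          # strongest cycle lands near 2*w0
```

Zero-padding the FFT refines the cycle grid when the true cycle falls between bins.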
For cyclic spectral estimation, two options are available: (1) smoothed cyclic periodograms and (2) smoothed cyclic correlograms. The first is motivated by (17.12) and smooths the cyclic periodogram, I_xx(α; ω) := N^{−1} X_N(ω) X_N(α − ω), using a frequency-domain window W(ω). The second follows (17.2) and Fourier transforms Ĉ_xx(α; τ) after smoothing it by a lag-window w(τ) with support τ ∈ [−M, M]. Either one of the resulting estimates can be shown to be consistent under suitable mixing conditions (see, e.g., [5, 12, 24, 39] and references therein).
When x(n) is CS with known integer period P, estimation of time-varying correlations and spectra becomes easier. Recall that, thanks to Representations 1 and 2, not only c_xx(n; τ) and S_xx(n; ω) but the process x(n) itself can be analyzed into P stationary components. Starting with (17.16), it can be shown that c_xx(i; τ) = c_{x_i x_{i+τ}}(0), where i = 0, 1, ..., P − 1 and the subscript i + τ is understood mod(P). Because the subprocesses x_i(n) and x_{i+τ}(n) are stationary, their cross-covariances can be estimated consistently using sample averaging; hence, the time-varying correlation can be estimated as:

ĉ_xx(i; τ) = [N/P]^{−1} Σ_{n=0}^{[N/P]−1} x_i(n) x_{i+τ}(n) = [N/P]^{−1} Σ_{n=0}^{[N/P]−1} x(nP + i) x(nP + i + τ),  (17.25)

where the integer part [N/P] denotes the number of samples per subprocess x_i(n), and the last equality follows from the definition of x_i(n) in Representation 1. Similarly, the time-varying periodogram can be estimated using I_xx(n; ω) = P^{−1} Σ_{k=0}^{P−1} X_N(ω) X_N(2πk/P − ω) exp(j2πkn/P), and then smoothed to obtain a consistent estimate of S_xx(n; ω).
17.3.2 Links with Time-Frequency Representations
Consistency (and hence reliability) of single record estimates is a notable difference between cyclostationary and time-frequency signal analyses. Short-time Fourier transforms, the Wigner-Ville, and derivative representations are valuable exploratory (and especially graphical) tools for analyzing nonstationary signals. They promise applicability to general nonstationarities, but unless slow variations are present and multiple independent data records are available, their usefulness in estimation tasks is rather limited. In contrast, ACS analysis deals with a specific type of structured variation, namely (almost) periodicity, but allows for rapid variations and consistent single record sample estimates. Intuitively speaking, cyclostationarity provides, within a single record, multiple periods that can be viewed as "multiple realizations." Interestingly, for ACS processes there is a close relationship between the normalized asymmetric ambiguity function A(α; τ) [37] and the sample cyclic correlation in (17.21): A(α; τ) = N^{−1} Σ_{n=0}^{N−1} x(n) x(n + τ) exp(−jαn) = Ĉ_xx(α; τ); likewise, the sample time-varying correlation corresponds to the distribution W(n; ω) := Σ_{τ=−(N−1)}^{N−1} x(n) x(n + τ) exp(−jωτ). In fact, the aforementioned equivalences and the consistency results of [12] establish that ambiguity and Wigner-Ville processing of ACS signals is reliable even when only a single data record is available. The following example uses a chirp signal to stress this point and shows how some of our sample estimates can be extended to complex processes.
EXAMPLE 17.2: Chirp in multiplicative and additive noise
Consider x(n) = s(n) exp(jω_0 n²) + v(n), where s(n), v(n) are zero-mean, stationary, and mutually independent; c_xx(n; τ) is nonperiodic for almost every ω_0, and hence x(n) is not (second-order) ACS. Even when E{s(n)} ≠ 0, E{x(n)} is also nonperiodic, implying that x(n) is not first-order ACS either. However,

c̃_{xx*}(n; τ) := c_{xx*}(n + τ; −2τ) := E{x(n + τ) x*(n − τ)}

exhibits (almost) periodicity, and its cyclic correlation is given by C̃_{xx*}(α; τ) = c_ss(2τ) δ(α − 4ω_0 τ) + c_{vv*}(2τ) δ(α). Assuming c_ss(2τ) ≠ 0, the latter allows evaluation of ω_0 by picking the peak of the sample cyclic correlation magnitude evaluated at, e.g., τ = 1, as follows:

ˆC̃_{xx*}(α; τ) = N^{−1} Σ_n x(n + τ) x*(n − τ) exp(−jαn),  ω̂_0 = (1/4τ) arg max_{α≠0} |ˆC̃_{xx*}(α; τ)|.  (17.27)

The ˆC̃_{xx*}(α; τ) estimate in (17.27) is nothing but the symmetric ambiguity function. Because x(n) is ACS in this sense, ˆC̃_{xx*} can be shown to be consistent. This provides yet one more reason for the success of time-frequency representations with chirp signals. Interestingly, (17.27) shows that exploitation of cyclostationarity allows not only for additive noise tolerance [by avoiding the α = 0 cycle in (17.27)], but also permits parameter estimation of chirps modulated by stationary multiplicative noise s(n).
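A hedged sketch of the chirp estimator follows (all parameters, the zero-padding amount, and the choice of a nonzero-mean s(n), which keeps c_ss(2τ) large, are our own simplifications): form the symmetric lag product, FFT it over n, and divide the peak cycle by 4τ:

```python
import numpy as np

rng = np.random.default_rng(4)
N, w0, tau = 8192, 1.2e-4, 16
n = np.arange(N)
s = 1.0 + 0.3 * rng.standard_normal(N)            # multiplicative noise; c_ss(2*tau) != 0
v = 0.3 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
x = s * np.exp(1j * w0 * n**2) + v                # chirp in multiplicative and additive noise

prod = x[2 * tau:] * np.conj(x[: N - 2 * tau])    # x(n+tau) x*(n-tau), re-indexed
C = np.fft.fft(prod, 1 << 16) / len(prod)         # zero-padded scan over cycles alpha
alphas = 2 * np.pi * np.arange(1 << 16) / (1 << 16)
mask = (alphas > 2e-3) & (alphas < np.pi)         # avoid the alpha = 0 (additive-noise) cycle
w0_hat = alphas[mask][np.argmax(np.abs(C[mask]))] / (4 * tau)
print(w0_hat, w0)
```

The peak sits at α = 4ω_0τ, so larger τ trades cycle resolution against the unambiguous range 4ω_0τ < π.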
17.3.3 Testing for Cyclostationarity
In certain applications involving man-made (e.g., communication) signals, presence of cyclostationarity and knowledge of the cycles is assured by design (e.g., baud rates or oversampling factors). In other cases, however, only a time series {x(n)}_{n=0}^{N−1} is given and two questions arise: How does one detect cyclostationarity, and, if x(n) is confirmed to be CS of a certain order, how does one estimate the cycles present? The former is addressed by testing hypotheses of nonzero Ĉ_x(α_k), Ĉ_xx(α_k; τ), or Ŝ_xx(α_k; ω) over a fine cycle-frequency grid obtained by sufficient zero-padding prior to taking the FFT.
Specifically, to test whether x(n) exhibits cyclostationarity in {Ĉ_xx(α; τ_l)}_{l=1}^{L} for at least one lag, we form the 2L × 1 vector ĉ_xx(α) := [Ĉ_xx^R(α; τ_1) ... Ĉ_xx^R(α; τ_L), Ĉ_xx^I(α; τ_1) ... Ĉ_xx^I(α; τ_L)]′, where superscript R (I) denotes the real (imaginary) part. Similarly, we define the ensemble vector c_xx(α) and the error e_xx(α) := ĉ_xx(α) − c_xx(α). For N large, it is known that √N e_xx(α) is Gaussian with pdf N(0, Σ_c). An estimate Σ̂_c of the asymptotic covariance can be computed from the data [12]. If α is not a cycle for all {τ_l}_{l=1}^{L}, then c_xx(α) ≡ 0, the error e_xx(α) = ĉ_xx(α) will have zero mean, and D̂_c²(α) := ĉ_xx′(α) Σ̂_c†(α) ĉ_xx(α) will be central chi-square distributed. For a given false-alarm rate, we find from χ² tables a threshold Γ and declare α a cycle whenever D̂_c²(α) exceeds Γ [10].
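A heavily simplified sketch of such a test is given below (the segment-based covariance estimate stands in for the asymptotic-covariance estimator of [12], and the period, lags, and signal model are our illustrative assumptions): the quadratic statistic is large at a true cycle and of chi-square size elsewhere:

```python
import numpy as np

rng = np.random.default_rng(5)
P, N, S = 8, 32768, 32                            # period, samples, segments
x = np.tile(np.array([2.0, 1, 1, 1, 1, 1, 1, 1]), N // P) * rng.standard_normal(N)

def c_vec(seg, alpha, taus):
    """Stacked [Re, Im] of C_hat(alpha; tau_l) over one segment."""
    out = []
    for t in taus:
        n = np.arange(len(seg) - t)
        out.append(np.mean(seg[: len(seg) - t] * seg[t:] * np.exp(-1j * alpha * n)))
    out = np.array(out)
    return np.concatenate([out.real, out.imag])

def D2(x, alpha, taus=(0, 1)):
    segs = np.split(x, S)
    V = np.array([c_vec(s, alpha, taus) for s in segs])  # per-segment estimates
    c_bar = V.mean(axis=0)
    Sigma = np.cov(V.T) / S                              # covariance of the mean
    return float(c_bar @ np.linalg.pinv(Sigma) @ c_bar)

print(D2(x, 2 * np.pi / P), D2(x, 1.37 * 2 * np.pi / P))  # cycle vs. non-cycle alpha
```

The first statistic (at the true cycle 2π/P) is orders of magnitude above the chi-square range of the second, so any reasonable χ² threshold separates them.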
EXAMPLE 17.3: Cyclostationarity test
Consider x(n) = s_1(n) cos(πn/8) + s_2(n) cos(πn/4) + v(n), with s_1(n), s_2(n), and v(n) zero-mean, Gaussian, and mutually independent. To test for cyclostationarity and retrieve the possible periods present, N = 2,048 samples were generated; s_1(n) and s_2(n) were simulated as AR(1) with variances σ_{s1}² = σ_{s2}² = 2, while v(n) was white with variance σ_v² = 0.1. Figure 17.5a shows |Ĉ_xx(α; 0)| peaking at α = ±2(π/8), ±2(π/4), 0, as expected, while Fig. 17.5b depicts ρ_xx(ω_1, ω_2) computed as in (17.29) with M = 64. The parallel lines in Fig. 17.5b are seen at |ω_1 − ω_2| = 0, π/8, π/4, revealing the periods present. One can easily verify from (17.11) that C_xx(α; 0) = (2π)^{−1} ∫_{−π}^{π} S_xx(α; ω) dω; thus, although |Ĉ_xx(α; 0)| can be computed simply using the FFT of x²(n), ρ_xx(ω_1, ω_2) is generally more informative.
Because cyclostationarity is lag-dependent, as an alternative to ρ_xx(ω_1, ω_2) one can also plot |Ĉ_xx(α; τ)| or |Ŝ_xx(α; ω)| for all τ or ω. Figures 17.6 and 17.7 show perspective and contour plots of |Ĉ_xx(α; τ)| for τ ∈ [−31, 31] and |Ŝ_xx(α; ω)| for ω ∈ (−π, π], respectively. Both sets exhibit planes (lines) parallel to the τ-axis and ω-axis, respectively, at cycles α = ±2(π/8), ±2(π/4), 0, as expected.

FIGURE 17.5: (a) Cyclic cross-correlation C_xx(α; 0), and (b) coherence ρ_xx(ω_1, ω_2) (Example 17.3).
FIGURE 17.6: Cycle detection and estimation (Example 17.3): 3D and contour plots of Ĉ_xx(α; τ).
17.4 CS Signals and CS-Inducing Operations
FIGURE 17.7: Cycle detection and estimation (Example 17.3): 3D and contour plots of Ŝ_xx(α; ω).

We have already seen in Examples 17.1 and 17.2 that amplitude or index transformations of a repetitive nature give rise to one class of CS signals. A second category consists of outputs of repetitive (e.g., periodically varying) systems excited by CS or even stationary inputs. Finally, it is possible to have cyclostationarity emerging in the output due to the data acquisition process (e.g., multiple sensors or fractional sampling).
17.4.1 Amplitude Modulation
General examples in this class include the signals x_1(n) and x_2(n) of (17.7), or their combinations as described by Property 1. More specifically, we will focus on communication signals where random (often i.i.d.) information data w(n) are D/A converted with symbol period T_0 to obtain the process w_c(t) = Σ_l w(l) δ_D(t − lT_0), which is CS in the continuous variable t. The continuous-time signal w_c(t) is subsequently pulse shaped by the transmit filter h_c^{(tr)}(t), modulated with the carrier exp(jω_c t), and transmitted over the linear time-invariant (LTI) channel h_c^{(ch)}(t). On reception, the carrier is removed and the data are passed through the receive filter h_c^{(rec)}(t) to suppress stationary additive noise. Defining the composite channel h_c(t) := h_c^{(tr)} ⋆ h_c^{(ch)} ⋆ h_c^{(rec)}(t), the received baseband signal, sampled at the symbol rate, can be written as

r(n) = exp(jω_e n) Σ_l w(l) h(n − l) + v(n) =: exp(jω_e n) x(n) + v(n),  (17.31)

where h(n) denotes the sampled composite channel, v(n) the filtered noise, and ω_e the residual carrier offset (frequency error).
If ω_e = 0, x(n) (and thus r(n)) is stationary, whereas ω_e ≠ 0 renders r(n) similar to the ACS signal in Example 17.1. When w(n) is zero-mean, i.i.d., and complex symmetric, we have E{w(n)} ≡ 0 and E{w(n)w(n + τ)} ≡ 0, so neither first-order nor (unconjugated) second-order cyclic statistics reveal ω_e. However, peak-picking the cyclic fourth-order correlation [the Fourier coefficients of r⁴(n)] yields 4ω_e uniquely, provided ω_e < π/4. If E{w⁴(n)} ≡ 0, higher powers can be used to estimate and recover ω_e.
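A sketch of the fourth-power recipe follows (assuming a flat channel h(n) = δ(n), i.i.d. QPSK symbols, and our own parameter choices): since w⁴(n) is constant for QPSK, r⁴(n) carries a spectral line at 4ω_e, and a zero-padded FFT locates it:

```python
import numpy as np

rng = np.random.default_rng(6)
N = 4096
we = 2 * np.pi * 200 / (1 << 16)                  # residual offset, well below pi/4
w = np.exp(1j * (np.pi / 4 + (np.pi / 2) * rng.integers(0, 4, N)))  # QPSK: w**4 = -1
v = 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
r = w * np.exp(1j * we * np.arange(N)) + v        # flat-channel baseband with offset

R4 = np.fft.fft(r**4, 1 << 16)                    # zero-padded cyclic-mean scan of r^4(n)
we_hat = 2 * np.pi * np.argmax(np.abs(R4)) / (1 << 16) / 4
print(we_hat, we)                                 # spectral line at 4*we recovers the offset
```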
Having estimated ω_e, we form exp(−jω_e n) r(n) in order to demodulate the signal in (17.31). Traditionally, cyclostationarity is removed from the discrete-time information signal, although it may be useful for other purposes (e.g., blind channel estimation) to retain cyclostationarity in the baseband signal x(n). This can be accomplished by multiplying w(n) with a P-periodic sequence p(n) prior to pulse shaping. The noise-free signal in this case is x(n) = Σ_l p(l)w(l)h(n − l), and it has correlation c̄_xx(n; τ) = σ_w² Σ_l |p(n − l)|² h(l)h*(l + τ), which is periodic with period P. Cyclic correlations and spectra are given in [28] in terms of the quantities P_2(α) := Σ_{m=0}^{P−1} |p(m)|² exp(−jαm) and H(ω) := Σ_{l=0}^{L} h(l) exp(−jωl). As we will see later in this section, cyclostationarity can also be introduced at the transmitter using multirate operations, or at the receiver by fractional sampling. With a CS input, the channel h(n) can be identified using noisy output samples only [28, 64, 65], an important step towards blind equalization of (e.g., multipath) communication channels.
A multiplicative model x(n) = p(n)s(n) + v(n), with p(n) periodic, can also be used to model systematically missing observations. Periodically, the stationary signal s(n) is observed in noise v(n) for P_1 samples and then disappears for the next P − P_1 data. Using C_xx(α; τ) = P_2(α; τ) c_ss(τ), the period P [and thus P_2(α; τ)] can be determined. Subsequently, c_ss(τ) can be retrieved and used for parametric or nonparametric spectral analysis of s(n); see [32] and references therein.
17.4.2 Time Index Modulation
Suppose that a random CS signal s(n) is delayed by D samples and received in zero-mean stationary noise v(n) as x(n) = s(n − D) + v(n). With s(n) independent of v(n), the cyclic correlation is C_xx(α; τ) = C_ss(α; τ) exp(jαD) + δ(α) c_vv(τ), and the delay manifests itself as the phase of a complex exponential. But even when s(n) models a narrowband deterministic signal, the delay appears in the exponent, since s(n − D(n)) ≈ s(n) exp(jD(n)) [53]. Time-delay estimation of CS signals appears frequently in sonar and radar for range estimation, where D(n) = νn and ν denotes the velocity of propagation. D(n) is also used to model Doppler effects that appear when relative motion is present. Note that with time-varying (e.g., accelerating) motion we have D(n) = γn², and cyclostationarity appears in the complex correlation as explained in Example 17.2.
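The phase-of-a-cycle view of delay suggests a simple estimator, sketched below under our own toy model (periodically modulated white noise; the sign of the phase depends on the correlation convention, so a minus sign appears with the estimator used here):

```python
import numpy as np

rng = np.random.default_rng(7)
P, D, N = 8, 3, 1 << 16
p = 1.0 + 0.8 * np.cos(2 * np.pi * np.arange(P) / P)   # makes C_ss(2*pi/P; 0) real, > 0
s = np.tile(p, N // P) * rng.standard_normal(N)        # CS signal s(n), period P
x = np.r_[np.zeros(D), s[: N - D]] + 0.3 * rng.standard_normal(N)  # x(n) = s(n-D) + v(n)

alpha = 2 * np.pi / P
C = np.mean(x**2 * np.exp(-1j * alpha * np.arange(N)))  # C_hat_xx(alpha; 0)
D_hat = (-np.angle(C) / alpha) % P                      # phase -> delay (alpha*D < pi here)
print(D_hat, D)
```

The additive noise contributes only to the α = 0 cycle, so the phase at α = 2π/P is noise-tolerant; delays with αD ≥ π would need several cycles or lags to resolve the ambiguity.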
Polynomial delays are one form of time-scale transformations. Another one is d(n) = λn + p(n), where λ is a constant and p(n) is periodic with period P (e.g., [38]). For stationary s(n), the signal x(n) = s[d(n)] is CS because c_xx(n + lP; τ) = c_ss[d(n + lP + τ) − d(n + lP)] = c_ss[λτ + p(n) − p(n + τ)] = c_xx(n; τ). A special case is the familiar FM model with d(n) = ω_c n + h sin(ω_0 n), where h here denotes the modulation index. The signal and its periodically varying correlation are given by:

x(n) = A exp{j[ω_c n + h sin(ω_0 n)]},  c_{xx*}(n; τ) = A² exp(−jω_c τ) exp{jh[sin(ω_0 n) − sin(ω_0 (n + τ))]},

which is (almost) periodic in n. In addition to communications, frequency-modulated signals appear in sonar and radar when rotating and vibrating objects (e.g., propellers or helicopter blades) induce periodic variations in the phase of incident narrowband waveforms [2, 67].