

Lecture 2

Stochastic Processes

Spectral analysis is the study of a class of models called stationary stochastic processes.

A stochastic process {X(t) : t ∈ T} is a family of rv's indexed by a variable t, where t ranges over an index set T, which may be infinite:

continuous time → X(t)

discrete time → X_t

These may be real, vector valued, or complex, with suitable added indices.

We will use the Riemann-Stieltjes notation in what follows because mixed continuous-discrete distributions are common for time series, and hence the R-S notation is standard in stochastic theory. Let g(x) and H(x) be real valued functions on [L, U], where L < U and L, U may be −∞, ∞ with suitable limiting processes. Let P_N be a partition of [L, U] into N intervals:

$$L = x_0 < x_1 < \cdots < x_N = U$$

Define the mesh fineness:

$$|P_N| = \max\{x_1 - x_0,\; x_2 - x_1,\; \ldots,\; x_N - x_{N-1}\}$$


Then

$$\int_L^U g(x)\,dH(x) = \lim_{|P_N| \to 0} \sum_{j=1}^{N} g(x'_j)\,[H(x_j) - H(x_{j-1})]$$

where x'_j ∈ [x_{j−1}, x_j]. There are 3 cases:

1. If H(x) = x, then we have the Riemann integral $\int_L^U g(x)\,dx$.

2. If H(x) is continuously differentiable on [L, U] with h(x) = ∂_x H(x), then

$$\int_L^U g(x)\,dH(x) = \int_L^U g(x)\,h(x)\,dx$$

3. If H(x) undergoes step changes of size b_i at points a_i on [L, U] (for a single step at a_i: H(x) = c_i for L ≤ x < a_i, and H(x) = c_i + b_i for a_i ≤ x ≤ U), then

$$\int_L^U g(x)\,dH(x) = \sum_i b_i\, g(a_i)$$
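To make the three cases concrete, here is a minimal numerical sketch (Python/NumPy, not part of the original notes; the choices of g, H, and the grid are illustrative) that approximates the partition sum for a smooth H and for a step H:

```python
import numpy as np

def rs_integral(g, H, L, U, N=100000):
    """Approximate int_L^U g(x) dH(x) by the partition sum
    sum_j g(x_j) [H(x_j) - H(x_{j-1})] on a uniform grid,
    evaluating g at the right endpoint of each interval."""
    x = np.linspace(L, U, N + 1)
    return np.sum(g(x[1:]) * np.diff(H(x)))

g = lambda x: x ** 2

# Case 1: H(x) = x, so dH = dx and we recover the Riemann integral.
print(rs_integral(g, lambda x: x, 0.0, 1.0))   # ~ 1/3

# Case 3: H steps by b_i = 0.5 at a_i = 0.25 and a_i = 0.75;
# the integral collapses to sum_i b_i g(a_i).
H_step = lambda x: 0.5 * (x >= 0.25) + 0.5 * (x >= 0.75)
print(rs_integral(g, H_step, 0.0, 1.0))        # ~ 0.5*(0.25**2 + 0.75**2)
```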

Example 1. For a continuous process, we have f(x) = ∂_x F(x) and hence:

$$E[X] = \int_{-\infty}^{\infty} x\,dF(x) = \int_{-\infty}^{\infty} x f(x)\,dx$$

Example 2. For a discrete process where the cdf F(x) undergoes discrete jumps of size 1/N at a set of values {x_i}:

$$E[X] = \int_{-\infty}^{\infty} x\,dF(x) = \frac{1}{N} \sum_{i=1}^{N} x_i$$

Example 3. For a fixed value of t, X_t is an rv and hence has a cdf, where

$$F_t(a) = P[X_t \le a]$$


with

$$E[X_t] = \int_{-\infty}^{\infty} x\,dF_t(x) = \mu_t, \qquad \mathrm{var}[X_t] = \int_{-\infty}^{\infty} (x - \mu_t)^2\,dF_t(x) = \sigma_t^2$$

Note that the statistics become time dependent. We also may need higher order cdf's, like the bivariate cdf for two times:

$$F_{t_1,t_2}(a_1, a_2) = P[X_{t_1} \le a_1,\; X_{t_2} \le a_2]$$

and the N dimensional generalization:

$$F_{t_1,\ldots,t_N}(a_1, \ldots, a_N) = P[X_{t_1} \le a_1, \ldots, X_{t_N} \le a_N]$$

The set of cdf's from F_t to F_{t_1,…,t_N} is a complete description of the stochastic process if we know them for all t and N. However, the result is a mess and the distributions are unknowable in practice.

We can start to narrow this down by considering stationary processes: those whose statistical properties are independent of time, i.e. physical systems in steady state.

If {X_t} is the result of a stationary process, then each element must have the same cdf, so F_t(x) → F(x). Any pair of elements in {X_t} must have the same bivariate distribution, etc. In summary, the joint cdf of {X_t} for a set of N time points {t_i} must be unaltered by time shifts.

There are several cases of stationarity:

Complete stationarity

If the joint cdf of {X_{t_1}, …, X_{t_N}} is identical to that of {X_{t_1+k}, …, X_{t_N+k}} for any shift k, then the process is completely stationary. All of the statistical structure is unchanged under shifts in the time origin. This is a severe requirement and rarely establishable in practice.

Stationarity of order 1


E[X_t] = µ for all t. No other stationarity is implied.

Stationarity of order 2

E[X_t] = µ and E[X_t²] = µ₂, so that the mean and the variance are time independent.

E[X_t X_s] is a function of |t − s| only, and hence cov[X_t, X_s] is a function of |t − s| only.

This class is called weakly stationary or second order stationary, and is the most important type of stochastic process for our purposes.

For a second order stationary process, we define the autocovariance sequence (acvs) by

$$S_\tau = \mathrm{cov}[X_t, X_{t+\tau}] = \mathrm{cov}[X_0, X_\tau]$$

This is a measure of the covariance between members of the process separated by τ time units; τ is called the lag. We would expect S_τ to be largest at τ = 0 and to be symmetric about the origin:

1. S_0 = σ²
2. S_{−τ} = S_τ (even function)
3. |S_τ| ≤ S_0 for τ > 0
4. S_τ is positive semidefinite:

$$\sum_{j=1}^{N} \sum_{k=1}^{N} S_{t_j - t_k}\, a_j a_k \ge 0 \quad \text{for } \{a_1, \ldots, a_N\} \in \mathbb{R},$$

or in matrix form $\vec{a}^{\,T} \Sigma\, \vec{a} \ge 0$, where Σ is the covariance matrix.
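Property 4 can be checked numerically. The sketch below is an illustration only: it assumes an example acvs S_τ = σ²a^|τ| (the asymptotic AR(1) form that appears later in these notes), builds the covariance matrix Σ, and verifies that a^T Σ a ≥ 0:

```python
import numpy as np

# Assumed example acvs: S_tau = sigma^2 * a**|tau| (illustrative parameters).
sigma2, a, N = 1.0, 0.6, 8
S = sigma2 * a ** np.arange(N)                  # S_0, S_1, ..., S_{N-1}

idx = np.arange(N)
Sigma = S[np.abs(idx[:, None] - idx[None, :])]  # Sigma[j, k] = S_{|t_j - t_k|}

rng = np.random.default_rng(0)
for _ in range(1000):
    avec = rng.standard_normal(N)
    assert avec @ Sigma @ avec >= -1e-12        # a^T Sigma a >= 0

print(np.linalg.eigvalsh(Sigma).min())          # smallest eigenvalue is >= 0
```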

The autocorrelation sequence is the acvs normalized to S_0:

$$\rho_\tau = \frac{S_\tau}{S_0}$$

and has properties:

1. ρ_0 = 1
2. ρ_{−τ} = ρ_τ (even function)
3. |ρ_τ| ≤ 1 for τ > 0
4. ρ_τ is positive semidefinite
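As a sketch of how these quantities are estimated in practice (using the common biased estimator, which these notes do not specify; the white noise input is an illustrative choice):

```python
import numpy as np

def sample_acvs(x, max_lag):
    """Biased sample acvs: (1/N) * sum_t (x_t - m)(x_{t+tau} - m)."""
    x = np.asarray(x, dtype=float)
    N, m = len(x), x.mean()
    d = x - m
    return np.array([np.dot(d[:N - tau], d[tau:]) / N
                     for tau in range(max_lag + 1)])

rng = np.random.default_rng(1)
x = rng.standard_normal(100000)   # a single white noise realization
S = sample_acvs(x, 5)
print(S / S[0])                   # acs rho_tau: ~ [1, 0, 0, 0, 0, 0]
```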

Note that a completely stationary process is also second order stationary, but second order stationarity does not imply complete stationarity. However, if the process is Gaussian (i.e., the joint cdf's of the rv's are multivariate normal), then second order stationarity does imply complete stationarity, because a Gaussian distribution is completely specified by its first and second moments.

All of this machinery extends to complex processes. Let Z_t = X_{t,1} + iX_{t,2}. This is second order stationary if all of the joint first and second order moments of X_{t,1} and X_{t,2} exist, are finite, and are invariant to shifts in time. This implies that X_{t,1} and X_{t,2} are themselves second order stationary. We have

$$E[Z_t] = \mu_1 + i\mu_2 = \mu, \qquad \mathrm{cov}[Z_{t_1}, Z_{t_2}] = E[(Z_{t_1} - \mu)^*(Z_{t_2} - \mu)] = S_\tau,$$

and hence S_{−τ} = S_τ^* for a complex process.

Let {X_t} be a sequence of uncorrelated rv's such that

$$E[X_t] = \mu, \qquad \mathrm{var}[X_t] = \sigma^2, \qquad \mathrm{cov}[X_t, X_{t+\tau}] = 0 \;\; (\tau \ne 0, \text{ follows from uncorrelatedness})$$


Then {X_t} is stationary with acvs

$$S_\tau = \begin{cases} \sigma^2, & \tau = 0; \\ 0, & \tau \ne 0. \end{cases}$$

Note that a sequence of uncorrelated rv's is not necessarily independent, but independence does imply uncorrelatedness. Independence implies that the joint cdf may be factored into the product of individual cdf's, and we have not applied this condition. The exception to these statements is a Gaussian process, where uncorrelatedness does imply independence.
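The distinction matters in practice. The sketch below (an illustrative ARCH-type recursion, not from these notes) constructs a sequence that is uncorrelated yet clearly not independent, since its squares are correlated:

```python
import numpy as np

# ARCH(1)-type sequence: x_t = eps_t * sqrt(1 + 0.5 * x_{t-1}^2).
# It is uncorrelated (a martingale difference) but NOT independent.
rng = np.random.default_rng(2)
N = 100000
eps = rng.standard_normal(N)
x = np.zeros(N)
for t in range(1, N):
    x[t] = eps[t] * np.sqrt(1.0 + 0.5 * x[t - 1] ** 2)

def lag1_corr(y):
    """Sample lag-1 autocorrelation."""
    y = y - y.mean()
    return np.mean(y[:-1] * y[1:]) / np.mean(y * y)

print(lag1_corr(x))       # ~ 0: uncorrelated at lag 1
print(lag1_corr(x ** 2))  # clearly nonzero: hence not independent
```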

A random or white noise process is a process without memory: one datum does not depend on any other.

Example 4. Consider a particle of unit mass moving in a straight line and subject to a random force. Let X_t denote the particle velocity at time t and ε_t denote the random force per unit mass acting on it. Then, if the resistive force is proportional to velocity, Newton's laws give

$$\dot{X}_t = \epsilon_t - \alpha X_t$$

Approximating $\dot{X}_t \approx X_t - X_{t-1}$, we get:

$$X_t = \frac{1}{1+\alpha}\,(\epsilon_t + X_{t-1}) = \epsilon'_t + \alpha' X_{t-1}$$

This is a first order AR process, where the value of the rv at time t depends on that at time t − 1 but not at earlier times:

$$X_t - a X_{t-1} = \epsilon_t$$


where a is a constant and {ε_t} is random. This is analogous to linear regression, with X_t depending linearly on X_{t−1} and ε_t being the residual, hence the term “autoregressive”.
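Returning to Example 4, here is a short sketch of that discretization step (a unit time step, an illustrative α, and standard normal ε_t are all assumptions of the sketch):

```python
import numpy as np

# Backward-difference discretization of dX/dt = eps_t - alpha * X_t:
#   X_t = (eps_t + X_{t-1}) / (1 + alpha),
# i.e. an AR(1) with effective coefficient a' = 1/(1 + alpha) < 1.
alpha, N = 0.25, 1000
rng = np.random.default_rng(3)
eps = rng.standard_normal(N)

x = np.zeros(N)
for t in range(1, N):
    x[t] = (eps[t] + x[t - 1]) / (1.0 + alpha)

print("effective AR(1) coefficient:", 1.0 / (1.0 + alpha))
```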

The difference equation can be solved assuming X_0 = 0, yielding

$$X_t = \epsilon_t + a\epsilon_{t-1} + a^2\epsilon_{t-2} + \cdots + a^{t-1}\epsilon_1$$

If E[ε_t] = µ, then:

$$E[X_t] = \mu(1 + a + \cdots + a^{t-1}) = \begin{cases} \mu\left(\frac{1-a^t}{1-a}\right), & a \ne 1; \\ \mu t, & a = 1. \end{cases}$$

If µ = 0, this vanishes and X_t is first order stationary; otherwise it is not. However, if |a| < 1, then

$$E[X_t] \to \frac{\mu}{1-a} \quad (t \to \infty)$$

and hence X_t is asymptotically first order stationary.

If var[ε_t] = σ² and cov[ε_t, ε_s] = 0 for t ≠ s, we have

$$\mathrm{var}[X_t] = \begin{cases} \sigma^2\left(\frac{1-a^{2t}}{1-a^2}\right), & |a| \ne 1; \\ \sigma^2 t, & |a| = 1. \end{cases}$$

$$\mathrm{cov}[X_t, X_{t+r}] = \begin{cases} \sigma^2 a^r\left(\frac{1-a^{2t}}{1-a^2}\right), & |a| \ne 1; \\ \sigma^2 t, & |a| = 1. \end{cases}$$

This is not second order stationary unless σ² = 0, but it is asymptotically so if |a| < 1.
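A minimal numerical check of this asymptotic result (a and σ² are illustrative): the time-dependent variance approaches the stationary value σ²/(1 − a²), and a Monte Carlo simulation started at X_0 = 0 agrees.

```python
import numpy as np

a, sigma2, t = 0.8, 1.0, 200
# Exact time-dependent variance from the formula above ...
var_t = sigma2 * (1 - a ** (2 * t)) / (1 - a ** 2)
# ... versus the asymptotic (stationary) variance.
var_inf = sigma2 / (1 - a ** 2)
print(var_t, var_inf)       # essentially equal for t = 200

# Monte Carlo confirmation over many realizations with X_0 = 0.
rng = np.random.default_rng(4)
eps = rng.standard_normal((5000, t))
x = np.zeros(5000)
for k in range(t):
    x = a * x + eps[:, k]   # X_t = a X_{t-1} + eps_t
print(np.var(x))            # ~ var_inf
```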

(Figure: the acvs S_τ plotted against lag τ.)


The AR process easily generalizes to order p:

$$X_t + a_1 X_{t-1} + \cdots + a_p X_{t-p} = \epsilon_t$$

Let z denote the unit delay operator. Then

$$(1 + a_1 z + \cdots + a_p z^p)\, X_t = \epsilon_t$$

This is asymptotically stationary if the roots of the z polynomial lie outside the unit circle. (For the AR(1) case above, the root of 1 − az is at z = 1/a, which lies outside the unit circle exactly when |a| < 1.)
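A sketch of this root test under the delay-operator convention used here (the coefficient values below are illustrative):

```python
import numpy as np

def ar_is_stationary(a):
    """a = [a1, ..., ap] from X_t + a1 X_{t-1} + ... + ap X_{t-p} = eps_t.
    Asymptotically stationary iff every root of 1 + a1 z + ... + ap z^p
    lies outside the unit circle."""
    coeffs = list(a)[::-1] + [1.0]   # highest-degree coefficient first
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

# AR(1): X_t - 0.5 X_{t-1} = eps_t -> a1 = -0.5, root at z = 2 (outside).
print(ar_is_stationary([-0.5]))   # True
# a1 = -1.2: root at z = 1/1.2, inside the unit circle -> not stationary.
print(ar_is_stationary([-1.2]))   # False
```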

An AR process is a finite linear combination of its past values and the current value of a random process. The present noise value ε_t is drawn into the process and hence influences the present and all future values X_t, X_{t+1}, …. This can be shown by recursively solving the AR(p) equation:

$$X_t = \sum_{j=0}^{\infty} \theta_j \epsilon_{t-j}, \quad \text{with } \theta_0 = 1$$

This shows why the acvs for an AR process dies out gradually with lag and never reaches zero.

An MA (moving average) process is a linear combination of present and past values of a noise process, with a finite extent:

$$X_t = b_0 \epsilon_t + b_1 \epsilon_{t-1} + \cdots + b_p \epsilon_{t-p}$$

A given noise term ε_t influences only X_t, …, X_{t+p}, and hence the acvs for an MA process will vanish beyond some finite value of lag:

$$\mathrm{cov}[X_t, X_{t+\tau}] = \sum_{j=0}^{p} \sum_{k=0}^{p} b_j b_k\, E[\epsilon_{t-j}\, \epsilon_{t+\tau-k}] = \sigma^2 \sum_{j=0}^{p-\tau} b_j b_{j+\tau}$$


where var[ε_t] = σ². Since cov[X_t, X_{t−τ}] = cov[X_t, X_{t+τ}], an MA process is stationary with acvs:

$$S_\tau = \begin{cases} \sigma^2 \sum_{j=0}^{p-|\tau|} b_j b_{j+|\tau|}, & |\tau| \le p; \\ 0, & |\tau| > p. \end{cases} \tag{2.1}$$

There are no restrictions on the size of the b_j.
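A sketch comparing formula (2.1) against a sample estimate, for an MA(2) with arbitrarily chosen coefficients (the b_j below are illustrative, not from the notes):

```python
import numpy as np

b = np.array([1.0, 0.5, -0.3])          # b_0, b_1, b_2: illustrative MA(2)
sigma2, p = 1.0, len(b) - 1

def ma_acvs(tau):
    """Formula (2.1): sigma^2 * sum_{j=0}^{p-|tau|} b_j b_{j+|tau|}."""
    tau = abs(tau)
    return sigma2 * np.sum(b[:p - tau + 1] * b[tau:]) if tau <= p else 0.0

rng = np.random.default_rng(5)
eps = rng.standard_normal(100000)
x = np.convolve(eps, b, mode="valid")   # X_t = sum_j b_j eps_{t-j}

for tau in range(p + 2):
    est = np.mean((x[:len(x) - tau] - x.mean()) * (x[tau:] - x.mean()))
    print(tau, ma_acvs(tau), round(est, 3))   # matches; zero beyond lag p
```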

It can be shown that an AR(p) process is equivalent to an infinite order MA process, and vice versa.

Mixed AR + MA processes, called ARMA processes, also exist. Spectral estimators exist which are based on AR, MA, and ARMA models. These are called parametric estimators because their result depends on the model (i.e., AR of order p, etc.). AR models are also called maximum entropy models. None of these work satisfactorily with geophysical data except in pathological cases. The problem is that no test exists to determine which model or which order is appropriate. Failure to use the correct model/order gives wildly wrong answers, as shown in Figure 2.1.

Note: The MATLAB online help states that the parametric methods give better results for estimation of the spectrum. That claim is based on an example shown there which is itself generated from an AR model; logically the parametric methods will be better in that case than the non-parametric ones.

More nonsense has been written about the superior resolving power of AR or MEM (maximum entropy) than about anything else in geophysics (see any issue of JGR from the 1970s). As an example, see Figure 2.1.

Estimation of the mean or acvs using observations from a single realization is based on replacing ensemble averages with time averages.


Figure 2.1: Examples of AR (top), MA (middle) and ARMA (bottom) processes (solid lines) and their approximations by AR, MA and ARMA models. Note how, without any knowledge of the underlying process, these parametric methods fail to recover the real spectrum.


Estimates which “converge” under this interchange are called ergodic. The ergodic assumption is typically applied without justification throughout spectral analysis.
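A closing sketch of the ergodic interchange for a stationary AR(1) with mean µ (all parameters illustrative): the time average over one long realization agrees with the ensemble average at a fixed time across many realizations.

```python
import numpy as np

a, mu = 0.7, 2.0
rng = np.random.default_rng(6)

def ar1_path(T):
    """One realization of X_t = mu + a (X_{t-1} - mu) + eps_t, started at mu."""
    x = np.empty(T)
    x[0] = mu
    eps = rng.standard_normal(T)
    for t in range(1, T):
        x[t] = mu + a * (x[t - 1] - mu) + eps[t]
    return x

# Time average over a single long realization ...
print(ar1_path(200000).mean())                             # ~ mu
# ... versus an ensemble average at a fixed time over many realizations.
print(np.mean([ar1_path(500)[-1] for _ in range(1000)]))   # ~ mu
```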
