
Slide 1

Last lecture:

1 ARMA(p,q) models: stationarity, causality, invertibility

2 The linear process representation of ARMA processes: ψ

3 Autocovariance of an ARMA process

4 Homogeneous linear difference equations

Slide 2

Introduction to Time Series Analysis Lecture 7.

Peter Bartlett

1 Review: ARMA(p,q) models and their properties

2 Review: Autocovariance of an ARMA process

3 Homogeneous linear difference equations

Forecasting

1 Linear prediction

2 Projection in Hilbert space

Slide 3

An ARMA(p,q) process {Xt} is a stationary process that satisfies φ(B)Xt = θ(B)Wt, where {Wt} ∼ WN(0, σw²).

Slide 4

Review: Properties of ARMA(p,q) models

Theorem: If φ and θ have no common factors, a (unique) stationary solution to φ(B)Xt = θ(B)Wt exists iff the roots of φ(z) avoid the unit circle.

Slide 6

Review: Autocovariance functions of ARMA processes

We need to solve the homogeneous linear difference equation

φ(B)γ(h) = 0 (h > q),

with initial conditions φ(B)γ(h) = σw² Σ_{h ≤ j ≤ q} θj ψ_{j−h} for 0 ≤ h ≤ q.

Slide 7

2 Review: Autocovariance of an ARMA process.

3 Homogeneous linear difference equations

Forecasting

1 Linear prediction

2 Projection in Hilbert space

Slide 8

Homogeneous linear diff eqns with constant coefficients

Slide 9

a(B)xt = 0 ⇔ (B − z1)(B − z2) · · · (B − zk)xt = 0.

So any {xt} satisfying (B − zi)xt = 0 for some i also satisfies a(B)xt = 0.

Three cases:

1 The zi are real and distinct

2 The zi are complex and distinct

3 Some zi are repeated

Slide 10

Homogeneous linear diff eqns with constant coefficients

1 The zi are real and distinct.

Slide 11

1 The zi are real and distinct; e.g., z1 = 1.2, z2 = −1.3.

[Figure: sample solutions c1 z1^{−t} + c2 z2^{−t}, for (c1, c2) = (0, 1) and (c1, c2) = (−0.8, −0.2).]
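The coefficient pairs above can be checked numerically. The sketch below (my own illustration, not part of the slides) verifies that x_t = c1 z1^{−t} + c2 z2^{−t} satisfies the factored equation (B − z1)(B − z2) x_t = 0, i.e., x_{t−2} − (z1 + z2) x_{t−1} + z1 z2 x_t = 0.

```python
# A sequence x_t = c1*z1**(-t) + c2*z2**(-t) solves
# x_{t-2} - (z1 + z2)*x_{t-1} + z1*z2*x_t = 0.
z1, z2 = 1.2, -1.3          # the distinct real roots from the slide
c1, c2 = -0.8, -0.2         # one of the coefficient pairs shown

x = [c1 * z1**(-t) + c2 * z2**(-t) for t in range(10)]

for t in range(2, 10):
    residual = x[t-2] - (z1 + z2) * x[t-1] + z1 * z2 * x[t]
    assert abs(residual) < 1e-12
```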

Slide 12

Reminder: Complex exponentials

a + ib = reiθ = r(cos θ + i sin θ),
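As a quick numerical illustration (not part of the original slides), the polar form can be checked with Python's cmath module:

```python
import cmath
import math

# a + ib = r e^{i theta} = r (cos theta + i sin theta); check for a sample point
z = 1.2 + 1.0j                          # an arbitrary complex number
r, theta = abs(z), cmath.phase(z)
reconstructed = r * (math.cos(theta) + 1j * math.sin(theta))
assert abs(reconstructed - z) < 1e-12
```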

Slide 13

2 The zi are complex and distinct.

Slide 14

Homogeneous linear diff eqns with constant coefficients

2 The zi are complex and distinct e.g., z1 = 1.2 + i, z2 = 1.2 − i
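Although the roots are complex, the combination c z1^{−t} + c̄ z̄1^{−t} = 2 Re(c z1^{−t}) is real-valued. A small check (my own sketch; the coefficient c is hypothetical) that it solves x_{t−2} − 2 Re(z1) x_{t−1} + |z1|² x_t = 0:

```python
# complex conjugate roots give real, oscillating solutions
z1 = 1.2 + 1.0j                 # root from the slide's example
c = 0.5 + 0.3j                  # hypothetical coefficient
x = [2 * (c * z1**(-t)).real for t in range(12)]   # = c z1^{-t} + conj(c) conj(z1)^{-t}

# (B - z1)(B - z1bar) x_t = x_{t-2} - 2 Re(z1) x_{t-1} + |z1|^2 x_t = 0
for t in range(2, 12):
    residual = x[t-2] - 2 * z1.real * x[t-1] + abs(z1)**2 * x[t]
    assert abs(residual) < 1e-12
```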

Slide 15

2 The zi are complex and distinct e.g., z1 = 1 + 0.1i, z2 = 1 − 0.1i

Slide 16

Homogeneous linear diff eqns with constant coefficients

3 Some zi are repeated.

Slide 17

3 Some zi are repeated; e.g., z1 = z2 = 1.5.

[Figure: sample solutions (c1 + c2 t) z1^{−t}, for (c1, c2) = (0, 2) and (c1, c2) = (−0.2, −0.8).]
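In the repeated-root case the solution carries a polynomial coefficient. The sketch below (illustration only) verifies that x_t = (c1 + c2 t) z^{−t} solves (B − z)² x_t = 0 for one of the coefficient pairs shown:

```python
# repeated root z1 = z2 = z: solutions are (c1 + c2*t) z^{-t}
z = 1.5
c1, c2 = -0.2, -0.8             # a coefficient pair from the figure
x = [(c1 + c2 * t) * z**(-t) for t in range(12)]

# (B - z)^2 x_t = x_{t-2} - 2 z x_{t-1} + z^2 x_t = 0
for t in range(2, 12):
    residual = x[t-2] - 2 * z * x[t-1] + z * z * x[t]
    assert abs(residual) < 1e-12
```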

Slide 18

Solving linear diff eqns with constant coefficients

where z1, z2, …, zl ∈ C are the roots of the characteristic polynomial, and zi occurs with multiplicity mi.

Solutions: xt = c1(t) z1^{−t} + c2(t) z2^{−t} + · · · + cl(t) zl^{−t}, where ci(t) is a polynomial in t of degree mi − 1.

We determine the coefficients of the ci(t) using the initial conditions (which might be linear constraints on the initial values x1, …, xk).
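For instance, with two distinct real roots, the initial values x0 and x1 pin down the constants c1, c2 through a 2×2 linear system. A minimal sketch (the initial values here are hypothetical):

```python
# solve for c1, c2 in x_t = c1*z1^{-t} + c2*z2^{-t} given x_0, x_1
z1, z2 = 1.2, -1.3              # distinct real roots
x0, x1 = 1.0, 0.5               # hypothetical initial conditions

# the system: c1 + c2 = x_0  and  c1/z1 + c2/z2 = x_1
det = 1.0 / z2 - 1.0 / z1
c2 = (x1 - x0 / z1) / det
c1 = x0 - c2

x = [c1 * z1**(-t) + c2 * z2**(-t) for t in range(10)]
assert abs(x[0] - x0) < 1e-12 and abs(x[1] - x1) < 1e-12
# the resulting sequence satisfies x_{t-2} - (z1+z2) x_{t-1} + z1 z2 x_t = 0
for t in range(2, 10):
    assert abs(x[t-2] - (z1 + z2) * x[t-1] + z1 * z2 * x[t]) < 1e-12
```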

Slide 19


Autocovariance functions of ARMA processes: Example

We have the homogeneous linear difference equation

γ(h) + 0.25γ(h − 2) = 0

for h ≥ 2, with initial conditions

γ(0) + 0.25γ(−2) = σw²(1 + 1/25)
γ(1) + 0.25γ(−1) = σw²/5

Slide 21

Homogeneous lin diff eqn:

γ(h) + 0.25γ(h − 2) = 0,

which has roots at z1 = 2e^{iπ/2}, z̄1 = 2e^{−iπ/2}.

The solution is of the form

γ(h) = c z1^{−h} + c̄ z̄1^{−h}.

Slide 22

Autocovariance functions of ARMA processes: Example

z1 = 2e^{iπ/2}, z̄1 = 2e^{−iπ/2}, c = |c|e^{iθ}.

And we determine c1, θ from the initial conditions

γ(0) + 0.25γ(−2) = σw²(1 + 1/25)
γ(1) + 0.25γ(−1) = σw²/5

Slide 23

We determine c1, θ from the initial conditions:

We plug

γ(0) = c1 cos(θ), γ(1) = (c1/2) sin(θ), γ(2) = −(c1/4) cos(θ)

into

γ(0) + 0.25γ(2) = σw²(1 + 1/25)
1.25γ(1) = σw²/5
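The steps above can be carried out numerically. This sketch (my own, assuming σw² = 1) uses the closed form γ(h) = c1 2^{−h} cos(hπ/2 − θ) implied by the conjugate roots, solves the two initial conditions for c1 and θ, and then checks both the recursion and the initial conditions:

```python
import math

sw2 = 1.0                                  # assume sigma_w^2 = 1
# the two initial conditions give c1*cos(theta) and c1*sin(theta):
a = sw2 * (1 + 1/25) / (1 - 1/16)          # c1 cos(theta), from gamma(0) + 0.25 gamma(2)
b = 2 * (sw2 / 5) / 1.25                   # c1 sin(theta), from 1.25 gamma(1)
theta = math.atan2(b, a)
c1 = math.hypot(a, b)

def gamma(h):
    return c1 * 2.0**(-h) * math.cos(h * math.pi / 2 - theta)

# gamma satisfies the homogeneous recursion and both initial conditions
for h in range(2, 20):
    assert abs(gamma(h) + 0.25 * gamma(h - 2)) < 1e-12
assert abs(gamma(0) + 0.25 * gamma(2) - sw2 * (1 + 1/25)) < 1e-12
assert abs(1.25 * gamma(1) - sw2 / 5) < 1e-12
```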

Slide 24

Autocovariance functions of ARMA processes: Example

[Figure: plot of the autocovariance function γ(h).]

Slide 25

2 Review: Autocovariance of an ARMA process.

3 Homogeneous linear difference equations

Forecasting

1 Linear prediction

2 Projection in Hilbert space

Slide 26

Review: least squares linear prediction

Consider a linear predictor of Xn+h given Xn = xn:

Slide 27

Given X1, X2, …, Xn, the best linear predictor of Xn+m is the linear combination of X1, …, Xn that minimizes the mean squared prediction error.

Slide 29

Hilbert space = complete inner product space.

Inner product space: vector space, with inner product ⟨a, b⟩:

• ⟨a, b⟩ = ⟨b, a⟩,

• ⟨α1a1 + α2a2, b⟩ = α1⟨a1, b⟩ + α2⟨a2, b⟩,

• ⟨a, a⟩ = 0 ⇔ a = 0.

Norm: ‖a‖² = ⟨a, a⟩.

Complete = limits of Cauchy sequences are in the space.
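A toy numerical check of these axioms for the Euclidean inner product on R³ (illustration only; the vectors are made up):

```python
# Euclidean inner product <a, b> = sum_i a_i b_i on R^3
def ip(a, b):
    return sum(x * y for x, y in zip(a, b))

a  = [1.0, -2.0, 3.0]
a2 = [2.0, 0.0, 1.0]
b  = [0.5, 4.0, -1.0]
al1, al2 = 0.7, -1.3

assert ip(a, b) == ip(b, a)                                  # symmetry
lin = ip([al1 * u + al2 * v for u, v in zip(a, a2)], b)
assert abs(lin - (al1 * ip(a, b) + al2 * ip(a2, b))) < 1e-12  # linearity
assert ip(a, a) > 0                                           # <a,a> = 0 only at a = 0
norm_sq = ip(a, a)                                            # ||a||^2 = <a, a>
```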

Slide 30

Projection theorem

Example: Linear regression

Given y = (y1, y2, …, yn)′ ∈ R^n, and Z = (z1, …, zq) ∈ R^{n×q}, choose β = (β1, …, βq)′ ∈ R^q to minimize ‖y − Zβ‖².

Here, H = R^n, with ⟨a, b⟩ = Σi ai bi, and

M = {Zβ : β ∈ R^q} = s̄p{z1, …, zq}.
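The projection theorem says the residual y − Zβ̂ is orthogonal to M, i.e., to every column of Z — exactly the normal equations. A self-contained sketch with made-up data:

```python
# least squares as projection: solve the normal equations Z'Z beta = Z'y
y = [1.0, 2.0, 2.0]
Z = [[1.0, 0.0],
     [1.0, 1.0],
     [1.0, 2.0]]              # columns: intercept and one regressor (made up)

ZtZ = [[sum(Z[i][r] * Z[i][c] for i in range(3)) for c in range(2)] for r in range(2)]
Zty = [sum(Z[i][r] * y[i] for i in range(3)) for r in range(2)]
det = ZtZ[0][0] * ZtZ[1][1] - ZtZ[0][1] * ZtZ[1][0]
beta = [(ZtZ[1][1] * Zty[0] - ZtZ[0][1] * Zty[1]) / det,
        (ZtZ[0][0] * Zty[1] - ZtZ[1][0] * Zty[0]) / det]

# projection theorem: the residual is orthogonal to each column z_j of Z
resid = [y[i] - sum(Z[i][c] * beta[c] for c in range(2)) for i in range(3)]
for c in range(2):
    assert abs(sum(resid[i] * Z[i][c] for i in range(3))) < 1e-12
```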

Slide 33

Example: Linear prediction

Slide 35

Let X^n_{n+m} denote the best linear predictor:

E[(X^n_{n+m} − Xn+m) Xi] = 0, for i = 1, …, n.

That is, the prediction errors (X^n_{n+m} − Xn+m) are uncorrelated with the observed values X1, …, Xn.
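For a concrete case (my own sketch, with hypothetical parameters), take an AR(1), Xt = φX_{t−1} + Wt, whose one-step predictor is φXn; the orthogonality conditions reduce to γ(j + 1) = φγ(j):

```python
# AR(1): gamma(h) = sigma^2 * phi^|h| / (1 - phi^2); predictor of X_{n+1} is phi*X_n
phi, s2 = 0.6, 1.0              # hypothetical AR coefficient and noise variance
def gamma(h):
    return s2 * phi**abs(h) / (1 - phi * phi)

# E[(phi*X_n - X_{n+1}) X_{n-j}] = phi*gamma(j) - gamma(j+1) = 0 for all j >= 0
for j in range(6):
    assert abs(phi * gamma(j) - gamma(j + 1)) < 1e-12
```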

Slide 38

One-step-ahead linear prediction

Slide 39

φn = (φn1, φn2, …, φnn)′, γn = (γ(1), γ(2), …, γ(n))′.

Slide 40

Mean squared error of one-step-ahead linear prediction

Slide 41

= Var(Xn+1) − Cov(Xn+1, X) Cov(X, X)^{−1} Cov(X, Xn+1)
= E(Xn+1 − 0)² − Cov(Xn+1, X) Cov(X, X)^{−1} Cov(X, Xn+1),

where X = (Xn, Xn−1, …, X1)′.
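This formula can be exercised on a hypothetical AR(1): with n = 2 observations, solving Γn φn = γn puts all weight on the last observation, and the mean squared error reduces to the innovation variance σw². (My own sketch, not from the slides.)

```python
# one-step prediction for an AR(1) with phi = 0.6, sigma_w^2 = 1 (hypothetical)
phi, s2 = 0.6, 1.0
def gamma(h):
    return s2 * phi**abs(h) / (1 - phi * phi)

# n = 2: solve Gamma_n phi_n = gamma_n for phi_n = (phi_n1, phi_n2)'
G = [[gamma(0), gamma(1)],
     [gamma(1), gamma(0)]]
g = [gamma(1), gamma(2)]
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
phi_n = [(G[1][1] * g[0] - G[0][1] * g[1]) / det,
         (G[0][0] * g[1] - G[1][0] * g[0]) / det]

assert abs(phi_n[0] - phi) < 1e-12     # weight on X_n is phi
assert abs(phi_n[1]) < 1e-12           # X_{n-1} gets no weight

# MSE = gamma(0) - gamma_n' Gamma_n^{-1} gamma_n = sigma_w^2
mse = gamma(0) - sum(phi_n[i] * g[i] for i in range(2))
assert abs(mse - s2) < 1e-12
```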

Slide 42

Introduction to Time Series Analysis Lecture 7.

Peter Bartlett

1 Review: ARMA(p,q) models and their properties

2 Review: Autocovariance of an ARMA process

3 Homogeneous linear difference equations

Forecasting

1 Linear prediction

2 Projection in Hilbert space

Date posted: 09/03/2021, 07:19