Last lecture:
1 ARMA(p,q) models: stationarity, causality, invertibility
2 The linear process representation of ARMA processes: ψ
3 Autocovariance of an ARMA process
4 Homogeneous linear difference equations
Introduction to Time Series Analysis. Lecture 7.
Peter Bartlett
1 Review: ARMA(p,q) models and their properties
2 Review: Autocovariance of an ARMA process
3 Homogeneous linear difference equations
Forecasting
1 Linear prediction
2 Projection in Hilbert space
An ARMA(p,q) process {Xt} is a stationary process that satisfies
    Xt − φ1 Xt−1 − · · · − φp Xt−p = Wt + θ1 Wt−1 + · · · + θq Wt−q,
where {Wt} ∼ WN(0, σw²).
Review: Properties of ARMA(p,q) models
Theorem: If φ and θ have no common factors, a (unique) stationary solution to φ(B)Xt = θ(B)Wt exists iff the roots of φ(z) avoid the unit circle. It is causal iff all roots of φ(z) lie outside the unit circle, and invertible iff all roots of θ(z) lie outside the unit circle.
Review: Autocovariance functions of ARMA processes
We need to solve the homogeneous linear difference equation
φ(B)γ(h) = 0 for h > q, with initial conditions
    φ(B)γ(h) = σw² Σ_{j=h}^{q} θj ψ_{j−h} for 0 ≤ h ≤ q.
Homogeneous linear diff eqns with constant coefficients

a(B)xt = 0 ⇔ (B − z1)(B − z2) · · · (B − zk)xt = 0,
where z1, . . . , zk are the roots of the characteristic polynomial a(z).
So any {xt} satisfying (B − zi)xt = 0 for some i also satisfies a(B)xt = 0.
Three cases:
1 The zi are real and distinct
2 The zi are complex and distinct
3 Some zi are repeated
Homogeneous linear diff eqns with constant coefficients

1. The zi are real and distinct: (B − zi)xt = 0 gives xt = zi^{−1} xt−1, so xt = c zi^{−t}; the general solution is xt = c1 z1^{−t} + c2 z2^{−t} + · · · + ck zk^{−t}.
Example: z1 = 1.2, z2 = −1.3, so xt = c1 (1.2)^{−t} + c2 (−1.3)^{−t}.
[Figure: solutions for (c1, c2) = (0, 1) and (c1, c2) = (−0.8, −0.2).]
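As a quick sanity check (not part of the original slides), the example solution can be verified numerically. The coefficient pair below is one of those plotted in the figure, but any (c1, c2) works:

```python
# Check numerically that x_t = c1*z1**(-t) + c2*z2**(-t) solves
# (B - z1)(B - z2) x_t = x_{t-2} - (z1 + z2)*x_{t-1} + z1*z2*x_t = 0
# for the roots z1 = 1.2, z2 = -1.3 from the example.
z1, z2 = 1.2, -1.3
c1, c2 = -0.8, -0.2  # one of the coefficient pairs shown in the figure

def x(t):
    return c1 * z1**(-t) + c2 * z2**(-t)

for t in range(2, 20):
    residual = x(t - 2) - (z1 + z2) * x(t - 1) + z1 * z2 * x(t)
    assert abs(residual) < 1e-9, (t, residual)
```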
Reminder: Complex exponentials
a + ib = re^{iθ} = r(cos θ + i sin θ), where r = |a + ib| = √(a² + b²) and θ = arg(a + ib).
2. The zi are complex and distinct: since a(z) has real coefficients, complex roots occur in conjugate pairs zi, z̄i, and real solutions take the form xt = c zi^{−t} + c̄ z̄i^{−t} = 2|c| |zi|^{−t} cos(t arg(zi) − arg(c)).
Homogeneous linear diff eqns with constant coefficients

Example: z1 = 1.2 + i, z2 = 1.2 − i (damped oscillations, since |z1| > 1).
Example: z1 = 1 + 0.1i, z2 = 1 − 0.1i (slowly damped oscillations, since |z1| is close to 1).
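A short numerical check of the conjugate-pair case, using the first example's roots; the value of c is an arbitrary illustration, not from the slides:

```python
import cmath
import math

# Conjugate roots as in the example: z1 = 1.2 + i, z2 = conjugate(z1).
# A real-valued solution is x_t = c*z1**(-t) + conj(c)*conj(z1)**(-t)
#                               = 2|c| * |z1|**(-t) * cos(t*arg(z1) - arg(c)).
z1 = 1.2 + 1j
c = 0.5 * cmath.exp(0.3j)   # illustrative c = |c| e^{i theta}; any c works

def x(t):
    return (c * z1**(-t) + c.conjugate() * z1.conjugate()**(-t)).real

r, w = abs(z1), cmath.phase(z1)
for t in range(15):
    # the same solution written as a damped cosine
    cosine_form = 2 * abs(c) * r**(-t) * math.cos(t * w - cmath.phase(c))
    assert abs(x(t) - cosine_form) < 1e-9
    if t >= 2:
        # real second-order recurrence: x_{t-2} - 2 Re(z1) x_{t-1} + |z1|^2 x_t = 0
        residual = x(t - 2) - 2 * z1.real * x(t - 1) + abs(z1)**2 * x(t)
        assert abs(residual) < 1e-9
```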
Homogeneous linear diff eqns with constant coefficients

3. Some zi are repeated: a root zi of multiplicity m contributes solutions of the form (c1 + c2 t + · · · + cm t^{m−1}) zi^{−t}.
Example: z1 = z2 = 1.5, so xt = (c1 + c2 t)(1.5)^{−t}.
[Figure: solutions for (c1, c2) = (0, 2) and (c1, c2) = (−0.2, −0.8).]
Solving linear diff eqns with constant coefficients

The general solution of a(B)xt = 0 is
    xt = c1(t) z1^{−t} + c2(t) z2^{−t} + · · · + cl(t) zl^{−t},
where z1, z2, . . . , zl ∈ C are the distinct roots of the characteristic polynomial, zi occurs with multiplicity mi, and ci(t) is a polynomial in t of degree mi − 1.
We determine the coefficients of the ci(t) using the initial conditions (which might be linear constraints on the initial values x1, . . . , xk).
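The repeated-root case can be checked the same way as the others (a sanity check, not from the slides): the degree-1 polynomial c1 + c2 t handles the multiplicity-2 root of the earlier example, with coefficients taken from the figure's legend:

```python
# Verify that x_t = (c1 + c2*t) * z**(-t) solves (B - z)^2 x_t
#   = x_{t-2} - 2*z*x_{t-1} + z**2 * x_t = 0
# for the repeated root z = 1.5 from the example.
z = 1.5
c1, c2 = -0.2, -0.8  # one of the coefficient pairs shown in the figure

def x(t):
    return (c1 + c2 * t) * z**(-t)

for t in range(2, 20):
    residual = x(t - 2) - 2 * z * x(t - 1) + z**2 * x(t)
    assert abs(residual) < 1e-9, (t, residual)
```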
Autocovariance functions of ARMA processes: Example

We have the homogeneous linear difference equation
    γ(h) + 0.25 γ(h − 2) = 0 for h ≥ 2,
with initial conditions
    γ(0) + 0.25 γ(−2) = σw² (1 + 1/25),
    γ(1) + 0.25 γ(−1) = σw²/5.
Homogeneous lin diff eqn: γ(h) + 0.25 γ(h − 2) = 0, with characteristic polynomial 1 + 0.25 z², which has roots at z1 = 2e^{iπ/2}, z̄1 = 2e^{−iπ/2}.
The solution is of the form
    γ(h) = c z1^{−h} + c̄ z̄1^{−h}.
Autocovariance functions of ARMA processes: Example

With z1 = 2e^{iπ/2}, z̄1 = 2e^{−iπ/2}, and c = |c|e^{iθ}, writing c1 = 2|c| gives
    γ(h) = c1 2^{−h} cos(hπ/2 − θ).
We determine c1, θ from the initial conditions
    γ(0) + 0.25 γ(−2) = σw² (1 + 1/25),
    γ(1) + 0.25 γ(−1) = σw²/5.
We determine c1, θ from the initial conditions: we plug
    γ(0) = c1 cos(θ),  γ(1) = (c1/2) sin(θ),  γ(2) = −(c1/4) cos(θ)
into
    γ(0) + 0.25 γ(2) = σw² (1 + 1/25),
    1.25 γ(1) = σw²/5.
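The damped-cosine form γ(h) = c1 2^{−h} cos(hπ/2 − θ) satisfies the difference equation for any c1, θ, which is easy to confirm numerically (the values below are illustrative, not the ones determined by the initial conditions):

```python
import math

# Check that gamma(h) = c1 * 2**(-h) * cos(h*pi/2 - theta) satisfies
# gamma(h) + 0.25*gamma(h - 2) = 0 for h >= 2, for arbitrary c1, theta.
c1, theta = 1.7, 0.4  # illustrative values only

def gamma(h):
    return c1 * 2.0**(-h) * math.cos(h * math.pi / 2 - theta)

for h in range(2, 25):
    assert abs(gamma(h) + 0.25 * gamma(h - 2)) < 1e-12
```

The cancellation works because replacing h by h − 2 shifts the cosine's phase by π, flipping its sign, while 2^{−(h−2)} = 4 · 2^{−h} exactly offsets the factor 0.25.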
Autocovariance functions of ARMA processes: Example

[Figure: the autocovariance function γ(h), a damped cosine decaying like 2^{−h}.]
Review: least squares linear prediction
Consider a linear predictor of Xn+h given Xn = xn, chosen to minimize the mean squared error E(Xn+h − f(Xn))².
Given X1, X2, . . . , Xn, the best linear predictor of Xn+m is the linear combination of X1, . . . , Xn that minimizes the mean squared prediction error.
Hilbert space = complete inner product space.
Inner product space: a vector space with an inner product ⟨a, b⟩ satisfying
• ⟨a, b⟩ = ⟨b, a⟩,
• ⟨α1 a1 + α2 a2, b⟩ = α1 ⟨a1, b⟩ + α2 ⟨a2, b⟩,
• ⟨a, a⟩ ≥ 0, and ⟨a, a⟩ = 0 ⇔ a = 0.
Norm: ‖a‖² = ⟨a, a⟩.
Complete = limits of Cauchy sequences are in the space.
Projection theorem
If M is a closed subspace of a Hilbert space H and y ∈ H, then there is a unique ŷ ∈ M minimizing ‖y − m‖ over m ∈ M, characterized by ⟨y − ŷ, m⟩ = 0 for all m ∈ M.

Example: Linear regression
Given y = (y1, y2, . . . , yn)′ ∈ R^n and Z = (z1, . . . , zq) ∈ R^{n×q}, choose β = (β1, . . . , βq)′ ∈ R^q to minimize ‖y − Zβ‖².
Here, H = R^n with ⟨a, b⟩ = Σᵢ aᵢ bᵢ, and
    M = {Zβ : β ∈ R^q} = s̄p{z1, . . . , zq}.
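A minimal sketch of this projection view of least squares, with made-up data (the values of y, z1, z2 are hypothetical); the key point is that at the minimizer the residual is orthogonal to each column of Z:

```python
# Least squares as projection: choose beta to minimize ||y - Z beta||^2.
# At the optimum the residual y - Z beta is orthogonal to every column
# of Z (these are the normal equations Z'Z beta = Z'y).
# Hypothetical data: n = 4 observations, q = 2 regressors.
y  = [1.0, 2.0, 2.5, 4.0]
z1 = [1.0, 1.0, 1.0, 1.0]   # intercept column
z2 = [0.0, 1.0, 2.0, 3.0]

def dot(a, b):
    return sum(u * v for u, v in zip(a, b))

# Solve the 2x2 normal equations Z'Z beta = Z'y by hand.
a11, a12, a22 = dot(z1, z1), dot(z1, z2), dot(z2, z2)
b1, b2 = dot(z1, y), dot(z2, y)
det = a11 * a22 - a12 * a12
beta1 = (a22 * b1 - a12 * b2) / det
beta2 = (a11 * b2 - a12 * b1) / det

resid = [yi - beta1 * u - beta2 * v for yi, u, v in zip(y, z1, z2)]
# Orthogonality: <y - Z beta, z_i> = 0 for each column z_i.
assert abs(dot(resid, z1)) < 1e-12
assert abs(dot(resid, z2)) < 1e-12
```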
Example: Linear prediction
Let X^n_{n+m} denote the best linear predictor of Xn+m given X1, . . . , Xn. By the projection theorem, it satisfies
    E[(X^n_{n+m} − Xn+m) Xi] = 0 for i = 1, . . . , n.
That is, the prediction errors (X^n_{n+m} − Xn+m) are uncorrelated with the observed values X1, . . . , Xn.
One-step-ahead linear prediction

X^n_{n+1} = φn1 Xn + φn2 Xn−1 + · · · + φnn X1, where the coefficient vector solves the prediction equations Γn φn = γn, with
    φn = (φn1, φn2, . . . , φnn)′, γn = (γ(1), γ(2), . . . , γ(n))′, and Γn = (γ(i − j))_{i,j=1}^n.
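For a concrete check (an illustration, not from the slides), consider an AR(1) with parameter φ, where γ(h) = σw² φ^|h| / (1 − φ²); solving the system Γ2 φ2 = γ2 with n = 2 should put all the weight on the most recent observation:

```python
# One-step prediction for an AR(1): solve Gamma_n phi_n = gamma_n with n = 2.
# For AR(1), gamma(h) = sigma2 * phi**|h| / (1 - phi**2), and the best
# one-step predictor is X^n_{n+1} = phi * X_n, so phi_2 = (phi, 0)'.
phi, sigma2 = 0.6, 1.0  # illustrative parameter values

def g(h):
    return sigma2 * phi**abs(h) / (1 - phi**2)

# Gamma_2 = [[g(0), g(1)], [g(1), g(0)]],  gamma_2 = (g(1), g(2))'
det = g(0)**2 - g(1)**2
phi21 = (g(0) * g(1) - g(1) * g(2)) / det  # weight on X_n
phi22 = (g(0) * g(2) - g(1) * g(1)) / det  # weight on X_{n-1}

assert abs(phi21 - phi) < 1e-12
assert abs(phi22) < 1e-12
```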
Mean squared error of one-step-ahead linear prediction
E(Xn+1 − X^n_{n+1})² = Var(Xn+1) − Cov(Xn+1, X) Cov(X, X)^{−1} Cov(X, Xn+1)
    = E X²_{n+1} − Cov(Xn+1, X) Cov(X, X)^{−1} Cov(X, Xn+1)  (since E Xn+1 = 0),
where X = (Xn, Xn−1, . . . , X1)′.
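Continuing the AR(1) illustration (parameter values are assumptions), the formula reduces to the innovation variance σw², as expected for one-step prediction; with X = X_n alone, the MSE is γ(0) − γ(1)²/γ(0):

```python
# One-step-ahead MSE for an AR(1), predicting from a single past value X = X_n:
#   MSE = Var(X_{n+1}) - Cov(X_{n+1}, X) Cov(X, X)^{-1} Cov(X, X_{n+1})
#       = g(0) - g(1)**2 / g(0),
# which should equal the innovation variance sigma2.
phi, sigma2 = 0.6, 1.0  # illustrative parameter values

def g(h):
    return sigma2 * phi**abs(h) / (1 - phi**2)

mse = g(0) - g(1)**2 / g(0)
assert abs(mse - sigma2) < 1e-12
```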