
Springer Texts in Business and Economics

More information about this series at http://www.springer.com/series/10099


Klaus Neusser

Time Series Econometrics


Library of Congress Control Number: 2016938514

© Springer International Publishing Switzerland 2016

Springer Texts in Business and Economics

This Springer imprint is published by Springer Nature

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made.

Printed on acid-free paper.

The registered company is Springer International Publishing AG Switzerland.


Preface

autoregressive models, the analysis of integrated and cointegrated time series, and models of volatility have been extremely fruitful and far-reaching areas of research. With the award of the Nobel Prizes to Clive W. J. Granger and Robert F. Engle III in 2003 and to Thomas J. Sargent and Christopher A. Sims in 2011, the field has reached a certain degree of maturity. Thus, the idea suggests itself to assemble the vast amount of material scattered over many papers into a comprehensive textbook.

The book is self-contained and addresses economics students who already have some prerequisite knowledge in econometrics. It is thus suited for advanced bachelor, master's, or beginning PhD students, but also for applied researchers. The book tries to put them in a position to follow the rapidly growing research literature and to implement these techniques on their own. Although the book tries to be rigorous in terms of concepts, definitions, and statements of theorems, not all proofs are carried out. This is especially true for the more technical and lengthy proofs, for which the reader is referred to the pertinent literature.

The book covers approximately a two-semester course in time series analysis and is divided in two parts. The first part treats univariate time series, in particular autoregressive moving-average processes. Most of the topics are standard and can form the basis for a one-semester introductory time series course. This part also contains a chapter on integrated processes and on models of volatility. The latter topics could be included in a more advanced course. The second part is devoted to multivariate time series analysis and in particular to vector autoregressive processes. It can be taught independently of the first part. The identification, modeling, and estimation of these processes form the core of the second part. A special chapter treats the estimation, testing, and interpretation of cointegrated systems. The book also contains a chapter with an introduction to state space models and the Kalman filter. Whereas the book is almost exclusively concerned with linear systems, the last chapter gives a perspective on some more recent developments in the context of nonlinear models. I have included exercises and worked-out examples to deepen the teaching and learning content. Finally, I have produced five appendices which summarize important topics such as complex numbers, linear difference equations, and stochastic convergence.

As time series analysis has become a tremendously growing field with active research in many directions, it goes without saying that not all topics received the attention they deserved and that there are areas not covered at all. This is especially true for the recent advances made in nonlinear time series analysis and in the application of Bayesian techniques. These two topics alone would justify an extra book.

The data manipulations and computations have been performed using the software packages EVIEWS and MATLAB.1 Of course, there are other excellent packages available. The data for the examples and additional information can be downloaded from my home page www.neusser.ch. To maximize the learning success, it is advised to replicate the examples and to perform similar exercises with alternative data. Interesting macroeconomic time series can, for example, be downloaded from the following home pages:

Germany: www.bundesbank.de

Switzerland: www.snb.ch

United Kingdom: www.statistics.gov.uk

United States: research.stlouisfed.org/fred2

The book grew out of lectures which I had the occasion to give over the years in Bern and at other universities. It is thus a pleasure to thank the many students, in particular Philip Letsch, who had to work through the manuscript and who called my attention to obscurities and typos. I also want to thank my colleagues and teaching assistants Andreas Bachmann, Gregor Bäurle, Fabrice Collard, Sarah Fischer, Stephan Leist, Senada Nukic, Kurt Schmidheiny, Reto Tanner, and Martin Wagner for reading the manuscript or parts of it and for making many valuable criticisms and comments. Special thanks go to my former colleague and coauthor Robert Kunst, who meticulously read and commented on the manuscript. It goes without saying that all remaining errors and shortcomings are my own responsibility.

Klaus Neusser Bern, Switzerland/Eggenburg, Austria

February 2016


Notation and Symbols

- number of linearly independent cointegration vectors
- partial autocorrelation function of process {X_t}
- partial autocorrelation function
- dimension of stochastic process, respectively dimension of state space
- group of orthogonal n × n matrices
- order of autoregressive polynomial
- order of moving-average polynomial
- integrated process of order d
- vector autoregressive process of order p
- identically and independently normally distributed random variables with mean zero and variance σ²
- time-indexed random variable
- realization of random variable X_t

Contents

1.4.2 Construction of Stochastic Processes: Some Examples
1.4.3 Moving-Average Process of Order One
1.4.4 Random Walk
1.4.5 Changing Mean
1.5 Properties of the Autocovariance Function
1.5.1 Autocovariance Function of MA(1) Processes
1.6 Exercises
2 ARMA Models
2.1 The Lag Operator
2.2 Some Important Special Cases
2.2.1 Moving-Average Process of Order q
2.2.2 First Order Autoregressive Process
2.3 Causality and Invertibility
2.4 Computation of Autocovariance Function
2.4.1 First Procedure
2.4.2 Second Procedure
2.4.3 Third Procedure
2.5 Exercises
3 Forecasting Stationary Processes
3.1 Linear Least-Squares Forecasts
3.1.1 Forecasting with an AR(p) Process
3.1.2 Forecasting with MA(q) Processes
3.1.3 Forecasting from the Infinite Past
3.2 The Wold Decomposition Theorem
4 Estimation of Mean and ACF
4.1 Estimation of the Mean
5 Estimation of ARMA Models
5.1 The Yule-Walker Estimator
5.2 OLS Estimation of an AR(p) Model
5.3 Estimation of an ARMA(p,q) Model
5.4 Estimation of the Orders p and q
5.5 Modeling a Stochastic Process
5.6 Modeling Real GDP of Switzerland
6 Spectral Analysis and Linear Filters
6.1 Spectral Density
6.2 Spectral Decomposition of a Time Series
6.3 The Periodogram and the Estimation of Spectral Densities
6.3.1 Non-Parametric Estimation
6.3.2 Parametric Estimation
6.4 Linear Time-Invariant Filters
6.5 Some Important Filters
6.5.1 Construction of Low- and High-Pass Filters
6.5.2 The Hodrick-Prescott Filter
7.1.2 Variance of Forecast Error
7.1.3 Impulse Response Function
7.1.4 The Beveridge-Nelson Decomposition
7.2 Properties of the OLS Estimator in the Case of Integrated Variables
7.3 Unit-Root Tests
7.3.1 Dickey-Fuller Test
7.3.2 Phillips-Perron Test
7.3.3 Unit-Root Test: Testing Strategy
7.3.4 Examples of Unit-Root Tests
7.4 Generalizations of Unit-Root Tests
7.4.1 Structural Breaks in the Trend Function
7.4.2 Testing for Stationarity
7.5 Regression with Integrated Variables
7.5.1 The Spurious Regression Problem
7.5.2 Bivariate Cointegration
7.5.3 Rules to Deal with Integrated Times Series
8 Models of Volatility
8.1 Specification and Interpretation
8.1.1 Forecasting Properties of AR(1)-Models
8.1.2 The ARCH(1) Model
8.1.3 General Models of Volatility
8.1.4 The GARCH(1,1) Model
8.2 Tests for Heteroskedasticity
8.2.1 Autocorrelation of Quadratic Residuals
8.2.2 Engle's Lagrange-Multiplier Test
8.3 Estimation of GARCH(p,q) Models
8.3.1 Maximum-Likelihood Estimation
8.3.2 Method of Moment Estimation
8.4 Example: Swiss Market Index (SMI)
Part II Multivariate Time Series Analysis
9 Introduction
10 Definitions and Stationarity
11 Estimation of Covariance Function
11.1 Estimators and Asymptotic Distributions
11.2 Testing Cross-Correlations of Time Series
11.3 Some Examples for Independence Tests
12 VARMA Processes
12.1 The VAR(1) Process
12.2 Representation in Companion Form
12.3 Causal Representation
12.4 Computation of Covariance Function
13 Estimation of VAR Models
13.1 Introduction
13.2 The Least-Squares Estimator
13.3 Proofs of Asymptotic Normality
13.4 The Yule-Walker Estimator
14 Forecasting with VAR Models
14.1 Forecasting with Known Parameters
14.1.1 Wold Decomposition Theorem
14.2 Forecasting with Estimated Parameters
14.3 Modeling of VAR Models
14.4 Example: VAR Model
15 Interpretation of VAR Models
15.2.2 Identification: The General Case
15.2.3 Identification: The Case n = 2
15.3 Identification via Short-Run Restrictions
15.4 Interpretation of VAR Models
15.4.1 Impulse Response Functions
15.4.2 Variance Decomposition
15.4.3 Confidence Intervals
15.4.4 Example 1: Advertisement and Sales
15.4.5 Example 2: IS-LM Model with Phillips Curve
15.5 Identification via Long-Run Restrictions
16.2.1 Definition
16.2.2 VAR and VEC Models
16.2.3 Beveridge-Nelson Decomposition
16.2.4 Common Trend Representation
16.3 Johansen's Cointegration Test
16.3.1 Specification of the Deterministic Components
16.3.2 Testing Cointegration Hypotheses
16.4 Estimation and Testing of Cointegrating Relationships
16.5 An Example
17 Kalman Filter
17.1 The State Space Model
17.1.1 Examples
17.2 Filtering and Smoothing
17.2.1 The Kalman Filter
17.2.2 The Kalman Smoother
17.3 Estimation of State Space Models
17.3.1 The Likelihood Function

List of Figures

Fig. 1.1 Real gross domestic product (GDP)
Fig. 1.2 Growth rate of real gross domestic product (GDP)
Fig. 1.3 Swiss real gross domestic product
Fig. 1.4 Short- and long-term Swiss interest rates
Fig. 1.5 Swiss Market Index (SMI): (a) index; (b) daily return
Fig. 1.6 Unemployment rate in Switzerland
Fig. 1.7 Realization of a random walk
Fig. 1.8 Realization of a branching process
Fig. 1.9 Processes constructed from a given white noise process: (a) white noise; (b) moving-average with θ = 0.9; (c) autoregressive with ϕ = 0.9; (d) random walk
Fig. 1.10 Relation between the autocorrelation coefficient of order one, ρ(1), and the parameter θ of a MA(1) process
Fig. 2.1 Realization and estimated ACF of a MA(1) process
Fig. 2.2 Realization and estimated ACF of an AR(1) process
Fig. 2.3 Autocorrelation function of an ARMA(2,1) process
Fig. 3.1 Autocorrelation and partial autocorrelation functions: (a) process 1; (b) process 2; (c) process 3; (d) process 4
Fig. 4.1 Estimated autocorrelation function of a WN(0,1) process
Fig. 4.2 Estimated autocorrelation function of a MA(1) process
Fig. 4.3 Estimated autocorrelation function of an AR(1) process
Fig. 4.4 Estimated PACF of an AR(1) process
Fig. 4.5 Estimated PACF of a MA(1) process
Fig. 4.6 Common kernel functions
Fig. 4.7 Estimated autocorrelation function for the growth rate of GDP
Fig. 5.1 Parameter space of a causal and invertible ARMA(1,1) process
Fig. 5.2 Real GDP growth rates of Switzerland
Fig. 5.3 ACF and PACF of GDP growth rate
Fig. 5.4 Inverted roots of the ARMA(1,3) model
Fig. 5.5 ACF of the residuals from AR(2) and ARMA(1,3) models
Fig. 5.6 Impulse responses of the AR(2) and the ARMA(1,3) model
Fig. 5.7 Forecasts of real GDP growth rates
Fig. 6.1 Examples of spectral densities with Z_t ∼ WN(0, 1): (a) MA(1) process; (b) AR(1) process
Fig. 6.2 Raw periodogram of a white noise time series (X_t ∼ WN(0, 1), T = 200)
Fig. 6.3 Raw periodogram of an AR(2) process (X_t = 0.9 X_{t−1} − 0.7 X_{t−2} + Z_t with Z_t ∼ WN(0, 1), T = 200)
Fig. 6.4 Non-parametric direct estimates of a spectral density
Fig. 6.5 Nonparametric and parametric estimates of a spectral density
Fig. 6.6 Transfer function of the Kuznets filters
Fig. 6.7 Transfer function of the HP-filter
Fig. 6.8 HP-filtered US GDP
Fig. 6.9 Transfer function of the growth rate of investment in the construction sector with and without seasonal adjustment
Fig. 7.1 Distribution of the OLS estimator
Fig. 7.2 Distribution of the t-statistic and standard normal distribution
Fig. 7.3 ACF of a random walk with 100 observations
Fig. 7.4 Three types of structural breaks at T_B: (a) level shift; (b) change in slope; (c) level shift and change in slope
Fig. 7.5 Distribution of OLS estimates and t-statistic for two independent random walks and two independent AR(1) processes
Fig. 7.6 Cointegration of inflation and three-month LIBOR: (a) inflation and three-month LIBOR; (b) residuals from the cointegrating regression
Fig. 8.1 Simulation of two ARCH(1) processes
Fig. 8.2 Parameter region for which a strictly stationary solution to the GARCH(1,1) process exists, assuming ν_t ∼ IID N(0, 1)
Fig. 8.3 Daily return of the SMI (Swiss Market Index)
Fig. 8.4 Normal-quantile plot of SMI returns
Fig. 8.5 Histogram of SMI returns
Fig. 8.6 ACF of the returns and the squared returns of the SMI
Fig. 11.1 Cross-correlations between two independent AR(1) processes
Fig. 11.2 Cross-correlations between consumption and advertisement
Fig. 11.3 Cross-correlations between GDP and consumer sentiment
Fig. 14.1 Forecast comparison of alternative models: (a) log Y_t; (b) log P_t; (c) log M_t; (d) R_t
Fig. 14.2 Forecast of VAR(8) model and 80% confidence intervals
Fig. 15.1 Identification in a two-dimensional structural VAR
Fig. 15.2 Impulse response functions for advertisement and sales
Fig. 15.3 Impulse response functions of the IS-LM model
Fig. 15.4 Impulse response functions of the Blanchard-Quah model
Fig. 16.1 Impulse responses of the present discounted value model
Fig. 16.2 Stochastic simulation of the present discounted value model
Fig. 17.1 State space model
Fig. 17.2 Spectral density of the cyclical component
Fig. 17.3 Estimates of quarterly GDP growth rates
Fig. 17.4 Components of the basic structural model (BSM) for real GDP of Switzerland: (a) logged Swiss GDP (demeaned); (b) local linear trend (LLT); (c) business cycle component; (d) seasonal component
Fig. 18.1 Break date UK
Fig. A.1 Representation of a complex number

List of Tables

Table 1.1 Construction of stochastic processes
Table 3.1 Forecast function for a MA(1) process with θ = −0.9 and σ² = 1
Table 3.2 Properties of the ACF and the PACF
Table 4.1 Common kernel functions
Table 5.1 AIC for alternative ARMA(p,q) models
Table 5.2 BIC for alternative ARMA(p,q) models
Table 7.1 The four most important cases for the unit-root test
Table 7.2 Examples of unit root tests
Table 7.3 Dickey-Fuller regression allowing for structural breaks
Table 7.4 Critical values of the KPSS test
Table 7.5 Rules of thumb in regressions with integrated processes
Table 8.1 AIC criterion for the variance equation in the GARCH(p,q) model
Table 8.2 BIC criterion for the variance equation in the GARCH(p,q) model
Table 8.3 One percent VaR for the next day of the return on the SMI
Table 8.4 One percent VaR for the next 10 days of the return on the SMI
Table 14.1 Information criteria for the VAR models of different orders
Table 14.2 Forecast evaluation of alternative VAR models
Table 15.1 Forecast error variance decomposition (FEVD) in terms of demand, supply, price, wage, and money shocks (percentages)
Table 16.1 Trend specifications in vector error correction models
Table 16.2 Evaluation of the results of Johansen's cointegration test

List of Definitions

3.2 Partial Autocorrelation Function I
3.3 Partial Autocorrelation Function II
6.1 Spectral Density
C.7 m-Dependence

List of Theorems

3.1 Wold Decomposition
4.1 Convergence of Arithmetic Average
4.2 Asymptotic Distribution of Sample Mean
4.4 Asymptotic Distribution of Autocorrelations
5.1 Asymptotic Normality of Yule-Walker Estimator
5.2 Asymptotic Normality of the Least-Squares Estimator
5.3 Asymptotic Distribution of ML Estimator
6.1 Properties of a Spectral Density
6.2 Spectral Representation
6.3 Spectral Density of ARMA Processes
6.4 Autocovariance Function of Filtered Process
7.1 Beveridge-Nelson Decomposition
13.1 Asymptotic Distribution of OLS Estimator
16.1 Beveridge-Nelson Decomposition
C.11 Convergence of Characteristic Functions, Lévy
C.12 Central Limit Theorem
C.13 CLT for m-Dependent Processes
C.14 Basis Approximation Theorem

Footnotes

EVIEWS is a product of IHS Global Inc. MATLAB is matrix-oriented software developed by MathWorks which is ideally suited for econometric and time series applications.


Part I

Univariate Time Series Analysis


© Springer International Publishing Switzerland 2016

Klaus Neusser, Time Series Econometrics, Springer Texts in Business and Economics, DOI 10.1007/978-3-319-32862-1_1

1 Introduction and Basic Theoretical Concepts

Klaus Neusser1

Bern, Switzerland

Time series analysis is an integral part of every empirical investigation which aims at describing and modeling the evolution over time of a variable or a set of variables in a statistically coherent way. The economics of time series analysis is thus very much intermingled with macroeconomics and finance, which are concerned with the construction of dynamic models. In principle, one can approach the subject from two complementary perspectives. The first one focuses on descriptive statistics. It characterizes the empirical properties and regularities using basic statistical concepts like mean, variance, and covariance. These properties can be directly measured and estimated from the data using standard statistical tools. Thus, they summarize the external (observable) or outside characteristics of the time series. The second perspective tries to capture the internal data generating mechanism. This mechanism is usually unknown in economics as the models developed in economic theory are mostly of a qualitative nature and are usually not specific enough to single out a particular mechanism.1 Thus, one has to consider some larger class of models. By far the most widely used is the class of autoregressive moving-average (ARMA) models which rely on linear stochastic difference equations with constant coefficients. Of course, one wants to know how the two perspectives are related, which leads to the important problem of identifying a model from the data.

The observed regularities summarized in the form of descriptive statistics or as a specific model are, of course, of principal interest to economics. They can be used to test particular theories or to uncover new features. One of the main assumptions underlying time series analysis is that the regularities observed in the sample period are not specific to that period, but can be extrapolated into the future. This leads to the issue of forecasting, which is another major application of time series analysis.

Although its roots lie in the natural sciences and in engineering, time series analysis, since the early contributions by Frisch (1933) and Slutzky (1937), has become an indispensable tool in empirical economics. Early applications mostly consisted in making the knowledge and methods acquired there available to economics. However, with the progression of econometrics as a separate scientific field, more and more techniques that are specific to the characteristics of economic data have been developed. I just want to mention the analysis of univariate and multivariate integrated, respectively cointegrated, time series (see Chaps. 7 and 16), the identification of vector autoregressive (VAR) models (see Chap. 15), and the analysis of the volatility of financial market data in Chap. 8. Each of these topics alone would justify the treatment of time series analysis in economics as a separate subfield.

1.1 Some Examples

Before going into more formal analysis, it is useful to examine some prototypical economic time series by plotting them against time. This simple graphical inspection already reveals some of the issues encountered in this book. One of the most popular time series is real gross domestic product. Figure 1.1 plots the data for the U.S. from the first quarter of 1947 to the last quarter of 2011 on a logarithmic scale. Several observations are in order. First, the data at hand cover just a part of the time series. There are data available before 1947 and there will be data available after 2011. As there is neither a natural starting point nor an end point, we think of a time series as extending back into the infinite past and into the infinite future. Second, the observations are treated as the realizations of a random mechanism. This implies that we observe only one realization. If we could turn back time and let history run again, we would obtain a second realization. This is, of course, impossible, at least in the macroeconomic context. Thus, typically, we are faced with just one realization on which to base our analysis. However, sound statistical analysis needs many realizations. This implies that we have to make some assumption on the constancy of the random mechanism over time. This leads to the concept of stationarity, which will be introduced more rigorously in the next section. Third, even a cursory look at the plot reveals that the mean of real GDP is not constant, but is upward trending. As we will see, this feature is typical of many economic time series.2 The investigation into the nature of the trend and the statistical consequences thereof has been the subject of intense research over the last couple of decades. Fourth, a simple way to overcome this problem is to take first differences. As the data have been logged, this amounts to taking growth rates.3 The corresponding plot is given in Fig. 1.2, which shows no trend anymore.

Fig. 1.1 Real gross domestic product (GDP) of the U.S. (chained 2005 dollars; seasonally adjusted annual rate)


Fig. 1.2 Quarterly growth rate of U.S. real gross domestic product (GDP) (chained 2005 dollars)
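The "first differences of the logged series" transformation can be sketched in a few lines. The book's computations use EViews and MATLAB; the sketch below is in Python, and the GDP levels are made-up illustrative numbers, not the actual U.S. series.

```python
import math

# Hypothetical quarterly real GDP levels (illustrative numbers only).
gdp = [100.0, 100.8, 101.5, 101.9, 102.6]

# First differences of the logged series approximate quarterly growth rates:
# log(y_t) - log(y_{t-1}) = log(y_t / y_{t-1}) ~ (y_t - y_{t-1}) / y_{t-1}.
log_gdp = [math.log(y) for y in gdp]
growth = [log_gdp[t] - log_gdp[t - 1] for t in range(1, len(log_gdp))]

print([round(100 * g, 2) for g in growth])  # growth rates in percent
```

The detrending effect seen in Fig. 1.2 comes precisely from this operation: the level series trends upward, while its log differences fluctuate around a roughly constant mean.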

Another feature often encountered in economic time series is seasonality. This issue arises, for example in the case of real GDP, because of a particular regularity within the year: the first quarter being the quarter with the lowest values, the second and fourth quarters those with the highest values, and the third quarter being in between. These movements are due to climatic and holiday-related seasonal variations within the year and are viewed to be of minor economic importance. Moreover, these seasonal variations, because of their size, hide the more important business cycle movements. It is therefore customary to work with time series which have been adjusted for seasonality beforehand. Figure 1.3 shows the unadjusted and the adjusted real gross domestic product for Switzerland. The adjustment has been achieved by taking a moving average. This makes the time series much smoother and evens out the seasonal movements.

Fig. 1.3 Comparison of unadjusted and seasonally adjusted Swiss real gross domestic product (GDP)
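The moving-average adjustment described above can be illustrated as follows. This is a sketch, not the procedure actually used for the Swiss data: the series is artificial, and the particular weights (a standard centered "2×4" filter for quarterly data) are an assumption.

```python
# Centered moving average for quarterly data: the 2x4 filter with weights
# (1/8, 1/4, 1/4, 1/4, 1/8) lets every quarter of the year enter with total
# weight 1/4, which wipes out a fixed seasonal pattern while leaving a
# linear trend unchanged.
def centered_ma(x):
    w = [1 / 8, 1 / 4, 1 / 4, 1 / 4, 1 / 8]
    # The filter is undefined for the first and last two observations.
    return [sum(w[k] * x[t - 2 + k] for k in range(5))
            for t in range(2, len(x) - 2)]

# Artificial quarterly series: linear trend plus a seasonal pattern that
# repeats every four quarters and averages to zero over the year.
seasonal = [-1.0, 1.0, 0.5, -0.5]
series = [10 + 0.1 * t + seasonal[t % 4] for t in range(12)]

smoothed = centered_ma(series)  # the seasonal zigzag is gone
```

On this artificial series the filter recovers the trend exactly; on real data the smoothing is only approximate, which is why Fig. 1.3 still shows mild wiggles in the adjusted series.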

Other typical economic time series are the interest rates plotted in Fig. 1.4. Over the period considered these two variables also seem to trend. However, the nature of this trend must be different because of the theoretically binding zero lower bound. Although the relative level of the two series changes over time (at the beginning of the sample, short-term rates are higher than long-term ones), they move more or less together. This comovement is particularly true with respect to the medium and long term.

Fig. 1.4 Short- and long-term Swiss interest rates (three-month LIBOR and 10-year government bond)

Other prominent time series are stock market indices. In Fig. 1.5 the Swiss Market Index (SMI) is plotted as an example. The first panel displays the raw data on a logarithmic scale. One can clearly discern the different crises: the internet bubble in 2001 and the most recent financial market crisis in 2008. More interesting than the index itself is the return on the index, plotted in the second panel. Whereas the mean seems to stay relatively constant over time, the volatility does not: in periods of crisis volatility is much higher. This clustering of volatility is a typical feature of financial market data and will be analyzed in detail in Chap. 8.

Fig. 1.5 Swiss Market Index (SMI): (a) index; (b) daily return

Finally, Fig. 1.6 plots the unemployment rate for Switzerland. This is another widely discussed time series. However, the Swiss data have a particular feature in that the behavior of the series changes over time. Whereas unemployment was practically nonexistent in Switzerland up to the end of the 1990s, several policy changes (introduction of unemployment insurance, liberalization of immigration laws) led to drastic shifts. Although such dramatic structural breaks are rare, one always has to be aware of such a possibility. Reasons for breaks are policy changes and simply structural changes in the economy at large.4

Fig. 1.6 Unemployment rate in Switzerland

1.2 Formal Definitions

The previous section attempted to give an intuitive approach to the subject. The analysis to follow necessitates, however, more precise definitions and concepts. At the heart of the exposition stands the concept of a stochastic process. For this purpose we view the observation at some time t as the realization of a random variable X_t. In time series analysis we are, however, in general not interested in a particular point in time, but rather in a whole sequence. This leads to the following definition.

Definition 1.1.

A stochastic process {X_t} is a family of random variables indexed by an ordered index set and defined on some given probability space.

The index set is typically identified with time, and several different index sets can be encountered in the literature.

Remark 1.1.

Given that the index set is identified with time and thus has a direction, a characteristic of time series analysis is the distinction between past, present, and future.

For technical reasons which will become clear later, we will work with ℤ, the set of integers. This choice is consistent with the use of time indices in economics as there is, usually, no natural starting point nor a foreseeable endpoint. Although models in continuous time are well established in the theoretical finance literature, we will disregard them because observations are always of a discrete nature and because models in continuous time would need substantially higher mathematical requirements.

Remark 1.2.

The random variables {X_t} take values in a so-called state space. In the first part of this treatise, we take as the state space the set of real numbers ℝ and thus consider only univariate time series. In Part II we extend the state space to ℝⁿ and study multivariate time series. Theoretically, it is possible to consider other state spaces (for example, {0, 1}, the integers, or the complex numbers), but this will not be pursued here.

Definition 1.2.

The function which assigns to each point in time t the realization x_t of the random variable X_t is called a realization or a trajectory of the stochastic process. We denote such a realization by {x_t}.

We denominate by a time series either the realization or trajectory (observations or data) or the underlying stochastic process. Usually, there is no room for misunderstandings. A trajectory therefore represents one observation of the stochastic process. Whereas in standard statistics a sample consists of several, typically independent, draws from the same distribution, a sample in time series analysis is just one trajectory. Thus, we are confronted with a situation where there is in principle just one observation. We cannot turn back the clock and get additional trajectories. The situation is even worse as we typically observe only the realizations in a particular time window. For example, we might have data on US GDP from the first quarter of 1960 up to the last quarter of 2011. But it is clear that the United States existed before 1960 and will continue to exist after 2011, so that there are in principle observations before 1960 and after 2011. In order to make a meaningful statistical analysis, it is therefore necessary to assume that the observed part of the trajectory is typical for the time series as a whole. This idea is related to the concept of stationarity which we will introduce more formally below. In addition, we want to require that the observations cover in principle all possible events. This leads to the concept of ergodicity. We avoid a formal definition of ergodicity as this would require a sizeable amount of theoretical probabilistic background material which goes beyond the scope of this treatise.5

An important goal of time series analysis is to build a model given the realization (data) at hand. This amounts to specifying the joint distribution of some set of X_t's with corresponding realization {x_t}.

Definition 1.3 (Model).

A time series model or a model for the observations (data) {x_t} is a specification of the joint distribution of {X_t} for which {x_t} is a realization.

The Kolmogorov existence theorem ensures that the specification of all finite-dimensional distributions is sufficient to characterize the whole stochastic process (see Billingsley (1986), Brockwell and Davis (1991), or Kallenberg (2002)).

Most of the time it is too involved to specify the complete distribution, so that one relies on only the first two moments: the means E(X_t), the variances V(X_t), and the covariances cov(X_t, X_s). If the random variables are jointly normally distributed, then the specification of the first two moments is sufficient to characterize the whole distribution.

Examples of Stochastic Processes

{X_t} is a sequence of independently distributed random variables with values in {−1, 1}, both values occurring with probability one half. This process can be interpreted as a repeated coin tossing game: if head occurs one gets a Euro, whereas if tail occurs one has to pay a Euro.

The simple random walk {S_t} is defined by

S_t = X_1 + X_2 + … + X_t,  with S_0 = 0,

where {X_t} is the process from the example just above. In this case S_t is the proceeds after t rounds of coin tossing. More generally, {X_t} could be any sequence of identically and independently distributed random variables. Figure 1.7 shows a realization of {X_t} for t = 1, 2, …, 100 and the corresponding random walk {S_t}. For more on random walks see Sect. 1.4.4 and, in particular, Chap. 7.

Fig. 1.7 Realization of a random walk
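A realization like the one in Fig. 1.7 can be generated in a few lines. The sketch below is in Python rather than the MATLAB/EViews used in the book, and the seed and sample size T = 100 are choices made only to reproduce the flavor of the figure.

```python
import random

random.seed(0)  # fix the seed so the realization is reproducible

T = 100
# Coin-tossing process: X_t takes the values +1 ("head") and -1 ("tail"),
# each with probability one half.
X = [random.choice([-1, 1]) for _ in range(T)]

# Simple random walk: S_0 = 0 and S_t = S_{t-1} + X_t = X_1 + ... + X_t,
# i.e. the proceeds after t rounds of the coin tossing game.
S = [0]
for x in X:
    S.append(S[-1] + x)
```

Rerunning with a different seed gives a different trajectory, which is exactly the point made above: each run is one realization of the same stochastic process.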

The simple branching process is defined through the recursion

X_{t+1} = Z_{t,1} + Z_{t,2} + … + Z_{t,X_t},  t = 0, 1, 2, …

In this example X_t represents the size of a population where each member lives just one period and reproduces itself with some probability. Z_{t,j} thereby denotes the number of offspring of the j-th member of the population in period t. In the simplest case {Z_{t,j}} is nonnegative integer valued and identically and independently distributed. A realization with X_0 = 100 and with probabilities of one third each that a member has no, one, or two offspring is shown as an example in Fig. 1.8.

Fig. 1.8 Realization of a branching process
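The branching process can be simulated directly from its recursion. The offspring probabilities of one third each and the starting value X_0 = 100 follow the text; the seed and the horizon of 50 generations are arbitrary choices for this Python sketch.

```python
import random

random.seed(1)

def next_generation(x_t):
    # Each of the x_t members independently leaves 0, 1, or 2 offspring,
    # each with probability one third, and then dies.
    return sum(random.choice([0, 1, 2]) for _ in range(x_t))

X = [100]            # X_0 = 100, as in the text's example
for _ in range(50):  # simulate 50 generations (arbitrary horizon)
    X.append(next_generation(X[-1]))
```

Note that once the population hits zero it stays there: the recursion sums over zero members, so extinction is an absorbing state.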

1.3 Stationarity

An important insight in time series analysis is that the realizations in different periods are related to each other. The value of GDP in some year obviously depends on the values from previous years. This temporal dependence can be represented either by an explicit model or, in a descriptive way, by covariances, respectively correlations. Because the realization of X_t in some year t may depend, in principle, on all past realizations, we do not have to specify just a finite number of covariances, but infinitely many covariances. This leads to the concept of the covariance function. The covariance function is not only a tool for summarizing the statistical properties of a time series, but is also instrumental in the derivation of forecasts (Chap. 3), in the estimation of ARMA models, the most important class of models (Chap. 5), and in the Wold representation (Sect. 3.2 in Chap. 3). It is therefore of utmost importance to get a thorough understanding of the meaning and properties of the covariance function.

Definition 1.4 (Autocovariance Function).

Let {X_t} be a stochastic process with V(X_t) < ∞ for all t. Then the function which assigns to any two time periods t and s the covariance between X_t and X_s is called the autocovariance function of {X_t}. The autocovariance function is denoted by γ_X(t, s). Formally, this function is given by

γ_X(t, s) = cov(X_t, X_s) = E[(X_t − E X_t)(X_s − E X_s)].

Remark 1.5.

Remark 1.6.

If {X_t} is stationary, by setting h = t − s the autocovariance function becomes

γ_X(h) = cov(X_{t+h}, X_t).

Thus the covariance γ_X(t, s) does not depend on the points in time t and s themselves, but only on the number of periods t and s are apart from each other, i.e. on t − s. For stationary processes it is therefore possible to view the autocovariance function as a function of just one argument. We denote the autocovariance function in this case by γ_X(h), h ∈ ℤ. Because the covariance is symmetric in t and s, i.e. γ_X(t, s) = γ_X(s, t), we have

γ_X(h) = γ_X(−h).

It is thus sufficient to look at the autocovariance function for positive integers only, i.e. for h = 0, 1, 2, … In this case we refer to h as the order of the autocovariance. For h = 0, we get the unconditional variance of X_t, i.e. γ_X(0) = V(X_t).
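These properties can be checked numerically with the usual sample analogue of the autocovariance function. Formal estimation is the subject of Chap. 4; the divisor T and the demeaning by the sample mean below are the standard conventions, and the data are an artificial toy series.

```python
def sample_autocov(x, h):
    """Sample autocovariance of order h >= 0:
    gamma_hat(h) = (1/T) * sum over t of (x_{t+h} - xbar) * (x_t - xbar)."""
    T = len(x)
    xbar = sum(x) / T
    return sum((x[t + h] - xbar) * (x[t] - xbar) for t in range(T - h)) / T

x = [1.0, -0.5, 2.0, 0.3, -1.2, 0.8, 1.5, -0.7]  # short artificial realization

gamma0 = sample_autocov(x, 0)  # order 0: the sample (unconditional) variance
gamma1 = sample_autocov(x, 1)  # order 1
```

Using the divisor T (rather than T − h) guarantees |γ̂(h)| ≤ γ̂(0) for every order h, mirroring the corresponding property of the population autocovariance function.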
