Palgrave Handbook of Econometrics
Volume 2: Applied Econometrics
Edited By
Terence C Mills
and
Kerry Patterson
Individual chapters © Contributors 2009
All rights reserved. No reproduction, copy or transmission of this
publication may be made without written permission.
No paragraph of this publication may be reproduced, copied or transmitted save with written permission or in accordance with the provisions of the Copyright, Designs and Patents Act 1988, or under the terms of any licence permitting limited copying issued by the Copyright Licensing Agency,
Saffron House, 6–10 Kirby Street, London EC1N 8TS.
Any person who does any unauthorised act in relation to this publication may be liable to criminal prosecution and civil claims for damages.
The author has asserted his right to be identified
as the author of this work in accordance with the Copyright, Designs
and Patents Act 1988.
First published in 2009 by
PALGRAVE MACMILLAN
PALGRAVE MACMILLAN in the UK is an imprint of Macmillan Publishers Limited, registered in England, company number 785998, of Houndmills, Basingstoke, Hampshire RG21 6XS.
PALGRAVE MACMILLAN in the US is a division of St Martin’s Press LLC,
175 Fifth Avenue, New York, NY 10010.
PALGRAVE MACMILLAN is the global academic imprint of the above companies and has companies and representatives throughout the world.
Palgrave® and Macmillan® are registered trademarks in the United States, the United Kingdom, Europe and other countries.
ISBN: 978–1–4039–1799–7 hardback
This book is printed on paper suitable for recycling and made from fully managed and sustained forest sources. Logging, pulping and manufacturing processes are expected to conform to the environmental regulations of the country of origin.
A catalogue record for this book is available from the British Library.
A catalog record for this book is available from the Library of Congress.
18 17 16 15 14 13 12 11 10 09
Printed and bound in Great Britain by
CPI Antony Rowe, Chippenham and Eastbourne
Part I The Methodology and Philosophy of Applied Econometrics
1 The Methodology of Empirical Econometric Modeling: Applied
David F Hendry, Nuffield College, Oxford University
Fabio Canova, Universitat Pompeu Fabra
3 Introductory Remarks on Metastatistics for the Practically Minded
John DiNardo, University of Michigan
Part II Forecasting
Michael P Clements, Warwick University, and David I Harvey,
School of Economics, University of Nottingham
Stephen G Hall, University of Leicester, and James Mitchell,
National Institute of Economic and Social Research
Part III Time Series Applications
D.S.G Pollock, University of Leicester
7 Economic Cycles: Asymmetries, Persistence, and Synchronization
Joe Cardinale, Air Products and Chemicals, Inc., and Larry W Taylor,
College of Business and Economics, Lehigh University
8 The Long Swings Puzzle: What the Data Tell When Allowed to Speak Freely
Katarina Juselius, University of Copenhagen
9 Structural Time Series Models for Business Cycle Analysis
Tommaso Proietti, University of Rome ‘Tor Vergata’
10 Fractional Integration and Cointegration: An Overview and an Empirical Application
Luis A Gil-Alana and Javier Hualde, Universidad de Navarra
Part IV Cross-section and Panel Data Applications
William Greene, Stern School of Business, New York University
12 Panel Data Methods and Applications to Health Economics
Andrew M Jones, University of York
13 Panel Methods to Test for Unit Roots and Cointegration
Anindya Banerjee, University of Birmingham, and Martin Wagner,
Institute for Advanced Studies, Vienna
Part V Microeconometrics
14 Microeconometrics: Current Methods and Some Recent Developments
A Colin Cameron, University of California, Davis
15 Computational Considerations in Empirical Microeconometrics:
David T Jacho-Chávez and Pravin K Trivedi, Indiana University
Part VI Applications of Econometrics to Economic Policy
16 The Econometrics of Monetary Policy: An Overview
Carlo Favero, IGIER-Bocconi University
Gunnar Bårdsen, Norwegian University of Science and Technology,
and Ragnar Nymoen, University of Oslo
18 Monetary Policy, Beliefs, Unemployment and Inflation: Evidence from the UK
S.G.B Henry, National Institute of Economic and Social Research
Part VII Applications to Financial Econometrics
19 Estimation of Continuous-Time Stochastic Volatility Models
George Dotsis, Essex Business School, University of Essex,
Raphael N Markellos, Athens University of Economics and Business,
and Terence C Mills, Loughborough University
J Carlos Escanciano, Indiana University, and Ignacio N Lobato,
Instituto Tecnológico Autónomo de México
21 Autoregressive Conditional Duration Models
Ruey S Tsay, Booth Business School, University of Chicago
Efthymios G Pavlidis, Ivan Paya, and David A Peel,
Lancaster University Management School
Part VIII Growth Development Econometrics
Steven N Durlauf, University of Wisconsin-Madison,
Paul A Johnson, Vassar College, New York State, and
Jonathan R.W Temple, Bristol University
Steven N Durlauf, University of Wisconsin-Madison,
Paul A Johnson, Vassar College, New York State, and
Jonathan R.W Temple, Bristol University
Thorsten Beck, European Banking Center, Tilburg University, and CEPR
Part IX Spatial Econometrics
Luc Anselin, School of Geographical Sciences and Urban Planning,
and Nancy Lozano-Gracia, GeoDa Center for Geospatial
Analysis and Computation, Arizona State University
Sergio J Rey, Arizona State University, and Julie Le Gallo,
Université de Franche-Comté
Part X Applied Econometrics and Computing
B.D McCullough, Drexel University
29 Trends in Applied Econometrics Software Development 1985–2008:
An Analysis of Journal of Applied Econometrics Research Articles,
Marius Ooms, VU University Amsterdam
Notes on Contributors
Luc Anselin is Foundation Professor of Geographical Sciences and Director of the School of Geographical Sciences and Urban Planning at Arizona State University, USA.
Anindya Banerjee is Professor of Econometrics at the University of Birmingham, UK.
Gunnar Bårdsen is Professor of Economics at the Norwegian University of Science
and Technology, Norway
Thorsten Beck is Professor of Economics and Chair at the European Banking
Center, Tilburg University and Research Fellow, CEPR
Colin Cameron is Professor of Economics at the University of California,
Davis, USA
Fabio Canova is ICREA Research Professor in Social Science at Universitat Pompeu
Fabra, Barcelona, Spain
Joe Cardinale is a Manager, Economics at Air Products and Chemicals, Inc., USA.
Michael P Clements is Professor of Economics at Warwick University, UK.
John DiNardo is Professor of Economics and Public Policy at the University of Michigan, Ann Arbor, USA.
George Dotsis is Lecturer in Finance at the Essex Business School, University of Essex, UK.
Stephen G Hall is Professor of Economics at University of Leicester, UK.
David I Harvey is Reader in Econometrics at the School of Economics, University
of Nottingham, UK
David F Hendry is Professor of Economics and Fellow, Nuffield College, Oxford
University, UK
Brian Henry is Visiting Fellow at the National Institute of Economic and Social
Research, NIESR, UK
Javier Hualde is Ramon y Cajal Research Fellow in Economics at the Public
University of Navarra, Spain
David Jacho-Chávez is Assistant Professor of Economics at Indiana University,
Bloomington, USA
Paul A Johnson is Professor of Economics at Vassar College, New York State, USA.
Andrew M Jones is Professor of Economics at the University of York, UK.
Katarina Juselius is Professor of Empirical Time Series Econometrics at the
University of Copenhagen, Denmark
Ignacio N Lobato is Professor of Econometrics at the Instituto Tecnológico
Autónomo de México, Mexico
Nancy Lozano-Gracia is Postdoctoral Research Associate in the GeoDa Center for
Geospatial Analysis and Computation at Arizona State University, USA
Raphael N Markellos is Assistant Professor of Quantitative Finance at the Athens
University of Economics and Business (AUEB), Greece
Bruce D McCullough is Professor of Decision Sciences and Economics at Drexel
University, Philadelphia, USA
Terence C Mills is Professor of Applied Statistics and Econometrics at
Loughborough University, UK
James Mitchell is Research Fellow at the National Institute of Economic and Social
Research, UK
Ragnar Nymoen is Professor of Economics at University of Oslo, Norway.
Marius Ooms is Associate Professor of Econometrics at the VU University,
Amsterdam, The Netherlands
Kerry Patterson is Professor of Econometrics at the University of Reading, UK.
Efthymios G Pavlidis is Lecturer in Economics at the Lancaster University
Management School, Lancaster University, UK
Ivan Paya is Senior Lecturer in Economics at the Lancaster University Management
School, Lancaster University, UK
David A Peel is Professor in Economics at the Lancaster University Management
School, Lancaster University, UK
D Stephen G Pollock is Professor of Economics at the University of Leicester, UK.
Tommaso Proietti is Professor of Economic Statistics at the University of Rome
‘Tor Vergata’, Italy
Sergio Rey is Professor of Geographical Sciences at Arizona State University, USA.
Larry W Taylor is Professor of Economics at the College of Business and
Economics, Lehigh University, Pennsylvania, USA
Jonathan R.W Temple is Professor of Economics at Bristol University, UK.
Pravin Trivedi is Professor of Economics at Indiana University, Bloomington, USA.
Ruey S Tsay is Professor of Econometrics and Statistics at the University of Chicago
Booth School of Business, USA
Martin Wagner is Senior Economist at the Institute for Advanced Studies in
Vienna, Austria
Editors’ Introduction
Terence C Mills and Kerry Patterson
The Palgrave Handbook of Econometrics was conceived to provide an understanding of major developments in econometrics, both in theory and in application. Over the last twenty-five years or so, econometrics has grown in a way that few could have contemplated, and it became clear to us, as to others, that no single person could have command either of the range of technical knowledge that underpins theoretical econometric developments or the extent of the application of econometrics. In short, econometrics is not, as it used to be considered, a set of techniques that is applied to a previously well-defined problem in economics; it is not a matter of finding the “best” estimator from a field of candidates, applying that estimator and reporting the results. The development of economics is now inextricably entwined with the development of econometrics.
The first Nobel Prize in Economics was awarded to Ragnar Frisch and Jan Tinbergen, both of whom made significant contributions to what we now recognize as applied econometrics. More recently, Nobel Prizes in Economics have been awarded to Clive Granger, Robert Engle, James Heckman and Daniel McFadden, who have all made major contributions to applied econometrics. It is thus clear that the discipline has recognized the influential role of econometrics, both theoretical and applied, in advancing economic knowledge.
The aim of this volume is to make major developments in applied econometrics accessible to those outside their particular field of specialization. The response to Volume 1 was universally encouraging and it has become clear that we were fortunate to be able to provide a source of reference for others for many years to come. We hope that this high standard is continued and achieved here. Typically, applied econometrics, unlike theoretical econometrics, has always been rather poorly served for textbooks, making it difficult for both undergraduate and postgraduate students to get a real “feel” for how econometrics is actually done. To some degree, the econometric textbook market has responded, so that now the leading textbooks include many examples; even so, these examples typically are of an illustrative nature, focusing on simple points, simply exposited, rather than on the complexity that is revealed in practice. Thus our hope is that this volume will provide a genuine entry into the detailed considerations that have to be made when combining economics and econometrics in order to carry out serious empirical research.
As in the case of Volume 1, the chapters here have been specially commissioned from acknowledged experts in their fields; further, each of the chapters has been reviewed by the editors, one or more of the associate editors and a referee. Thus, the process is akin to submission to a journal; however, whilst ensuring the highest standards in the evaluation process, the chapters have been conceived of as part of a whole rather than as a set of unrelated contributions. It has not, however, been our intention to provide just a series of surveys or overviews of some areas of applied econometrics, although the survey element is directly or indirectly served in part here. By its very nature, this volume is about econometrics as it is applied and, to succeed in its aim, the contributions, conceived as a whole, have to meet this goal.
We have organized the chapters of this volume of the Handbook into ten parts. The parts are certainly not watertight, but serve as a useful initial organization of the central themes. Part I contains three chapters under the general heading of “The Methodology and Philosophy of Applied Econometrics.” The lead chapter is by David Hendry, who has been making path-breaking contributions in theoretical and applied econometrics for some forty years or so. It is difficult to conceive how econometrics would have developed without David’s many contributions. This chapter first places the role of applied econometrics in an historical context and then develops a theory of applied econometrics. As might be expected, the key issues are confronted head-on.
In introducing the first volume we noted that the “growth in econometrics is to be welcomed, for it indicates the vitality and importance of the subject. Indeed, this growth and, arguably, the dominance over the last ten or twenty years of econometric developments in taking economics forward, is a notable change from the situation faced by the subject some twenty-five years or so ago.” Yet in Chapter 1, Hendry notes that, next to data measurement, collection and preparation, on the one hand, and teaching, on the other, “Applied Econometrics” does not have a high credibility in the profession. Indeed, whilst courses in theoretical econometrics or econometric techniques are de rigueur for a good undergraduate or Masters degree, courses in applied econometrics have no such general status.
The intricacies, possibly even alchemy (Hendry, 1980), surrounding the mix of techniques and data seem to defy systematization; perhaps they should be kept out of the gaze of querulous students, who may – indeed should – be satisfied with illustrative examples! Often to an undergraduate or Masters student undertaking a project, applied econometrics is the application of econometrics to data, no more, no less, with some relief if the results are at all plausible. Yet, in contrast, leading journals, for example, the Journal of Econometrics, the Journal of Applied Econometrics and the Journal of Business and Economic Statistics, and leading topic journals, such as the Journal of Monetary Economics, all publish applied econometric articles having substance and longevity in their impact and which serve to change the direction of the development of econometric theory (for a famous example, see Nelson and Plosser, 1982). To some, applying econometrics seems unsystematic and so empirical results are open to question; however, as Hendry shows, it is possible to formalize a theory of applied econometrics which provides a coherent basis for empirical work. Chapter 1 is a masterful and accessible synthesis and extension of Hendry’s previous ideas and is likely to become compulsory reading for courses in econometrics, both theory and applied; moreover, it is completed by two applications using the Autometrics software (Doornik, 2007). The first application extends the work of Magnus and Morgan (1999) on US food expenditure, which was itself an update of a key study by Tobin (1950) estimating a demand function for food. This application shows the Autometrics algorithm at work in a simple context. The second application extends the context to a multiple equation setting relating industrial output, the number of bankruptcies and patents, and real equity prices. These examples illustrate the previously outlined theory of applied econometrics combined with the power of the Autometrics software.
In Chapter 2, Fabio Canova addresses the question of how much structure there should be in empirical models. This has long been a key issue in econometrics, and some old questions, particularly those of identification and the meaning of structure, resurface here in a modern context. The last twenty years or so have seen two key developments in macroeconometrics. One has been the development of dynamic stochastic general equilibrium (DSGE) models. Initially, such models were calibrated rather than estimated, with the emphasis on “strong” theory in their specification; however, as Canova documents, more recently likelihood-based estimation has become the dominant practice. The other key development has been that of extending the (simple) vector autoregression (VAR) to the structural VAR (SVAR) model. Although both approaches involve some structure, DSGE models, under the presumption that the model is correct, rely more on an underlying theory than do SVARs. So which should be used to analyze a particular set of problems? As Canova notes: “When addressing an empirical problem with a finite amount of data, one has to take a stand on how much theory one wants to use to structure the available data prior to estimation.” Canova takes the reader through the advantages and shortcomings of these methodologies; he provides guidance on what to do, and what not to do, and outlines a methodology that combines elements of both approaches.
In Chapter 3, John DiNardo addresses some philosophical issues that are at the heart of statistics and econometrics, but which rarely surface in econometric textbooks. As econometricians, we are, for example, used to working within a probabilistic framework, but we rarely consider issues related to what probability actually is. To some degree, we have been prepared to accept the axiomatic or measure theoretic approach to probability, due to Kolmogorov, and this has provided us with a consistent framework that most are happy to work within. Nevertheless, there is one well-known exception to this unanimity: when it comes to the assignment and interpretation of probability measures and, in particular, the interpretation of some key conditional probabilities; this is whether one adopts a Bayesian or non-Bayesian perspective. In part, the debate that DiNardo discusses relates to the role of the Bayesian approach, but it is more than this; it concerns metastatistics and philosophy, because, in a sense, it relates to a discussion of the theory of theories. This chapter is deliberately thought-provoking and certainly controversial – two characteristics that we wish to encourage in a Handbook that aims to be more than just an overview. For balance, the reader can consult Volume 1 of the Handbook, which contains two chapters devoted to the Bayesian analysis of econometric models (see Poirier and Tobias, 2006, and Strachan et al., 2006). The reader is likely to find familiar concepts here, such as probability and testing, but only as part of a development that takes them into potentially unfamiliar areas. DiNardo’s discussion of these issues is wide-ranging, with illustrations taken from gambling and practical examples taken as much from science, especially medicine, as economics. One example from the latter is the much-researched question of the causal effect of union status on wages: put simply, do unions raise wages and, if so, by how much? This example serves as an effective setting in which to raise issues and to show that differences in approach can lead to differences in results.

For some, the proof of the pudding in econometrics is the ability to forecast accurately and, to address some key issues concerning this aspect of econometrics, Part II contains two chapters on forecasting. The first, Chapter 4, by Michael Clements and David Harvey, recognizes that quite often several forecasts are available and, rather than considering a selection strategy that removes all but the best
on some criterion, it is often more fruitful to consider different ways of combining forecasts, as suggested in the seminal paper by Bates and Granger (1969). In an intuitive sense, one forecast may be better than another, but there could still be some information in the less accurate forecast that is not contained in the more accurate forecast. This is a principle that is finding wider application; for example, in some circumstances, as in unit root testing, there is more than one test available and, indeed, there may be one uniformly powerful test, yet there is still potential merit in combining tests.

In the forecasting context, Clements and Harvey argue that the focus for multiple forecasts should not be on testing the null of equal accuracy, but on testing for encompassing. Thus it is not a question of choosing forecast A over forecast B, but of whether the combination of forecasts A and B is better than either individual forecast. Of course, this may be of little comfort from a structuralist point of view if, for example, the two forecasts come from different underlying models; but it is preferable when the loss function rewards good fit in some sense. Bates and Granger (1969) suggested a simple linear combination of two unbiased forecasts, with weights depending on the relative accuracy of the individual forecasts, and derived the classic result that, even if the forecasts are equally accurate in a mean squared error loss sense, then there will still be a gain in using the linear combination unless the forecasts are perfectly correlated, at least theoretically. Clements and Harvey develop from this base model, covering such issues as biased forecasts, nonlinear combinations, and density or distribution forecasts. The concept of forecast encompassing, which is not unique in practice, is then considered in detail, including complications arising from integrated variables, non-normal errors, serially correlated forecast errors, ARCH errors, the uncertainty implied by model estimation, and the difficulty of achieving tests with the correct actual size. A number of recent developments are examined, including the concept of conditional forecast evaluation (Giacomini and White, 2006), evaluating quantile forecasts, and relaxing the forecast loss function away from the traditional symmetric squared error. In short, this chapter provides a clear, critical and accessible evaluation of a rapidly developing area of the econometrics literature.
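To make the classic Bates–Granger result concrete, the following sketch (our illustration, not taken from the chapter) computes the variance-minimizing weight w = (σ2² − σ12)/(σ1² + σ2² − 2σ12) from the errors of two simulated, equally accurate but imperfectly correlated forecasts; the combined error variance falls below either individual one. All names and numbers are assumptions made for the example.

```python
import numpy as np

def bates_granger_weight(e1, e2):
    """Variance-minimizing weight on forecast 1, computed from the errors of
    two unbiased forecasts: w = (s2^2 - s12) / (s1^2 + s2^2 - 2*s12)."""
    s11, s22 = np.var(e1), np.var(e2)
    s12 = np.cov(e1, e2)[0, 1]
    return (s22 - s12) / (s11 + s22 - 2.0 * s12)

rng = np.random.default_rng(42)
# Simulated errors of two equally accurate, imperfectly correlated forecasts
common = rng.normal(size=500)
e1 = common + rng.normal(size=500)
e2 = common + rng.normal(size=500)
w = bates_granger_weight(e1, e2)
e_comb = w * e1 + (1 - w) * e2
print(f"w = {w:.3f}")
print(f"MSE 1: {np.mean(e1**2):.3f}, MSE 2: {np.mean(e2**2):.3f}, "
      f"combined: {np.mean(e_comb**2):.3f}")  # combined MSE is smaller
```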
Chapter 5 is by Stephen Hall and James Mitchell, who focus on density forecasting. There has been a great deal of policy interest in forecasting key macroeconomic variables such as output growth and inflation, some of which has been institutionally enshrined by granting central banks independence in inflation targeting. In turn, there has been a movement away from simply reporting point forecasts of inflation and GDP growth in favor of a fan chart representation of the distribution of forecasts. A density forecast gives much more information than a simple point forecast, which is included as just one realization on the outcome axis. As a corollary, forecast evaluation should also include techniques that evaluate the accuracy, in some well-defined sense, of the density forecast. However, given that generally we will only be able to observe one outcome (or event) per period, some thought needs to be given to how the distributional aspect of the forecast is evaluated. Hall and Mitchell discuss a number of possibilities and also consider methods of evaluating competing density forecasts. A further aspect of density forecasting is the ability of the underlying model to generate time variation in the forecast densities. If the underlying model is a VAR, or can be approximated by a VAR, then, subject to some qualifications, the only aspect of the forecast density which is able to exhibit time variation is the mean; consequently, models have been developed that allow more general time variation in the density through, for example, ARCH and GARCH errors and time-varying parameters. This chapter also links in with the previous chapter by considering combinations of density forecasts. There are two central possibilities: the linear opinion pool is a weighted linear combination of the component densities, whereas the logarithmic opinion pool is a multiplicative combination. Hall and Mitchell consider the problem of determining the weights in such combinations and suggest that predictive accuracy improves when the weights reflect shifts in volatility, a characteristic of note for the last decade or so in a number of economies.
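As a minimal illustration of the two pooling schemes (ours; the component densities and the weight w are arbitrary assumptions, and choosing w is precisely the problem the chapter studies), the following sketch combines two normal density forecasts on a grid:

```python
import numpy as np
from scipy.stats import norm

# Two component density forecasts for the same outcome (illustrative numbers)
y = np.linspace(-8, 8, 1601)
dy = y[1] - y[0]
p1 = norm.pdf(y, loc=0.5, scale=1.0)
p2 = norm.pdf(y, loc=-0.3, scale=2.0)
w = 0.6  # assumed weight on forecaster 1

# Linear opinion pool: a mixture density (can be multimodal, fatter-tailed)
linear_pool = w * p1 + (1 - w) * p2

# Logarithmic opinion pool: weighted geometric mean, renormalized to a density
log_pool = p1**w * p2**(1 - w)
log_pool /= (log_pool * dy).sum()

print(round(float((linear_pool * dy).sum()), 4))  # ~1.0: already a density
print(round(float((log_pool * dy).sum()), 4))     # 1.0 by construction
```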
Part III contains four chapters under the general heading of “Time Series Applications.” A key area in which the concept of a time series is relevant is in characterizing and determining trends and cycles. Chapter 6, by Stephen Pollock, is a tour de force on modeling trends and cycles, and on the possibilities and pitfalls inherent in the different approaches. In the simplest of models, cyclical fluctuations are purely sinusoidal and the trend is exponential; although simple, this is a good base from which to understand the nature of developments that relax these specifications. Such developments include the view that economic time series evolve through the accumulation of stochastic shocks, as in an integrated Wiener process. The special and familiar cases of the Beveridge–Nelson decomposition, the Hodrick–Prescott filter, the Butterworth filter and the unifying place of Wiener–Kolmogorov filtering are all covered with admirable clarity. Other considerations include the complications caused by the limited data that is often available in economic applications, contrary to the convenient assumptions of theory. In an appealing turn of phrase, Pollock refers to obtaining a decomposition of components based on the periodogram “where components often reside within strictly limited frequency bands which are separated by dead spaces where the spectral ordinates are virtually zeros.” The existence of these “spectral dead spaces” is key to a practical decomposition of an economic time series, however achieved. In practice, trend fitting requires judgment and a clear sense of what it is that the trend is capturing. Other critical issues covered in this chapter include the importance of structural breaks, a topic that has been influential elsewhere (for example, in questioning the results of unit root testing: Perron, 1989); and to aid the reader, practical examples are included throughout the exposition.
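Since the Hodrick–Prescott filter recurs in this and later chapters, a minimal sketch may help fix ideas: the trend minimizes Σ(y_t − τ_t)² + λΣ(Δ²τ_t)², with closed-form solution τ = (I + λD′D)⁻¹y, where D is the second-difference matrix. The code below is our illustration on simulated data, not Pollock’s implementation.

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott trend: tau = (I + lam * D'D)^{-1} y, with D the
    (T-2) x T second-difference matrix; lam = 1600 is the usual
    quarterly-data setting."""
    T = len(y)
    D = np.zeros((T - 2, T))
    for t in range(T - 2):
        D[t, t:t + 3] = [1.0, -2.0, 1.0]
    tau = np.linalg.solve(np.eye(T) + lam * (D.T @ D), y)
    return tau, y - tau  # trend and cycle

# Illustrative series: slow trend plus a sinusoidal cycle plus noise
rng = np.random.default_rng(0)
t = np.arange(200)
y = 0.01 * t + 0.5 * np.sin(2 * np.pi * t / 32) + 0.1 * rng.normal(size=200)
trend, cycle = hp_filter(y)
print(round(float(np.std(cycle)), 3))
```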
Chapter 7, by Joe Cardinale and Larry Taylor, continues the time series theme of analyzing economic cycles whilst focusing on asymmetries, persistence and synchronization. This is a particularly timely and somewhat prophetic chapter given that we are currently experiencing what is perhaps the deepest recession in recent economic history. How can we analyze the critical question “When will it end?” This chapter provides the analytical and econometric framework to answer such a question. The central point is that cycles are much more interesting than just marking their peaks and troughs would suggest. Whilst “marking time” is important, it is just the first part of the analysis, and should itself be subjected to methods for distinguishing phases (for example, expansions and contractions of the output cycle). Once phases have been distinguished, their duration and characteristics become of interest; for example, do long expansions have a greater chance of ending than short expansions? Critical to the analysis is the hazard function: “the conditional probability that a phase will terminate in period t, given that it has lasted t or more periods.” Cardinale and Taylor consider different models and methods of estimating the hazard function and testing hypotheses related to particular versions of it. They also consider tests of duration dependence, the amplitudes of cycles, and the synchronization of cycles for different but related variables; for example, output and unemployment. Their theoretical analysis is complemented with a detailed consideration of US unemployment.
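The definition of the hazard function suggests a simple life-table estimator: divide the number of phases ending at duration t by the number still “at risk” at t. The sketch below is ours, with made-up durations; the chapter’s own estimators are model-based rather than this raw empirical version.

```python
import numpy as np

def empirical_hazard(durations, max_t=None):
    """Discrete hazard h(t) = P(phase ends in period t | lasted t or more):
    terminations at t divided by phases still at risk at t."""
    durations = np.asarray(durations)
    max_t = max_t or int(durations.max())
    hazard = {}
    for t in range(1, max_t + 1):
        at_risk = np.sum(durations >= t)
        ended = np.sum(durations == t)
        if at_risk > 0:
            hazard[t] = ended / at_risk
    return hazard

# Hypothetical expansion durations in quarters (illustrative, not US data)
expansions = [6, 8, 8, 10, 12, 12, 15, 20, 24, 35]
for t, h in sorted(empirical_hazard(expansions).items()):
    if h > 0:  # a rising pattern would indicate positive duration dependence
        print(f"t={t:2d}  hazard={h:.2f}")
```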
No handbook of econometrics could be without a contribution indicating the importance of cointegration analysis for non-stationary data. In Chapter 8, Katarina Juselius considers one of the most enduring puzzles in empirical economics, namely, if purchasing power parity (PPP) is the underlying equilibrium state that determines the relationship between real exchange rates, why is there “pronounced persistence” away from this equilibrium state? This has been a common finding of empirical studies using data from a wide range of countries and different sample periods. Juselius shows how a careful analysis can uncover important structures in the data; however, these structures are only revealed by taking into account the different empirical orders of integration of the component variables, the identification of stationary relationships between non-stationary variables, the dynamic adjustment of the system to disequilibrium states, the appropriate deterministic components, and the statistical properties of the model. As Juselius notes, and in contrast to common approaches, the order of integration is regarded here as an empirical approximation rather than a structural parameter. This opens up a distinction between, for example, a variable being empirically I(d) rather than structurally I(d); a leading example here is the I(2) case which, unlike the I(1) case, has attracted some “suspicion” when applied in an absolute sense to empirical series. The challenging empirical case considered by Juselius is the relationship between German and US prices and nominal exchange rates within a sample that includes the period of German reunification. The methodology lies firmly within the framework of general-to-specific modeling, in which a general unrestricted model is tested down (see also Hendry, Chapter 1) to gain as much information without empirical distortion. A key distinction in the methodological and empirical analysis is between pushing and pulling forces: in the current context, prices push whereas the exchange rate pulls. PPP implies that there should be just a single stochastic trend in the data, but the empirical analysis suggests that there are two, with the additional source of permanent shocks being related to speculative behaviour in the foreign exchange market.
In an analysis of trends and cycles, economists often characterize the state of the economy in terms of indirect or latent variables, such as the output gap, core inflation and the non-accelerating inflation rate of unemployment (NAIRU). These are variables that cannot be measured directly, but are nevertheless critical to policy analysis. For example, the need to take action to curb inflationary pressures is informed by the expansionary potential in the economy; whether or not a public sector budget deficit is a matter for concern is judged by reference to the cyclically adjusted deficit. These concepts are at the heart of Chapter 9 by Tommaso Proietti, entitled “Structural Time Series Models for Business Cycle Analysis,” which links with the earlier chapters by Pollock and by Cardinale and Taylor. Proietti focuses on the measurement of the output gap, which he illustrates throughout using US GDP. In the simplest case, what is needed is a framework for decomposing a time series into a trend and cycle and Proietti critically reviews a number of methods to achieve such a decomposition, including the random walk plus noise (RWpN) model, the local linear trend model (LLTM), methods based on filtering out frequencies associated with the cycle, multivariate models that bring together related macroeconomic variables, and the production function approach. The estimation and analysis of a number of models enables the reader to see how the theoretical analysis is applied and what kind of questions can be answered. Included here are a bivariate model of output and inflation for the US and a model of mixed data frequency, with quarterly observations for GDP and monthly observations for industrial production, the unemployment rate and CPI inflation. The basic underlying concepts, such as the output gap and core inflation, are latent variables and, hence, not directly observable: to complete the chapter, Proietti also considers how to judge the validity of the corresponding empirical measures of these concepts.
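As one concrete instance of the class of models Proietti reviews, the random walk plus noise (local level) model can be handled with a few lines of Kalman filtering. The sketch below is our illustration, with the observation variance normalized to one and q an assumed signal-to-noise ratio; it is not code from the chapter.

```python
import numpy as np

def local_level_filter(y, q):
    """Kalman filter for the random walk plus noise (local level) model:
        y_t  = mu_t + eps_t,      eps_t ~ N(0, 1)  (obs. variance normalized)
        mu_t = mu_{t-1} + eta_t,  eta_t ~ N(0, q)  (signal-to-noise ratio q)
    Returns the filtered level, a simple trend estimate."""
    mu, P = y[0], 1e7              # near-diffuse initialization
    level = np.empty(len(y))
    for t, obs in enumerate(y):
        P = P + q                  # predict
        K = P / (P + 1.0)          # Kalman gain
        mu = mu + K * (obs - mu)   # update with the prediction error
        P = (1.0 - K) * P
        level[t] = mu
    return level

rng = np.random.default_rng(1)
true_level = np.cumsum(0.1 * rng.normal(size=300))
y = true_level + rng.normal(size=300)
trend = local_level_filter(y, q=0.01)
print(round(float(np.mean((trend - true_level) ** 2)), 3))
```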
To complete the part of the Handbook on Time Series Applications, in Chapter 10 Luis Gil-Alana and Javier Hualde provide an overview of fractional integration and cointegration, with an empirical application in the context of the PPP debate. A time series is said to be integrated of order d, where d is an integer, if d is the minimum number of differences necessary to produce a stationary time series. This is a particular form of non-stationarity and one which dominated the econometrics literature in the 1980s and early 1990s, especially following the unit root literature. However, the integer restriction on d is not necessary to the definition of an integrated series (see, in particular, Granger and Joyeux, 1980), so that d can be a fraction – hence the term “fractionally integrated” for such series. Once the integer restriction is relaxed for a single series, it is then natural to relax it for the multivariate case, which leads to the idea of fractional cointegration. Gil-Alana and Hualde give an overview of the meaning of fractional integration and fractional cointegration, methods of estimation for these generalized cases, which can be approached in either the time or frequency domains, the underlying rationale for the existence of fractionally integrated series (for example, through the aggregation of micro-relationships), and a summary of the empirical evidence for fractionally integrated univariate series and fractionally cointegrated systems of series. The various issues and possible solutions are illustrated in the context of an analysis of PPP for four bivariate series. It is clear that the extension of integration and cointegration to their corresponding fractional cases is not only an important generalization of the theory, but one which finds a great deal of empirical support.
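A short sketch may make fractional integration tangible: (1 − L)^d has a binomial expansion with coefficients π_0 = 1, π_k = π_{k−1}(k − 1 − d)/k, valid for fractional as well as integer d. The code below is our illustration: it builds an I(0.4) series by fractionally integrating white noise and recovers the noise by fractionally differencing.

```python
import numpy as np

def frac_diff(x, d):
    """Apply the fractional difference operator (1 - L)^d via its binomial
    expansion: pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    n = len(x)
    pi = np.empty(n)
    pi[0] = 1.0
    for k in range(1, n):
        pi[k] = pi[k - 1] * (k - 1 - d) / k
    # Truncated convolution using only the available history of the series
    return np.array([pi[:t + 1][::-1] @ x[:t + 1] for t in range(n)])

rng = np.random.default_rng(7)
eps = rng.normal(size=500)
x = frac_diff(eps, -0.4)        # (1 - L)^{-0.4} eps is an I(0.4) series
recovered = frac_diff(x, 0.4)   # differencing by the same d undoes it
print(round(float(np.corrcoef(recovered[50:], eps[50:])[0, 1]), 3))  # ~1.0
```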
One of the most significant developments in econometrics over the last twenty years or so has been the increase in the number of econometric applications involving cross-section and panel data (see also Ooms, Chapter 29). Hence Part IV is devoted to this development. One of the key areas of application is to choice situations which have a discrete number of options; examples include the “whether to purchase” decision, which has wide application across consumer goods, and the “whether to participate” decision, as in whether to enter the labor force, to retire, or to join a club. Discrete choice models are the subject of Chapter 11 by Bill Greene, who provides a critical, but accessible, review of a vast literature. The binary choice model is a key building block here and so provides a prototypical model with which to examine such topics as specification, estimation and inference; it also allows the ready extension to more complex models such as bivariate and multivariate binary choice models and multinomial choice models. Models involving count data are also considered as they relate to the discrete choice framework. A starting point for the underlying economic theory is the extension of the classical theory of consumer behavior, involving utility maximization subject to a budget constraint, to the random utility model. The basic model is developed from this point and a host of issues are considered that arise in practical studies, including estimation and inference, specification tests, measuring fit, complications from endogenous right-hand-side variables, random parameters, the use of panel data, and the extension of the familiar fixed and random effects. To provide a motivating context, Greene considers an empirical application involving a bivariate binary choice model. This is where two binary choice decisions are linked; in this case, in the first decision the individual decides whether to visit a physician, which is a binary choice, and the second involves whether to visit the hospital, again a binary choice: together they constitute a bivariate (and ordered) choice. An extension of this model is to consider the number of times that an individual visits the doctor or a hospital. This gives rise to a counts model (the number of visits to the doctor and the number of visits to the hospital) with its own particular specification. Whilst a natural place to start is the Poisson model, this, as Greene shows, is insufficient as a general framework; the extension is provided and illustrated with panel data from the German health care system. A second application illustrates a mixed logit and error components framework for modeling modes of transport choice (air, train, bus, car). Overall, this chapter provides an indication, through the variety of its applications, as to why discrete choice models have become such a significant part of applied econometrics.
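As a reminder of the prototypical building block Greene starts from, the sketch below (ours, on simulated data) estimates a binary logit by maximum likelihood; the “physician visit” framing is only a label for the simulated choice, and the parameter values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def logit_negloglik(beta, X, y):
    """Negative log-likelihood of the binary logit model:
    P(y = 1 | x) = 1 / (1 + exp(-x'beta))."""
    xb = X @ beta
    return -np.sum(y * xb - np.log1p(np.exp(xb)))

# Simulated data standing in for a 'whether to visit a physician' decision
rng = np.random.default_rng(3)
n = 2000
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([-0.5, 1.0, -0.8])
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-X @ beta_true))).astype(float)

res = minimize(logit_negloglik, x0=np.zeros(3), args=(X, y), method="BFGS")
print(np.round(res.x, 2))  # close to beta_true in a sample this size
```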
The theme of panel data methods and applications is continued in Chapter 12 by Andrew Jones. The application of econometrics to health economics has been an important area of development over the last decade or so. However, this has not just been a case of applying existing techniques: rather, econometrics has been able to advance the subject itself, asking questions that had not previously been asked – and providing answers. This chapter will be of interest not only to health economics specialists, but also to those seeking to understand how treatment effects in particular are estimated and to those investigating the extent of the development and application of panel data methods (it is complemented by Colin Cameron in Chapter 14). At the center of health economics is the question “What are the impacts of specific health policies?” Given that we do not observe experimental data, what can we learn from non-experimental data? Consider the problem of evaluating a particular treatment; for an individual, the treatment effect is the difference in outcome between the treated and the control, but since an individual is either treated or not at a particular time, the treatment effect cannot be observed. “Treatment” is here a general term that covers not only single medical treatments but also broad policies, and herein lies its generality, since a treatment could equally be a policy to reduce unemployment or to increase the proportion of teenagers receiving higher education. In a masterful understanding of a complex and expanding literature, Jones takes the reader through the theoretical and practical solutions to the problems associated with estimating and evaluating treatment effects, covering, inter alia, identification strategies, dynamic models, estimation methods, different kinds of data, and multiple equation models; throughout the chapter the methods and discussion are motivated by practical examples illustrating the breadth of applications.
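The fundamental evaluation problem described above is easy to see in a toy potential-outcomes simulation (ours, not from the chapter): individual effects y1 − y0 are never observed, yet under random assignment the group-mean difference recovers the average treatment effect. All numbers are illustrative assumptions.

```python
import numpy as np

# Potential-outcomes illustration: each unit has outcomes (y0, y1) but we
# only ever observe one of them, so individual effects y1 - y0 are hidden;
# with random assignment the difference in group means recovers the ATE.
rng = np.random.default_rng(5)
n = 10_000
y0 = rng.normal(1.0, 1.0, size=n)         # outcome if untreated
y1 = y0 + rng.normal(0.5, 0.3, size=n)    # outcome if treated; true ATE = 0.5
d = rng.integers(0, 2, size=n)            # randomized treatment indicator
y_obs = np.where(d == 1, y1, y0)          # only one outcome is ever observed

ate_hat = y_obs[d == 1].mean() - y_obs[d == 0].mean()
print(round(float(ate_hat), 3))  # close to 0.5
```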
A key development in econometrics over the last thirty years or so has been the attention given to the properties of the data, as these enlighten the question of whether the underlying probability structure is stationary or not. In a terminological shorthand, we refer to data that is either stationary or non-stationary. Initially, this was a question addressed to individual series (see Nelson and Plosser, 1982); subsequently, the focus expanded, through the work of Engle and Granger (1987) and Johansen (1988), to a multivariate approach to non-stationarity. The next step in the development was to consider a panel of multivariate series. In Chapter 13, Anindya Banerjee and Martin Wagner bring us up to date by considering panel methods to test for unit roots and cointegration. The reader will find in this chapter a theoretical overview and critical assessment of a vast and growing body of methods, combined with practical recommendations based on the insights obtained from a wide base of substantive applications. In part, as is evident in other areas of econometric techniques and applications, theory has responded to the much richer sources of data that have become available, not only at a micro or individual level, as indicated in Chapter 12, combined with increases in computing power. As Banerjee and Wagner note, we now have long time series on macroeconomic and industry-level data. Compared to just twenty years ago, there is thus a wealth of data on micro, industry and macro-panels. A panel dataset embodies two dimensions: the cross-section dimension and the time-series dimension, so that, in a macro-context, for example, we can consider the question of convergence not just of a single variable (say, of a real exchange rate to a comparator, be that a PPP hypothetical or an alternative actual rate), but of a group of variables, which is representative of the multidimensional nature of growth and cycles. A starting point for such an analysis is to assess the unit root properties of panel data but, as in the univariate case, issues such as dependency, the specification of deterministic terms, and the presence of structural breaks are key practical matters that, if incorrectly handled, can lead to misleading conclusions. Usually, the question of unit roots is a precursor to cointegration analysis, and Banerjee and Wagner guide the reader through the central methods, most of which have been developed in the last decade. Empirical illustrations, based on exchange rate pass-through in the euro-area and the environmental Kuznets curve, complement the theoretical analysis.
Whilst the emphasis in Chapter 13 is on panels of macroeconomic or industry-level data, in Chapter 14, Colin Cameron, in the first of two chapters in Part V, provides a survey of microeconometric methods, with an emphasis on recent developments. The data underlying such developments are at the level of the individual, households and firms. A prototypical question in microeconometrics relates to the identification, estimation and evaluation of marginal effects using individual-level data; for example, the effect on earnings of an additional year of education. This example is often used to motivate some basic estimation methods, such as least squares, maximum likelihood and instrumental variables, in undergraduate and graduate texts in econometrics, so it is instructive to see how recent developments have extended these methods. Developments of the basic methods include generalized method of moments (GMM), empirical likelihood, simulation-based methods, quantile regression and nonparametric and semiparametric estimation, whilst developments in inference include robustifying standard tests and bootstrap methods. Apart from estimation and inference, Cameron considers a number of other issues that occur frequently in microeconometric studies: in particular, issues related to causation, as in estimating and evaluating treatment effects; heterogeneity, for example due to regressors or unobservables; and the nature of microeconometric data, such as survey data and the sampling scheme, with problems such as missing data and measurement error.
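Among the inference developments Cameron surveys, the bootstrap is the easiest to sketch. The following pairs (case) bootstrap for OLS standard errors is our minimal illustration on simulated heteroskedastic data, not an implementation from the chapter.

```python
import numpy as np

def pairs_bootstrap_se(X, y, n_boot=999, seed=0):
    """Pairs (case) bootstrap standard errors for OLS coefficients:
    resample (x_i, y_i) rows with replacement and re-estimate each time."""
    rng = np.random.default_rng(seed)
    n = len(y)
    draws = np.empty((n_boot, X.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        draws[b] = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
    return draws.std(axis=0, ddof=1)

rng = np.random.default_rng(11)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
# Heteroskedastic errors: classic OLS standard errors would be unreliable
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n) * (1 + np.abs(X[:, 1]))
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.round(beta_hat, 2), np.round(pairs_bootstrap_se(X, y), 3))
```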
The development of econometrics in the last decade or so in particular has been symbiotic with the development of advances in computing, particularly that of personal computers. In Chapter 15, David Jacho-Chávez and Pravin Trivedi focus on the relationship between empirical microeconometrics and computational considerations, which they call, rather evocatively, a “matrimony” between computing and applied econometrics. No longer is it the case that the mainstay of empirical analysis is a set of macroeconomic time series, often quite limited in sample period. Earlier chapters in this part of the volume emphasize that the data sources now available are much richer than this, both in variety and length of sample period. As Jacho-Chávez and Trivedi note, the electronic recording and collection of data has led to substantial growth in the availability of census and survey data. However, the nature of the data leads to problems that require theoretical solutions: for example, problems of sample selection, measurement errors and missing or incomplete data. On the computing side, the scale of the datasets and estimation based upon them implies that there must be reliability in the high-dimensional optimization routines required by the estimation methods and an ability to handle large-scale Monte Carlo simulations. The increase in computing power has meant that techniques that were not previously feasible, such as simulation assisted estimation and resampling, are now practical and in widespread use. Moreover, nonparametric and semiparametric methods that involve the estimation of distributions rather than simple parameters, as in regression models, have been developed through drawing on the improved power of computers. Throughout the chapter, Jacho-Chávez and Trivedi motivate their discussion by the use of examples of practical interest, including modeling hedonic prices of housing attributes, female labor force participation, Medicare expenditure, and number of doctor visits. Interestingly, they conclude that there are important problems, particularly those related to assessing public policy, such as identification and implementation in the context of structural, dynamic and high-dimensional models, which remain to be solved.

In Part VI, the theme of the importance of economic policy is continued, but with the emphasis now on monetary policy and macroeconomic policy, which remain of continued importance. Starting in the 1970s and continuing into the 1990s, the development of macroeconometric models for policy purposes was a highly regarded area; during that period computing power was developing primarily through mainframe computers, allowing not so much the estimation as the simulation of macroeconomic models of a dimension that had not been previously contemplated. Government treasuries, central banks and some non-governmental agencies developed their own empirical macro-models comprising hundreds of equations. Yet, these models failed to live up to their promise, either wholly or in part. For some periods there was an empirical failure, the models simply not being good enough; but, more radically, the theoretical basis of the models was often quite weak, at least relative to the theory of the optimizing and rational agent and ideas of intertemporal general equilibrium.
In Chapter 16, Carlo Favero expands upon this theme, especially as it relates to the econometrics of monetary policy and the force of the critiques by Lucas (1976) and Sims (1980). A key distinction in the dissection of the modeling corpse is between structural identification and statistical identification. The former relates to the relationship between the structural parameters and the statistical parameters in the reduced form, while the latter relates to the properties of the statistical or empirical model which represents the data. Typically, structural identification is achieved by parametric restrictions seeking to classify some variables as “exogenous,” a task that some have regarded as misguided (or indeed even “impossible”). Further, a failure to assess the validity of the reduction process in going from the (unknown) data-generating process to a statistical representation, notwithstanding criticisms related to structural identification, stored up nascent empirical failure awaiting the macroeconometric model. Developments in cointegration theory and practice have “tightened” up the specification of empirical macromodels, and DSGE models, preferred theoretically by some, have provided an alternative “modellus operandi.” Subsequently, the quasi-independence of some central banks has heightened the practical importance of questions such as “How should a central bank respond to shocks in macroeconomic variables?” (Favero, Chapter 16). In practice, although DSGE models are favored for policy analysis, in their empirical form the VAR reappears, but with its own set of issues. Favero considers such practical developments as calibration and model evaluation, the identification of shocks, impulse responses, structural stability of the parameters, VAR misspecification and factor augmented VARs. A summary and analysis of Sims’ (2002) small macroeconomic model (Appendix A) helps the reader to understand the relationship between an optimizing specification and the resultant VAR model.
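A minimal numerical sketch of the VAR objects mentioned here (ours, with arbitrary illustrative coefficients, not Favero’s model): impulse responses of a bivariate VAR(1) under a recursive, Cholesky-style identification of the shocks.

```python
import numpy as np

# Impulse responses for a bivariate VAR(1)  y_t = A y_{t-1} + u_t,
# with shocks orthogonalized by a Cholesky factor of cov(u)
# (a recursive identification in the spirit of Sims, 1980)
A = np.array([[0.5, 0.1],
              [0.2, 0.7]])           # illustrative VAR coefficients
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])       # illustrative error covariance
P = np.linalg.cholesky(Sigma)        # structural impact matrix

horizon = 12
irf = np.empty((horizon + 1, 2, 2))  # irf[h, i, j]: response of i to shock j
irf[0] = P
for h in range(1, horizon + 1):
    irf[h] = A @ irf[h - 1]          # IRF at horizon h is A^h P

print(np.round(irf[:4, 0, 0], 3))    # response of variable 1 to its own shock
```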
In Chapter 17, Gunnar Bårdsen and Ragnar Nymoen provide a paradigm for the construction of a dynamic macroeconometric model, which is then illustrated with a small econometric model of the Norwegian economy that is used for policy analysis. Bårdsen and Nymoen note the two central critiques of “failed” macroeconometric models: the Lucas (1976) critique and the Clements and Hendry (1999) analysis of forecast failure involving “location” shifts (rather than behavioral parameter shifts). But these critiques have led to different responses; first, the move to explicit optimizing models (see Chapter 16); and, alternatively, to greater attention to the effects of regime shifts, viewing the Lucas critique as a possibility theorem rather than a truism (Ericsson and Irons, 1995). Whilst it is de rigueur to accept that theory is important, Bårdsen and Nymoen consider whether “theory” provides the (completely) correct specification or whether it simply provides a guideline for the specification of an empirical model. In their approach, the underlying economic model is nonlinear and specified in continuous time; hence, the first practical steps are linearization and discretization, which result in an equilibrium correction model (EqCM). Rather than remove the data trends, for example by applying the HP filter, the common trends are accounted for through a cointegration analysis. The approach is illustrated step by step by building a small-scale econometric model of the Norwegian economy, which incorporates the ability to analyze monetary policy; for example, an increase in the market rate, which shows the channels of the operation of monetary policy. Further empirical analysis of the New Keynesian Phillips curve provides an opportunity to illustrate their approach in another context. In summary, Bårdsen and Nymoen note that cointegration analysis takes into account non-stationarities that arise through unit roots, so that forecast failures are unlikely to be attributable to misspecification for that reason. In contrast to the econometric models of the 1970s, the real challenges arise from non-stationarities in functional relationships due to structural breaks; however, there are ways to “robustify” the empirical model and forecasts from it so as to mitigate such possibilities, although challenges remain in an area that continues to be of central importance in economic policy.
One of the key developments in monetary policy in the UK and elsewhere in the last decade or so has been the move to give central banks a semi-autonomous status. In part, this was thought to avoid the endogenous “stop–go” cycle driven by political considerations. It also carried with it the implication that it was monetary policy, rather than fiscal policy, which would become the major macroeconomic policy tool, notwithstanding the now apparent practical limitations of such a move. In Chapter 18, Brian Henry provides an overview of the institutional and theoretical developments in the UK in particular, but with implications for other countries that have taken a similar route. The key question that is addressed in this chapter is whether regime changes, such as those associated with labor market reforms, inflation targeting and instrument independence for the Bank of England, have been the key factors in dampening the economic cycle and improving inflation, unemployment and output growth, or whether the explanation is more one of beneficial international events (the “good luck” hypothesis) and monetary policy mistakes. Henry concludes, perhaps controversially, that the reforms to the labor market and to the operation of the central bank are unlikely to have been the fundamental reasons for the improvement in economic performance. He provides an econometric basis for these conclusions, which incorporates a role for international factors such as real oil prices and measures of international competitiveness. Once these factors are taken into account, the “regime change” explanation loses force.
The growth of financial econometrics in the last two decades was noted in the first volume of this Handbook. Indeed, this development was recognized in the award of the 2003 Nobel Prize in Economics (jointly with Sir Clive Granger) to Robert Engle for “methods of analyzing economic time series with time-varying volatility (ARCH).” Part VII of this volume reflects this development and is thus devoted to applications in the area of financial econometrics.
In Chapter 19, George Dotsis, Raphael Markellos and Terence Mills consider continuous-time stochastic volatility models. What is stochastic volatility? To answer that question, we start from what it is not. Consider a simple model of an asset price, Y(t), such as geometric Brownian motion, which in continuous time takes the form of the stochastic differential equation dY(t) = μY(t)dt + σY(t)dW(t), where W(t) is a standard Brownian motion (BM) input; then σ (or σ²) is the volatility parameter that scales the stochastic BM contribution to the diffusion of Y(t). In this case the volatility parameter is constant, although the differential equation is stochastic. However, as Dotsis et al. note, a more appropriate specification for the accepted characteristics of financial markets is a model in which volatility also evolves stochastically over time. For example, if we introduce the variance function v(t), then the simple model becomes dY(t) = μY(t)dt + √v(t)Y(t)dW(t), and this embodies stochastic volatility. Quite naturally, one can then couple this equation with one that models the diffusion over time of the variance function. ARCH/GARCH models are one way to model time-varying volatility, but there are a number of other attractive specifications; for example, jump diffusions, affine diffusions, affine jump diffusions and non-affine diffusions. In motivating alternative specifications, Dotsis et al. note some key empirical characteristics in financial markets that underlie the rationale for stochastic volatility models, namely fat tails, volatility clustering, leverage effects, information arrivals, volatility dynamics and implied volatility. The chapter then continues by covering such issues as specification, estimation and inference in stochastic volatility models. A comparative evaluation of five models applied to the S&P 500, for daily data over the period 1990–2007, is provided to enable the reader to see some of the models “in action.”
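To see what such a coupled system looks like in discrete time, the sketch below simulates the stochastic volatility model above by Euler discretization, pairing it with a square-root (CIR-type) variance equation that we assume purely for illustration; the chapter itself compares several competing specifications, and all parameter values here are arbitrary.

```python
import numpy as np

# Euler-Maruyama simulation of dY = mu*Y dt + sqrt(v)*Y dW1, coupled with an
# assumed square-root variance process dv = kappa*(theta - v) dt + xi*sqrt(v) dW2
rng = np.random.default_rng(123)
mu, kappa, theta, xi, rho = 0.05, 2.0, 0.04, 0.3, -0.5  # illustrative values
T, n = 1.0, 252
dt = T / n
Y, v = np.empty(n + 1), np.empty(n + 1)
Y[0], v[0] = 100.0, theta

for i in range(n):
    z1 = rng.normal()
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.normal()  # leverage correlation
    v[i + 1] = max(v[i] + kappa * (theta - v[i]) * dt
                   + xi * np.sqrt(v[i] * dt) * z2, 1e-12)  # keep variance >= 0
    Y[i + 1] = Y[i] * (1 + mu * dt + np.sqrt(v[i] * dt) * z1)

print(round(Y[-1], 2), round(float(v.mean()), 4))
```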
One of the most significant ideas in the area of financial econometrics is that the underlying stochastic process for an asset price is a martingale. Consider a stochastic process X = (X_t, X_{t−1}, ...), which is a sequence of random variables; then the martingale property is that the expectation (at time t−1) of X_t, conditional on the information set I_{t−1} = (X_{t−1}, X_{t−2}, ...), is X_{t−1}; that is, E(X_t | I_{t−1}) = X_{t−1} (almost surely), in which case X is said to be a martingale (the definition is sometimes phrased in terms of the σ-field generated by I_{t−1}, or indeed some other "filtration"). Next, define the related process Y = (Y_t, Y_{t−1}, ...), where Y_t = X_t − X_{t−1}; then Y is said to be a martingale difference sequence (MDS). The martingale property for X translates to the property for Y that E(Y_t | I_{t−1}) = 0 (see, for example, Mikosch, 1998, sec. 1.5). This martingale property is attractive from an economic perspective because of its link to efficient markets and rational expectations; for example, in terms of X, the martingale property says that the best predictor, in a minimum mean squared error sense, of X_t is simply its most recent value, X_{t−1}. In Chapter 20, J. Carlos Escanciano and Ignacio Lobato consider tests of the martingale difference hypothesis (MDH), namely that no information available at time t−1 is of use in predicting future values of Y_t. Tests of the MDH can be seen as being translated to the equivalent form given by E[(Y_t − μ)w(I_{t−1})] = 0, where w(I_{t−1}) is a weighting function. A useful
means of organizing the extant tests of the MDH is in terms of the type of functions w(·) that are used. For example, if w(I_{t−1}) = Y_{t−j}, j ≥ 1, then the resulting MDH test is of E[(Y_t − μ)Y_{t−j}] = 0, which is just the covariance between Y_t and Y_{t−j}. This is just one of a number of tests, but it serves to highlight some generic issues. In principle, the condition should hold for all j ≥ 1 but, practically, j has to be truncated to some finite value. Moreover, this is just one choice of w(I_{t−1}), whereas the MDH condition is not so restricted. Escanciano and Lobato consider issues such as the nature of the conditioning set (finite or infinite), robustifying standard test statistics (for example, the Ljung–Box and Box–Pierce statistics), and developing tests in both the time and frequency domains; whilst standard tests are usually of linear dependence, for example autocorrelation-based tests, it is important to consider tests based on nonlinear dependence. To put the various tests into context, the chapter includes an application to four daily and weekly exchange rates against the US dollar. The background to this is that the jury is out in terms of a judgment on the validity of the MDH for such data; some studies have found against the MDH, whereas others have found little evidence against it. In this context, applying a range of tests, Escanciano and Lobato find general support for the MDH.
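The covariance form of the test lends itself to a compact illustration. The following is a minimal sketch of the standard (non-robustified) Ljung–Box statistic – one of the autocorrelation-based MDH tests discussed above – applied to simulated i.i.d. data for which the MDH holds by construction; the function and data are illustrative, not the chapter's.

```python
import numpy as np

def ljung_box(y, max_lag=10):
    """Ljung-Box statistic on the sample autocorrelations of y.

    Under the MDH (with iid-like regularity) Q is asymptotically
    chi-squared with max_lag degrees of freedom; as the chapter notes,
    the standard statistic may need robustification under conditional
    heteroskedasticity.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    e = y - y.mean()
    denom = np.sum(e**2)
    rho = np.array([np.sum(e[j:] * e[:-j]) / denom for j in range(1, max_lag + 1)])
    Q = n * (n + 2) * np.sum(rho**2 / (n - np.arange(1, max_lag + 1)))
    return Q, rho

# illustrative data: an iid sequence, for which the MDH holds by construction
rng = np.random.default_rng(0)
Q, rho = ljung_box(rng.standard_normal(1000), max_lag=10)
print("Q(10) = %.2f (chi-squared_10 95%% critical value is about 18.31)" % Q)
```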
Chapter 19 by Dotsis et al. was concerned with models of stochastic volatility, primarily using the variance as a measure of volatility. Another measure of volatility is provided by the range of a price; for example, the trading day range of an asset price. In turn, the range can be related to the interval between consecutive trades, known as the duration. Duration is a concept that is familiar from counting processes, such as the Poisson framework for modeling arrivals (for example, at a supermarket checkout or an airport departure gate).

Chapter 21 by Ruey Tsay provides an introduction to modeling duration that is illustrated with a number of financial examples. That duration can carry information about market behavior is evident not only from stock markets, where a cluster of short durations indicates active trading relating to, for example, information arrival, but from many other markets; for example, durations in the housing market and their relation to banking failure. The interest in durations modeling owes much to Engle and Russell (1998), who introduced the autoregressive conditional duration (ACD) model for irregularly spaced transactions data. Just as the ARCH/GARCH family of models was introduced to capture volatility clusters, the ACD model captures short-duration clusters indicating the persistence of periods of active trading, perhaps uncovering and evaluating information arrivals. To see how an ACD model works, let the ith duration be denoted x_i = t_i − t_{i−1}, where t_i is the time of the ith event, and model x_i as x_i = ψ_i ε_i, where {ε_i} is an i.i.d. sequence and β(L)ψ_i = α_0 + α(L)x_i, where α(L) and β(L) are lag polynomials; this is the familiar GARCH form, but in this context it is known as the exponential ACD or EACD. To accommodate the criticism that the hazard function of duration is not constant over time, unlike the assumption implicit in the EACD model, alternative innovation distributions have been introduced, specifically the Weibull and the Gamma, leading to the Weibull ACD (WACD) and the Gamma ACD (GACD). The chapter includes some motivating examples. Evidence of duration clusters is shown in Figures 21.1, 21.4 and 21.7a for IBM stock, Apple stock and General Motors stock, respectively. The development and application of duration models can then exploit the development of other forms of time series models, such as (nonlinear) threshold autoregressive (TAR) models. ACD models have also been developed to incorporate explanatory variables; an example is provided, which shows that the change to decimal "tick" sizes in the US stock markets reduced the price volatility of Apple stock.
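A short simulation makes the clustering mechanism visible. The sketch below generates durations from an EACD(1,1) specification, ψ_i = α_0 + α_1 x_{i−1} + β_1 ψ_{i−1}, with standard exponential innovations (the constant-hazard case that the WACD and GACD variants relax); the parameter values are purely illustrative.

```python
import numpy as np

# Simulation of an EACD(1,1) duration model: x_i = psi_i * eps_i with
#   psi_i = alpha0 + alpha1 * x_{i-1} + beta1 * psi_{i-1},
# and eps_i i.i.d. standard exponential. Parameter values are illustrative.

rng = np.random.default_rng(1)
alpha0, alpha1, beta1 = 0.1, 0.2, 0.7    # alpha1 + beta1 < 1 for stationarity
n = 5000
x = np.empty(n); psi = np.empty(n)
psi[0] = alpha0 / (1 - alpha1 - beta1)   # unconditional mean duration
x[0] = psi[0] * rng.exponential(1.0)

for i in range(1, n):
    psi[i] = alpha0 + alpha1 * x[i - 1] + beta1 * psi[i - 1]
    x[i] = psi[i] * rng.exponential(1.0)

# short durations follow short durations: the duration analogue of
# volatility clustering
print("mean duration: %.2f (theory: %.2f)" % (x.mean(), alpha0 / (1 - alpha1 - beta1)))
```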
The determination of exchange rates has long been of interest to econometricians and, as a result, there is an extensive literature that includes two constituencies: on the one hand, there have been contributions from economists who have employed econometric techniques and, on the other, to risk a simple bifurcation, the modeling of exchange rates has become an area in which to test out advances in nonlinear econometrics. Chapter 22, by Efthymios Pavlidis, Ivan Paya and David Peel, provides an evaluative overview of this very substantial area. As they note, the combination of econometric developments, the availability of high-quality and high-frequency data, and the move to floating exchange rates in 1973, has led to a considerable number of empirical papers in this area. Thus, the question of "Where are we now?" is not one with a short answer. Perhaps prototypically, the econometrics of exchange rates is an area that has moved in tandem with developments in the economic theory of exchange rates (for the latter, the reader is referred to, for example, Sarno and Taylor, 2002). An enduring question over the last thirty years (at least), and one that is touched upon in two earlier chapters (Juselius, Chapter 8, and Gil-Alana and Hualde, Chapter 10), has been the status of PPP, regarded as a bedrock of economic theory and macroeconomic models. One early finding that has puzzled many is the apparent failure to find PPP supported by a range of different exchange rates and sample periods. Consider a stylized version of the PPP puzzle: there are two countries, with a freely floating exchange rate, flexible prices (for tradable goods and services), no trade constraints, and so on. In such a situation, at least in the long run, the nominal exchange rate should equal the ratio of the (aggregate) price levels; otherwise, as the price ratio moves, the nominal exchange rate does not compensate for such movements and the real exchange rate varies over time, contradicting PPP; indeed, on this basis the exchange rate is not tied to what is happening to prices. Early studies used an essentially linear framework – for example, ARMA models combined with unit root tests – to evaluate PPP, and rarely found that it was supported by the data; moreover, estimated speeds of adjustment to shocks were so slow as to be implausible. Another puzzle, in which tests indicated that the theory (of efficient speculative markets) was not supported, was the "forward bias puzzle." In this case, the prediction was that prices should fully reflect publicly available information, so that it should not be possible to make a systematic (abnormal) return; however, this appeared not to be the case. In this chapter, Pavlidis et al. carefully dissect this and other puzzles and show how the move away from simple linear models to a range of essentially nonlinear models, the development and application of multivariate models, and the use of panel data methods, has provided some explanation of the exchange rate "puzzles."
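The linear framework that produced the puzzle is easy to reproduce. The sketch below applies an augmented Dickey–Fuller test (via statsmodels) to a simulated log real exchange rate that mean-reverts very slowly; the persistence parameter is invented, but the exercise illustrates the low power of unit root tests against slow adjustment that is part of the PPP puzzle.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

# The simplest linear evaluation of PPP: a unit root test on the log real
# exchange rate q_t = s_t + p*_t - p_t. Long-run PPP requires q_t to be
# mean-reverting; a unit root in q_t is evidence against it. Here q_t is
# simulated as a persistent AR(1) purely for illustration.

rng = np.random.default_rng(7)
n, phi = 400, 0.97                      # very slow mean reversion
q = np.empty(n); q[0] = 0.0
for t in range(1, n):
    q[t] = phi * q[t - 1] + 0.02 * rng.standard_normal()

stat, pvalue, *_ = adfuller(q, regression="c", autolag="AIC")
print("ADF statistic %.2f, p-value %.3f" % (stat, pvalue))
# With phi = 0.97 and n = 400, the test often fails to reject the unit root,
# illustrating the low power that motivated nonlinear and panel approaches.
```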
Part VIII of this volume of the Handbook comprises three chapters related to what has become referred to as "growth econometrics"; broadly speaking, this is the area that is concerned with variations in growth rates and productivity levels across countries or regions. Chapters 23 and 24 are a coordinated pair by Steven Durlauf, Paul Johnson and Jonathan Temple; in addition, looking ahead, Chapter 27 by Serge Rey and Julie Le Gallo takes up aspects of growth econometrics, with an emphasis on spatial connections. In Chapter 23, Durlauf et al. focus on the econometrics of convergence. Of course, convergence could and does mean a number of things: first, the convergence of what? Usually this is a measure of income or output but, in principle, the question of whether two (or more) economies are/have converged relates to multiple measures, for example, output, inflation, unemployment rates, and so on, and possibly includes measures of social welfare, such as literacy and mortality rates. The first concept to be considered in Chapter 23 is β-convergence (so-called because the key regression coefficient is referred to as β): consider two countries; there is β-convergence if the one with a lower initial income grows faster than the other and so "catches up" with the higher-income country.

Naturally, underlying the concept of convergence is an economic model, typically a neoclassical growth model (with diminishing returns to capital and labor), which indicates the sources of economic growth and a steady state which the economy will (eventually) attain. At its simplest, growth econometrics leads to cross-country regressions of output growth rates on variables motivated from the underlying growth model and, usually, some "control" variables that, additionally, are thought to influence the growth rate. It is the wide range of choice for these control variables, and the resultant multiplicity of studies, that has led to the, perhaps pejorative, description of this activity as the "growth regression industry." One response has been the technique of model averaging, so that no single model will necessarily provide the empirical wisdom. A second central convergence concept is σ-convergence. As the notation suggests, this form of convergence relates to the cross-section dispersion of a measure, usually log per capita output, across countries. As Durlauf et al. note, whilst many studies use the log variance, other measures, such as the Gini coefficient or those suggested in Atkinson (1970), may be preferred. In this measure of convergence, a reduction in the dispersion measure across countries suggests that they are getting closer together. As in Chapter 22 on exchange rates, an important methodological conclusion of Durlauf et al. is that nonlinearity (due in this case to endogenous growth models) is likely to be an important modeling characteristic, which is not well captured in many existing studies, whether based on cross-section or panel data.
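Both concepts reduce to simple calculations, sketched below on invented data: an OLS regression of growth on initial income for β-convergence, and a comparison of cross-section variances of log income for σ-convergence. The data-generating values are arbitrary illustrations, not estimates from the chapter.

```python
import numpy as np

# Stylised beta- and sigma-convergence on simulated data:
#   growth_i = a + beta * log(y0_i) + u_i,  beta < 0 => beta-convergence;
# sigma-convergence compares the cross-section dispersion of log income
# at the start and end of the period. All numbers are invented.

rng = np.random.default_rng(3)
n = 80                                   # "countries"
log_y0 = rng.normal(8.0, 1.0, n)         # initial log per capita income
growth = 0.30 - 0.02 * log_y0 + 0.01 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), log_y0])
a_hat, beta_hat = np.linalg.lstsq(X, growth, rcond=None)[0]
print("estimated beta: %.4f (negative => beta-convergence)" % beta_hat)

log_yT = log_y0 + growth                 # end-of-period log income (one "period")
print("sigma-convergence: var(log y) falls from %.3f to %.3f"
      % (log_y0.var(), log_yT.var()))
```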
Having considered the question of convergence in Chapter 23, in Chapter 24 Durlauf et al. turn to the details of the methods of growth econometrics. Whilst concentrating on the methods, they first note some salient facts that inform the structure of the chapter. Broadly, these are that: vast income disparities exist despite the general growth in real income; distinct winners and losers have begun to emerge; and, for many countries, growth rates have tended to slow, but the dispersion of growth rates has increased. At the heart of the growth literature is the one-sector neoclassical growth model, transformed to yield an empirical form in terms of the growth rate of output per labor unit, such that growth is decomposed into growth due to technical progress and the gap between initial output per worker and the steady-state value. Typically, an error is then added to a deterministic equation derived in this way and this forms the basis of a cross-country regression, usually augmented with "control" variables that are also thought to influence growth rates. However, as Durlauf et al. note, there are a number of problems with this approach; for example, the errors are implicitly assumed to be exchangeable, but country dependence of the errors violates this assumption; the plethora of selected control variables leads to a multiplicity of empirical models; and parameter heterogeneity is endemic. To assess the question of model uncertainty, an extreme bounds analysis (Leamer, 1983) can be carried out, and model averaging as in a Bayesian analysis can be fruitful. Parameter heterogeneity is related to the Harberger (1987) criticism that questions the inclusion of countries with different characteristics in a cross-country regression. The key to criticisms of this nature is the meaning of such regressions: is there a DGP that these regressions can be taken as empirically parameterizing? The chapter continues by providing, inter alia, an overview of the different kinds of data that have been used and an assessment of the econometric problems that have arisen and how they have been solved; the conclusion evaluates the current state of growth econometrics, and suggests directions for future research.
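The mechanics of model averaging are straightforward once the model space is enumerated. The sketch below averages the convergence coefficient across all subsets of a few candidate controls, weighting by a BIC approximation to posterior model probabilities – one simple variant of the Bayesian-style averaging mentioned above, with simulated data and invented variable names.

```python
import numpy as np
from itertools import combinations

# BIC-weighted model averaging over growth regressions: all 2^k subsets of
# k candidate controls are fitted by OLS; each model's BIC approximates its
# posterior probability. Data and coefficients are invented for illustration.

rng = np.random.default_rng(9)
n, k = 100, 4
controls = rng.standard_normal((n, k))          # candidate "control" variables
log_y0 = rng.normal(8.0, 1.0, n)
growth = 0.3 - 0.02 * log_y0 + 0.01 * controls[:, 0] + 0.01 * rng.standard_normal(n)

def bic_and_beta(cols):
    X = np.column_stack([np.ones(n), log_y0] + [controls[:, j] for j in cols])
    beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
    resid = growth - X @ beta
    bic = n * np.log(resid @ resid / n) + X.shape[1] * np.log(n)
    return bic, beta[1]                          # beta[1]: convergence coefficient

models = [c for r in range(k + 1) for c in combinations(range(k), r)]
bics, betas = zip(*(bic_and_beta(c) for c in models))
w = np.exp(-0.5 * (np.array(bics) - min(bics)))
w /= w.sum()                                     # approximate posterior model weights
print("model-averaged convergence coefficient: %.4f" % np.dot(w, betas))
```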
A concern that has long antecedents is the relationship between financial development and growth: is there a causal relationship from the former to the latter? In Chapter 25, Thorsten Beck evaluates how this key question has been approached from an econometric perspective. Do financial institutions facilitate economic growth, for example by reducing information asymmetries and transaction costs? Amongst other functions, as Beck notes, financial institutions provide payment services, pool and allocate savings, evaluate information, exercise corporate governance and diversify risk. It would seem, a priori, that the provision of such services must surely move out the aggregate output frontier. However, just finding positive correlations between indicators of financial development, such as monetization measures, the development of banking institutions and stock markets, and economic growth is insufficient evidence from an econometric viewpoint. One of the most fundamental problems in econometrics is the problem of identification: by themselves, the correlations do not provide evidence of a causal direction. Beck takes the reader through the detail of this problem and how it has been approached in the finance-growth econometric literature. A classical method for dealing with endogenous regressors is instrumental variables (IV) and, in this context, some ingenuity has been shown in suggesting such variables, including exogenous country characteristics; for example, settler mortality, latitude and ethnic fractionalization. Early regression-based studies used cross-section data on a number of countries; however, more recent datasets now include dynamic panels, and methods include GMM and cointegration. More recent developments have been able to access data at the firm and household level, and this has led to much larger samples being used. For example, Beck, Demirgüç-Kunt and Maksimovic (2005) use a sample of over 4,000 firms in 54 countries to consider the effect on sales growth of firm-level financing obstacles as well as other variables, including a country-level financial indicator. As Beck notes, the evidence suggests a strong case for a causal link between financial development and economic growth, but there is still much to be done both in terms of techniques, such as GMM, and in exploiting advances at the micro-level.
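The identification problem, and the IV remedy, can be seen in a few lines. The sketch below simulates a finance-growth regression with an endogenous regressor and computes two-stage least squares by hand using a single exogenous instrument; the instrument and coefficient values are hypothetical stand-ins for the country characteristics discussed above.

```python
import numpy as np

# Two-stage least squares by hand for a stylised finance-growth regression:
#   growth_i = b0 + b1 * finance_i + u_i,
# where finance is endogenous and an exogenous characteristic z (a stand-in
# for a settler-mortality or legal-origin style instrument) is available.
# All data are simulated; only the mechanics of IV matter here.

rng = np.random.default_rng(11)
n = 500
z = rng.standard_normal(n)               # instrument: exogenous by construction
common = rng.standard_normal(n)          # source of endogeneity
finance = 0.8 * z + common + 0.5 * rng.standard_normal(n)
growth = 1.0 + 0.5 * finance + common + 0.5 * rng.standard_normal(n)

def ols(y, X):
    return np.linalg.lstsq(X, y, rcond=None)[0]

X = np.column_stack([np.ones(n), finance])
Z = np.column_stack([np.ones(n), z])
print("OLS slope (biased upward): %.3f" % ols(growth, X)[1])

# first stage: project the endogenous regressor on the instrument set
finance_hat = Z @ ols(finance, Z)
X_hat = np.column_stack([np.ones(n), finance_hat])
print("2SLS slope (close to the true 0.5): %.3f" % ols(growth, X_hat)[1])
```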
In Volume 1 of the Handbook, we highlighted recent developments in theoretical econometrics as applied to problems with a spatial dimension; this is an area that has grown in application and importance, particularly over the last decade, and it is natural that we should continue to emphasize its developmental importance by including two chapters in Part IX. These chapters show how spatial econometrics can bring into focus the importance of the dimension of space in economic decisions and the particular econometric problems and solutions that result. In Chapter 26, Luc Anselin and Nancy Lozano-Gracia consider spatial hedonic models applied to house prices. Hedonic price models are familiar from microeconomics and, in particular, from the seminal contributions of Lancaster (1966) and Rosen (1974). In the context of house prices, there are key characteristics, such as aspects of neighborhood, proximity to parks, schools, measures of environmental quality, and so on, that are critical in assigning a value to a house. These characteristics lead to the specification of a hedonic price function to provide an estimate of the marginal willingness to pay (MWTP) for a characteristic; a related aim, but one not so consistently pursued, is to retrieve the implied inverse demand function for house characteristics. Two key problems in the estimation of hedonic house price functions, in particular, are spatial dependence and spatial heterogeneity. As Anselin and Lozano-Gracia note, spatial dependence, or spatial autocorrelation, recognizes the importance of geographical or, more generally, network space in leading to a structure in the covariance matrix between observations. Whilst there is an analogy with temporal autocorrelation, spatial autocorrelation is not simply an extension of that concept, but requires its own conceptualization and methods. Spatial heterogeneity can be viewed as a special case of structural instability; two (of several) examples of heterogeneity are spatial regimes (for example, ethnically based sub-neighborhoods) and spatially varying coefficients (for example, different valuations of housing and neighborhood characteristics). In this chapter, Anselin and Lozano-Gracia provide a critical overview of methods, such as spatial two-stage least squares and spatial feasible GLS, a summary of the literature on spatial dependence and spatial heterogeneity, and a discussion of the remaining methodological challenges.
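Spatial dependence is typically first diagnosed with Moran's I statistic which, although not named above, underlies much of the testing literature the chapter surveys. The sketch below computes it by hand on a simulated lattice of "house prices" with rook-contiguity weights; the smoothing scheme and parameters are invented for illustration.

```python
import numpy as np

# Moran's I = (n / S0) * (e' W e) / (e' e), with e the deviations from the
# mean, W a spatial weights matrix and S0 the sum of all weights. Values
# above its expectation E[I] = -1/(n-1) indicate positive spatial dependence.

rng = np.random.default_rng(5)
side = 10
n = side * side

# rook-contiguity weights on a side x side lattice
W = np.zeros((n, n))
for r in range(side):
    for c in range(side):
        i = r * side + c
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < side and 0 <= cc < side:
                W[i, rr * side + cc] = 1.0

# a spatially smoothed "house price" surface: each cell mixes its own shock
# with the average shock of its neighbours, inducing positive dependence
shocks = rng.standard_normal(n)
prices = shocks + 0.6 * (W @ shocks) / W.sum(axis=1)

e = prices - prices.mean()
I = (n / W.sum()) * (e @ W @ e) / (e @ e)
print("Moran's I: %.3f (E[I] under no dependence: %.3f)" % (I, -1.0 / (n - 1)))
```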
In Chapter 27, Serge Rey and Julie Le Gallo consider an explicitly spatial analysis of economic convergence. Recall that Chapter 23, by Durlauf et al., is concerned with the growing interest in the econometrics of convergence; for example, whether there was an emergence of convergence clubs, perhaps suggesting "winners and losers" in the growth race. There is an explicitly spatial dimension to the evaluation of convergence; witness, for example, the literature on the convergence of European countries or regions, the convergence of US states, and so on. Rey and Le Gallo bring this spatial dimension to the fore. The recognition of the importance of this dimension brings with it a number of problems, such as spatial dependence and spatial heterogeneity; these problems are highlighted in Chapter 26, but in Chapter 27 they are put in the context of the convergence of geographical units. Whilst Rey and Le Gallo consider what might be regarded as purely econometric approaches to these problems, they also show how exploratory data analysis (EDA), extended to the spatial context, has been used to inform the theoretical and empirical analysis of convergence. As an example, a typical focus in a non-spatial context is on σ-convergence, which relates to a cross-sectional dispersion measure, such as the variance of log per capita output, across regions or countries. However, in a broader context, there is interest in the complete distribution of regional incomes and the dynamics of distributional change, leading to, for example, the development of spatial Markov models, with associated concepts such as spatial mobility and spatial transition. EDA can then provide the tools to visualize what is happening over time: see, for example, the space-time paths and the transition of regional income densities shown in Figures 27.5 and 27.6. Rey and Le Gallo suggest that explicit recognition of the spatial dimension of convergence, combined with the use of EDA and its extensions to include the spatial element, offers a fruitful way of combining different methods to inform the overall view on convergence.
Part X comprises two chapters on applied econometrics and its relationship to computing. In Chapter 28, Bruce McCullough considers the problem of testing econometric software. The importance of this issue is hard to overstate. Econometric programs that are inaccurate, for any reason, will produce misleading results not only for the individual researcher but, if published, for the profession more generally, and will lead to applications that are impossible to replicate. The development of sophisticated methods of estimation means that we must also be ever-vigilant in ensuring that software meets established standards of accuracy. A seminal contribution to the development of accuracy benchmarks was Longley (1967). As McCullough notes, Longley worked out by hand the solution to a linear regression problem with a constant and six explanatory variables. When run through the computers of the time, he found that the answers were worryingly different. Of course, the Longley benchmark is now passed by the econometric packages that are familiar to applied econometricians. However, the nature of the problems facing the profession is different (sophisticated estimators, large datasets, simulation-based estimators) and McCullough's results imply that there is no reason for complacency. Many econometric estimators involve problems of a nonlinear nature – for example, the GARCH and multivariate GARCH estimators and the probit estimator – and it is in the case where a nonlinear solver is involved that the user will find problems, especially when relying on the default options. Another area that has seen substantial growth in the last two decades has been the use of Monte Carlo experimentation, an area that makes fundamental use of random numbers, and hence any package must have a reliable random number generator (RNG). But are the numbers so generated actually random? The answer is, not necessarily! (The reader may wish to refer to Volume 1 of this Handbook, which includes a chapter by Jurgen Doornik on random number generation.) The importance of maintaining standards of numerical accuracy has been recognized in the National Institute of Standards and Technology's Statistical Reference Datasets, which have resulted in a number of articles using these datasets to evaluate software for econometric problems. To illustrate some of the issues in software evaluation, for example in establishing a benchmark, McCullough includes a study of the accuracy of a number of packages in estimating ARMA models. The central methods for the estimation of such models include unconditional least squares (UCLS), conditional least squares (CLS), and exact maximum likelihood. The questions of interest lie not only in the accuracy of the point estimates from these methods in different packages, but also in what method of standard error calculation is being used. Overall, McCullough concludes that we, as a profession, have some way to go in ensuring that the software that is being used is accurate, that the underlying methods are well documented, and that published results are replicable.
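The flavor of such benchmarking is easy to convey. The sketch below runs the Longley regression in a modern package (statsmodels distributes the dataset); an actual accuracy test would compare the printed coefficients, digit by digit, against the certified values published in NIST's Statistical Reference Datasets, which are deliberately not hard-coded here.

```python
import numpy as np
import statsmodels.api as sm

# The Longley benchmark in a modern package: a notoriously ill-conditioned
# regression of total employment on a constant and six collinear regressors
# (16 observations). NIST's Statistical Reference Datasets publish certified
# coefficients for this problem, against which an accuracy test would
# compare the estimates below.

data = sm.datasets.longley.load_pandas()
X = sm.add_constant(data.exog)          # GNPDEFL, GNP, UNEMP, ARMED, POP, YEAR
results = sm.OLS(data.endog, X).fit()   # endog: total employment (TOTEMP)

print(results.params)
print("design-matrix condition number: %.3e" % np.linalg.cond(np.asarray(X)))
```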
In Chapter 29, Marius Ooms takes a historical perspective on the nature of applied econometrics as it has been represented by publications and reviews of econometric and statistical software in the Journal of Applied Econometrics (JAE). Over the 14-year review period, 1995–2008, there were 513 research articles published in the JAE, of which 253 were categorized as applications in time series, 140 as panel data applications and 105 as cross-section applications. Ooms notes that there has been a gradual shift from macroeconometrics to microeconometrics and applications using panel data. The software review section of the JAE has been a regular feature, so enabling an analysis of the programs that have been in use – and continue to be in use, reflecting the development policy of the providers. This section is likely to be a very useful summary for research and teaching purposes. Ooms also notes the growth of high-level programming languages, such as Gauss, MATLAB, Stata and Ox, and illustrates their use with a simple program. In combination, the profession is now very much better served for econometric software than it was twenty years ago. Of course, these developments have not taken place in isolation but rather as a response to developments in theoretical and applied econometrics. A leading example in this context, noted by Ooms, is the Arellano and Bond (1991) approach to the estimation of applications using panel data (dynamic panel data, or DPD, analysis), which led to the widespread implementation of new code in existing software and many new applications; an example in the area of time series applications is the growth of ARCH and GARCH-based methods and the implementation of estimation routines in econometric software. As noted in Chapter 28 by McCullough, reproducibility of results is a key aspect of the progression and reputation of applied econometrics. Results that are irreproducible by reason of either inaccurate software or unavailability of data will do long-term harm to the profession. In this respect, the JAE, through Hashem Pesaran's initiative, has been a leader in requiring authors to provide the data and code which they used. The JAE archive is indexed and carefully managed, and provides the standard for other journals.
As a final comment, which we hope is evident from the chapters contained in this volume, one cannot help but be struck by the incredible ingenuity of those involved in pushing forward the frontiers of applied econometrics. Had this volume been compiled even, say, just twenty years ago, how different would it have been! Viewed from above, the landscape of applied econometrics has changed markedly. Time series econometrics and macroeconometrics, whilst still important, are not predominant. The availability of large datasets of a microeconomic nature, combined with enormous increases in computing power, has meant that econometrics is now applied to a vast range of areas. What will the next twenty years bring?
Finally, thanks are due to many in enabling this volume to appear. First, our thanks go collectively to the authors who have cooperated in contributing chapters; they have, without exception, responded positively to our several and sometimes many requests, especially in meeting deadlines and accommodating editorial suggestions. We hope that the quality of these chapters will be an evident record of the way the vision of the Handbook has been embraced. We would also like to record our gratitude to the Advisory Editors for this volume: Bill Greene, Philip Hans Franses, Hashem Pesaran and Aman Ullah, whose support was invaluable, especially at an early stage.
Thanks also go to the production team at Palgrave Macmillan, only some of whom can be named individually: Taiba Batool, the commissioning editor, Ray Addicott, the production editor, and Tracey Day, the indefatigable copy-editor. A special mention goes to Lorna Eames, secretary to one of the editors, for her willing and invaluable help at several stages in the project.
References
Arellano, M. and S. Bond (1991) Some tests of specification for panel data: Monte Carlo evidence and an application to employment equations. Review of Economic Studies 58, 177–97.
Atkinson, A.B. (1970) On the measurement of inequality. Journal of Economic Theory 2, 244–63.
Bates, J.M. and C.W.J. Granger (1969) The combination of forecasts. Operations Research Quarterly 20, 451–68.
Beck, T., A. Demirgüç-Kunt and V. Maksimovic (2005) Financial and legal constraints to firm growth: does firm size matter? Journal of Finance 60, 137–77.
Clements, M.P. and D.F. Hendry (1999) Forecasting Non-stationary Economic Time Series. Cambridge, Mass.: MIT Press.
Doornik, J.A. (2007) Autometrics. Working paper, Economics Department, University of Oxford.
Engle, R.F. and C.W.J. Granger (1987) Co-integration and error-correction: representation, estimation and testing. Econometrica 55, 251–76.
Engle, R.F. and J.R. Russell (1998) Autoregressive conditional duration: a new model for irregularly spaced transaction data. Econometrica 66, 1127–62.
Ericsson, N.R. and J.S. Irons (1995) The Lucas critique in practice: theory without measurement. In K.D. Hoover (ed.), Macroeconometrics: Developments, Tensions and Prospects, Ch. 8. Dordrecht: Kluwer Academic Publishers.
Giacomini, R. and H. White (2006) Tests of conditional predictive ability. Econometrica 74, 1545–78.
Granger, C.W.J. and R. Joyeux (1980) An introduction to long memory time series and fractional differencing. Journal of Time Series Analysis 1, 15–29.
Harberger, A. (1987) Comment. In S. Fischer (ed.), Macroeconomics Annual 1987. Cambridge, Mass.: MIT Press.
Hendry, D.F. (1980) Econometrics: alchemy or science? Economica 47, 387–406.
Johansen, S. (1988) Statistical analysis of cointegration vectors. Journal of Economic Dynamics and Control 12, 231–54.
Koop, G., R. Strachan, H. van Dijk and M. Villani (2006) Bayesian approaches to cointegration. In T.C. Mills and K.D. Patterson (eds.), Palgrave Handbook of Econometrics, Volume 1: Econometric Theory, pp. 871–900. Basingstoke: Palgrave Macmillan.
Lancaster, K.J. (1966) A new approach to consumer theory. Journal of Political Economy 74, 132–56.
Leamer, E. (1983) Let's take the con out of econometrics. American Economic Review 73, 31–43.
Longley, J.W. (1967) An appraisal of least-squares programs from the point of view of the user. Journal of the American Statistical Association 62, 819–41.
Lucas, R.E. (1976) Econometric policy evaluation: a critique. In K. Brunner and A. Meltzer (eds.), The Phillips Curve and Labour Markets, Volume 1 of Carnegie-Rochester Conferences on Public Policy, pp. 19–46. Amsterdam: North-Holland.
Magnus, J.R. and M.S. Morgan (eds.) (1999) Methodology and Tacit Knowledge: Two Experiments in Econometrics. Chichester: John Wiley and Sons.
Mikosch, T. (1998) Elementary Stochastic Calculus. London and New Jersey: World Scientific Publishers.
Nelson, C.R. and C.I. Plosser (1982) Trends and random walks in macroeconomic time series. Journal of Monetary Economics 10, 139–62.
Perron, P. (1989) The great crash, the oil price shock and the unit root hypothesis. Econometrica 57, 1361–401.
Poirier, D.J. and J.L. Tobias (2006) Bayesian econometrics. In T.C. Mills and K.D. Patterson (eds.), Palgrave Handbook of Econometrics, Volume 1: Econometric Theory, pp. 841–70. Basingstoke: Palgrave Macmillan.
Rosen, S.M. (1974) Hedonic prices and implicit markets: product differentiation in pure competition. Journal of Political Economy 82, 534–57.
Sarno, L. and M. Taylor (2002) The Economics of Exchange Rates. Cambridge and New York: Cambridge University Press.
Sims, C.A. (1980) Macroeconomics and reality. Econometrica 48, 1–48.
Sims, C.A. (2002) Solving linear rational expectations models. Computational Economics 20, 1–20.
Tobin, J. (1950) A survey of the theory of rationing. Econometrica 26, 24–36.
Part I
The Methodology and Philosophy
of Applied Econometrics
…back-data vitiates analyses of incomplete specifications based on ceteris paribus. Instead, the many steps from the data-generation process (DGP) through the local DGP (LDGP) and general unrestricted model to a specific representation allow an evaluation of the main extant approaches. The potential pitfalls confronting empirical research include inadequate theory, data inaccuracy, hidden dependencies, invalid conditioning, inappropriate functional form, non-identification, parameter non-constancy, dependent, heteroskedastic errors, wrong expectations formation, misestimation and incorrect model selection. Recent automatic methods help resolve many of these difficulties. Suggestions on the teaching of "Applied Econometrics" are followed by revisiting and updating the "experiment in applied econometrics" and by automatic modeling of a four-dimensional vector autoregression (VAR) with 25 lags for the numbers of bankruptcies and patents, industrial output per capita and real equity prices over 1757–1989.
(Lewis Carroll, Through the Looking-Glass [henceforth cited as "Lewis Carroll, 1899"])
Most econometricians feel a bit like Alice did at having to run fast even to stand still. Handbooks are an attempt to alleviate the problem that our discipline moves forward rapidly, and infoglut can overwhelm, albeit that one has to run even faster for a short period to also find time to read and digest their contents. That will require some sprinting here, given that the contents of this Handbook of Econometrics provide up-to-date coverage of a vast range of material: time series, cross-sections, panels, and spatial; methodology and philosophy; estimation – parametric and nonparametric – testing, modeling, forecasting and policy; macro, micro, finance, growth and development; and computing – although I do not see teaching. Such general headings cross-categorize "Applied Econometrics" by types of data and their problems on the one hand – time series, cross-sections, panels, high frequency (see, e.g., Barndorff-Nielsen and Shephard, 2007), limited dependent variables (see, e.g., Heckman, 1976), or count data (excellently surveyed by Cameron and Trivedi, 1998), etc. – and by activities on the other (modeling, theory calibration, theory testing, policy analysis, forecasting, etc.). The editors considered that I had written on sufficiently many of these topics during my career to "overview" the volume, without also noting how markedly all of them had changed over that time. The main aim of an introductory chapter is often to overview the contents of the volume, but that is manifestly impossible for the Handbook of Econometrics given its wide and deep coverage. In any case, since the Handbook is itself an attempt to overview Applied Econometrics, such an introduction would be redundant.

Thus, my focus on empirical econometric modeling concerns only one of the activities, but I will also try to present an interpretation of what "Applied Econometrics" is; what those who apply econometrics may be trying to achieve, and how they are doing so; what the key problems confronting such applications are; and how we might hope to resolve at least some of them. Obviously, each aspect is conditional on the previous one: those aiming to calibrate a theory model on a claimed set of "stylized facts" are aiming for very different objectives from those doing data modeling, so how they do so, and what their problems are, naturally differ greatly. This chapter will neither offer a comprehensive coverage, nor will it be an uncontroversial survey. En route, I will consider why "Applied Econometrics" does not have the highest credibility within economics, and why its results are often attacked, as in Summers (1991) among many others (see Juselius, 1993, for a reply). Evidence from the contents of textbooks revealing the marginal role of "Applied Econometrics" and "economic statistics" within the discipline has been provided recently by Qin (2008) and Atkinson (2008) respectively. Since two aspects of our profession with even lower status than "Applied Econometrics" are data (measurement, collection and preparation), and teaching, I will try and address these as well, as they are clearly crucial to sustaining and advancing a viable "Applied Econometrics" community. Economic forecasting and policy are not addressed explicitly, being uses of empirical models, and because the criteria for building and selecting such models differ considerably from those applicable to "modeling for understanding" (see, e.g., Hendry and Mizon, 2000; and for complete volumes on forecasting, see Clements and Hendry, 2002a, 2005; Elliott, Granger and Timmermann, 2006).
Economists have long been concerned with the status of estimated empirical models. How a model is formulated, estimated, selected and evaluated all affect that status, as do data quality and the relation of the empirical model to the initial subject-matter theory. All aspects have been challenged, with many views still extant. And even how to judge that status is itself debated. But current challenges are different from past ones – partly because some of the latter have been successfully rebutted. All empirical approaches face serious problems, yet the story is one of enormous progress across uncharted terrain, with many mountains climbed – but many more to surmount. I will recount some of that story, describe roughly where we are presently located, and peer dimly into the future. Why "Applied Econometrics Through the Looking-Glass"? Lewis Carroll was the pseudonym for Charles Dodgson, a mathematician who embodied many insights in the book which is cited throughout the present chapter: a Looking-Glass is a mirror, and applied findings in economics can only reflect the underlying reality, so obtaining a robust and reliable reflection should guide its endeavors.

Following the brief section 1.2 on the meaning of the topic, section 1.3 summarizes some of the history of our fallible discipline. Then section 1.4 proposes a