# Structural Macroeconometrics (Second Edition)

David N. DeJong
Chetan Dave
Pages: 428
https://www.jstor.org/stable/j.ctt7srm7

1. Front Matter
(pp. i-vi)
2. Table of Contents
(pp. vii-xii)
3. Preface
(pp. xiii-xiv)
David N. DeJong and Chetan Dave
4. Preface to the First Edition
(pp. xv-xvi)
5. ### Part I Introduction

• Chapter 1 Background and Overview
(pp. 3-8)

The seminal contribution of Kydland and Prescott (1982) marked a sea change in the way macroeconomists conduct empirical research. Under the empirical paradigm that remained predominant at the time, the focus was either on purely statistical (or reduced-form) characterizations of macroeconomic behavior, or on systems-of-equations models that ignored both general-equilibrium considerations and forward-looking behavior on the part of purposeful decision makers. But the powerful criticism of this approach set forth by Lucas (1976), and the methodological contributions of, for example, Sims (1972) and Hansen and Sargent (1980), sparked a transition to a new empirical paradigm. In this transitional stage, the...

• Chapter 2 Casting Models in Canonical Form
(pp. 9-17)

A common set of notations is used throughout the text in presenting models and empirical methodologies. A summary of it follows. The complete set of variables associated with a particular model is represented by the $n \times 1$ vector of stationary variables $z_{t}$, the $i^{\text{th}}$ component of which is given by $z_{it}$. Among other things, stationarity implies that the unconditional expectation of the level of $z_{t}$ does not depend on time, and thus the individual elements $z_{it}$ show no tendency towards long-term growth. For a given model, this may necessitate the normalization of trending variables prior to analysis. Specific examples...

• Chapter 3 DSGE Models: Three Examples
(pp. 18-48)

Chapter 2 introduced the generic representation of a DSGE model as

$\Gamma\left(E_{t}z_{t+1},\; z_{t},\; v_{t+1}\right) = 0,$

and provided background for mapping the model into a log-linear representation of the form

$Ax_{t+1} = Bx_{t} + Cv_{t+1} + D\eta_{t+1}.$

This chapter demonstrates the representation and completion of this mapping for three prototypical model environments that serve as examples throughout the remainder of the text. This sets the stage for chapter 4, which details the derivation of linear model solutions of the form

$x_{t+1} = Fx_{t} + Gv_{t+1},$

and chapter 5, which details the derivation of nonlinear solutions of the form

$c_{t} = c\left(s_{t}\right),$

$s_{t} = f\left(s_{t-1},\; v_{t}\right).$

For guidance regarding the completion of log-linear mappings for a far broader range...
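For intuition about how the log-linear system maps into its solution, consider the special case in which $A$ is invertible and expectational errors are absent ($D = 0$): premultiplying by $A^{-1}$ delivers the solution matrices directly. A minimal sketch, with illustrative matrices that are assumptions of this example rather than drawn from the text:

```python
import numpy as np

# Illustrative (assumed) system matrices for A x_{t+1} = B x_t + C v_{t+1}.
A = np.array([[1.0, 0.0],
              [0.2, 1.0]])
B = np.array([[0.9, 0.1],
              [0.0, 0.5]])
C = np.array([[1.0],
              [0.0]])

# With A invertible and no expectational errors (D = 0), premultiplying by
# A^{-1} yields x_{t+1} = F x_t + G v_{t+1} with F = A^{-1}B and G = A^{-1}C.
F = np.linalg.solve(A, B)
G = np.linalg.solve(A, C)
```

In the general case treated in chapter 4, $A$ need not be invertible and expectational errors must be resolved, which is what the specialized solution methods there handle.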

6. ### Part II Model Solution Techniques

• Chapter 4 Linear Solution Techniques
(pp. 51-68)

Chapters 2 and 3 demonstrated the mapping of DSGE models from nonlinear first-order systems of expectational difference equations of the form

$\Gamma\left(E_{t}z_{t+1},\; z_{t},\; v_{t+1}\right) = 0 \qquad (4.1)$

into linear or log-linear representations of the form

$Ax_{t+1} = Bx_{t} + Cv_{t+1} + D\eta_{t+1}. \qquad (4.2)$

The purpose of this chapter is to characterize the derivation of solutions of (4.2), expressed as

$x_{t+1} = F\left(\mu\right)x_{t} + G\left(\mu\right)v_{t+1}. \qquad (4.3)$

Absent stochastic innovations, linear systems in general admit three types of dynamic behavior. Systems are said to be asymptotically stable if, for any given initial value of the state variables $s_{0}$, the system will converge to $\overline{x}$ given any choice of control variables $c_{0}$ satisfying feasibility constraints implicit in (4.2). Systems...
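The stability notion just described can be checked numerically: absent innovations, $x_{t+1} = Fx_{t}$ converges from any initial condition exactly when every eigenvalue of $F$ lies strictly inside the unit circle. A minimal sketch, with illustrative matrices:

```python
import numpy as np

def is_asymptotically_stable(F):
    """A linear system x_{t+1} = F x_t converges to its steady state from
    any initial condition iff all eigenvalues of F have modulus below 1."""
    return bool(np.all(np.abs(np.linalg.eigvals(F)) < 1.0))

# Both matrices are triangular, so their eigenvalues sit on the diagonal.
F_stable = np.array([[0.9, 0.1],
                     [0.0, 0.5]])    # eigenvalues 0.9, 0.5 -> stable
F_unstable = np.array([[1.05, 0.0],
                       [0.0, 0.5]])  # eigenvalue 1.05 -> explosive
```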

• Chapter 5 Nonlinear Solution Techniques
(pp. 69-110)

There are two generic reasons for moving beyond linear or log-linear model representations and solutions. First, for the purpose of the task at hand, the enhanced accuracy associated with nonlinear model representations may be important. For example, Fernández-Villaverde and Rubio-Ramírez (2005) have emphasized the importance of preserving model nonlinearities in conducting likelihood-based analyses. Second, it is often the case that departures from linearity are of direct interest in light of the question under investigation. This is the case, for example, in examining the role of precautionary motives in driving consumption/savings decisions (Huggett and Ospina, 2001); or in measuring the relative...

7. ### Part III Data Preparation and Representation

• Chapter 6 Removing Trends and Isolating Cycles
(pp. 113-137)

Just as DSGE models must be primed for empirical analysis, so too must the corresponding data. Broadly speaking, data preparation involves three steps. A guiding principle behind all three involves the symmetric treatment of the actual data and their theoretical counterparts. First, correspondence must be established between what is being characterized by the model and what is being measured in the data. For example, if the focus is on a business cycle model that does not include a government sector, it would not be appropriate to align the model’s characterization of output with the measure of aggregate GDP reported in...
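One widely used tool for removing trends and isolating cycles is the Hodrick-Prescott filter, which admits a closed-form solution. The sketch below implements that formula directly; the smoothing value 1600 (standard for quarterly data) and the linear-trend example are assumptions of this illustration, not the text's own code:

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott decomposition y = trend + cycle.

    The trend minimizes sum((y - tau)^2) + lam * sum((Delta^2 tau)^2);
    the first-order condition gives tau = (I + lam * K'K)^{-1} y, where
    K is the (T-2) x T second-difference matrix."""
    T = len(y)
    K = np.zeros((T - 2, T))
    for i in range(T - 2):
        K[i, i], K[i, i + 1], K[i, i + 2] = 1.0, -2.0, 1.0
    trend = np.linalg.solve(np.eye(T) + lam * (K.T @ K), y)
    return trend, y - trend

# A pure linear trend has zero second differences, so the filter returns
# it intact and the extracted cycle is numerically zero.
trend, cycle = hp_filter(np.arange(40.0))
```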

• Chapter 7 Summarizing Time Series Behavior When All Variables Are Observable
(pp. 138-165)

Empirical implementations of DSGE models often entail the comparison of theoretical predictions regarding collections of summary statistics with empirical counterparts. Such comparisons are broadly characterized as limited-information analyses, and encompass as special cases calibration exercises (the subject of chapter 11) and moment-matching exercises (the subject of chapter 12).

Here we present two workhorse reduced-form models that provide flexible characterizations of the time-series behavior of the $m \times 1$ collection of observable variables $X_{t}$. We then present a collection of summary statistics that frequently serve as targets for estimating the parameters of structural models, and as benchmarks for judging their empirical performance....
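As a minimal illustration of a reduced-form characterization, the sketch below fits an AR(1), the simplest such workhorse, by OLS to simulated data and recovers the implied unconditional variance; all parameter values are assumptions of this example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1): x_t = rho * x_{t-1} + e_t, e_t ~ N(0, 1) (illustrative).
rho_true, T = 0.8, 5000
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho_true * x[t - 1] + rng.standard_normal()

# OLS of x_t on x_{t-1} estimates rho; a summary statistic such as the
# unconditional variance sigma^2 / (1 - rho^2) then follows (sigma = 1 here).
X, Y = x[:-1], x[1:]
rho_hat = (X @ Y) / (X @ X)
var_implied = 1.0 / (1.0 - rho_hat**2)
```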

• Chapter 8 State-Space Representations
(pp. 166-190)

State-space representations are highly flexible models that characterize the dynamic interaction of a collection of time-series variables, a subset of which is unobservable. The observable series may either be observed directly, or subject to measurement error. The flexibility of these models has led to their use in an extensive range of applications in multiple disciplines. For textbook characterizations of the application of state-space representations in disciplines including engineering, environmental biology, meteorology, and physics; and in problems including the detection of solar systems, the design of control systems, and the tracking of non-maneuvering, maneuvering, and stealthy targets, for example, see Doucet,...

8. ### Part IV Monte Carlo Methods

• Chapter 9 Monte Carlo Integration: The Basics
(pp. 193-220)

In pursuing the empirical implementation of DSGE models, the need to calculate integrals arises in four distinct contexts, two of which we have already encountered. First, recall from chapter 5 that in obtaining nonlinear approximations of policy functions mapping state to control variables, the approximations we seek using projection methods are derived as the solution of nonlinear expectational difference equations. Generically, with a DSGE model represented as

$\Gamma\left(E_{t}z_{t+1},\; z_{t},\; v_{t+1}\right) = 0, \qquad (9.1)$

the solution we seek is of the form

$c_{t} = c\left(s_{t}\right), \qquad (9.2)$

$s_{t} = f\left(s_{t-1},\; v_{t}\right); \qquad (9.3)$

the policy function $c\left(s_{t}\right)$ is derived as the solution to

$F\left(c\left(s\right)\right) = 0, \qquad (9.4)$

where $F\left(\cdot\right)$ is an operator defined over function spaces. With...
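The basic idea behind Monte Carlo integration fits in a few lines: an expectation is approximated by the sample average of the integrand over random draws. A minimal sketch, where the integrand and distribution are illustrative choices rather than the text's:

```python
import numpy as np

rng = np.random.default_rng(42)

# Approximate E[g(v)] for v ~ N(0, 1) with g(v) = v^2 by a sample average.
# The exact value of this expectation is 1 (the variance of v); the Monte
# Carlo error shrinks at rate 1/sqrt(N).
N = 100_000
draws = rng.standard_normal(N)
estimate = np.mean(draws**2)
```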

• Chapter 10 Likelihood Evaluation and Filtering in State-Space Representations Using Sequential Monte Carlo Methods
(pp. 221-250)

Chapter 8 demonstrated the characterization of DSGE models as state-space representations. This characterization facilitates pursuit of the dual objectives of likelihood evaluation and filtering (i.e., inferring the time-series behavior of the model’s unobservable state variables, conditional on the observables). Both objectives entail the calculation of integrals over the unobservable state variables.

As we have seen, when models are linear and stochastic processes are Normally distributed, required integrals can be calculated analytically via the Kalman filter. Departures entail integrals that must be approximated numerically. Here we characterize methods for approximating such integrals; the methods fall under the general heading of sequential...
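As a concrete instance of the analytically tractable case, the sketch below evaluates the likelihood of a scalar linear-Normal state-space model via the Kalman filter recursion; the model and its parameterization are assumptions of this illustration:

```python
import numpy as np

def kalman_loglik(y, F, Q, H, R, s0, P0):
    """Log-likelihood of observations y under the scalar state-space model
       s_t = F s_{t-1} + w_t,  w_t ~ N(0, Q)   (state transition)
       y_t = H s_t + u_t,      u_t ~ N(0, R)   (measurement)
    computed by the Kalman filter's predict/update recursion."""
    s, P, ll = s0, P0, 0.0
    for yt in y:
        # Predict the state and its variance one step ahead.
        s_pred = F * s
        P_pred = F * P * F + Q
        # Innovation (forecast error) and its variance.
        v = yt - H * s_pred
        S = H * P_pred * H + R
        ll += -0.5 * (np.log(2 * np.pi * S) + v * v / S)
        # Update using the Kalman gain.
        K = P_pred * H / S
        s = s_pred + K * v
        P = (1.0 - K * H) * P_pred
    return ll
```

When the model is nonlinear or the shocks non-Normal, these integrals lose their closed form, which is exactly where the sequential Monte Carlo methods of this chapter take over.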

9. ### Part V Empirical Methods

• Chapter 11 Calibration
(pp. 253-284)

With their seminal analysis of business cycles, Kydland and Prescott (1982) capped a paradigm shift in the conduct of empirical work in macroeconomics. They did so using a methodology that enabled them to cast the DSGE model they analyzed as the centerpiece of their empirical analysis. The analysis contributed towards the Nobel Prize in Economics they received in 2004, and the methodology has come to be known as a calibration exercise.¹ Calibration not only remains a popular tool for analyzing DSGEs, but has also served as the building block for subsequent methodologies developed towards this end. Thus it provides a...

• Chapter 12 Matching Moments
(pp. 285-313)

In the previous chapter, we characterized calibration as an exercise under which a set of empirical targets is used to pin down the parameters of the model under investigation, and a second set of targets is used to judge the model’s empirical performance. Here we present a collection of procedures that establish a statistical foundation upon which structural models can be parameterized and evaluated. Under these procedures, parameterization is accomplished via estimation, and empirical performance is assessed via hypothesis testing.

As with calibration, the focus of these procedures remains on a set of empirical targets chosen by the researcher. Thus...
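A minimal moment-matching sketch: for an AR(1), the model-implied first-order autocorrelation equals the autoregressive parameter, so estimation reduces to matching that single moment to its empirical counterpart. The setup and numbers below are assumptions of this illustration, not the text's procedures:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data from an AR(1) with rho = 0.6 (illustrative).
T, rho_true = 4000, 0.6
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho_true * x[t - 1] + rng.standard_normal()

# Target moment: the first-order autocorrelation. The model-implied value
# for an AR(1) is simply rho, so the estimator picks the grid point whose
# implied moment is closest to the empirical moment.
data_moment = np.corrcoef(x[:-1], x[1:])[0, 1]
grid = np.linspace(-0.99, 0.99, 199)
rho_hat = grid[np.argmin((grid - data_moment) ** 2)]
```

In realistic applications the model moment has no closed form and must itself be computed from the structural model, with a weighting matrix governing how multiple moments are traded off.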

• Chapter 13 Maximum Likelihood
(pp. 314-350)

Chapters 11 and 12 presented limited-information methods for evaluating DSGE models. This chapter and the next present full-information alternatives: the classical (chapter 13) and Bayesian (chapter 14) approaches to likelihood analysis. Under these methods, the DSGE model under investigation is viewed as providing a complete statistical characterization of the data, in the form of a likelihood function. As described in chapters 8 and 10, the source of the likelihood function is the corresponding state-space representation of the observable data, coupled with a distributional assumption for the stochastic innovations included in the model.

Two features of full-information analyses distinguish them from...

• Chapter 14 Bayesian Methods
(pp. 351-386)

Chapter 13 described the pursuit of empirical objectives using classical estimation methods. This chapter characterizes and demonstrates an alternative approach to this pursuit: the use of Bayesian methods.

A distinct advantage in using structural models to conduct empirical research is that a priori guidance concerning their parameterization is often much more readily available than is the case in working with reduced-form specifications. The adoption of a Bayesian statistical perspective in this context is therefore particularly attractive, because it facilitates the formal incorporation of prior information in a straightforward manner.

The reason is that from a Bayesian perspective, parameters are interpreted...
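A minimal illustration of this perspective is conjugate updating for a Normal mean: the posterior combines the prior and the likelihood by precision weighting. The prior and data-generating values below are assumptions of this example:

```python
import numpy as np

rng = np.random.default_rng(7)

# Prior: theta ~ N(mu0, tau0^2). Data: y_i ~ N(theta, sigma^2), sigma known.
mu0, tau0 = 0.0, 1.0
sigma, theta_true = 1.0, 0.5
y = theta_true + sigma * rng.standard_normal(100)

# Posterior precision is the sum of prior and data precisions; the
# posterior mean is the precision-weighted average of prior and data
# information -- prior beliefs enter the analysis formally.
prec_post = 1.0 / tau0**2 + len(y) / sigma**2
mu_post = (mu0 / tau0**2 + y.sum() / sigma**2) / prec_post
sd_post = prec_post ** -0.5
```

For DSGE models the posterior is not available in closed form like this, which is why the chapter turns to posterior simulators.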

10. References
(pp. 387-400)
11. Index
(pp. 401-418)