# Anticipating Correlations: A New Paradigm for Risk Management

Robert Engle
Pages: 176
https://www.jstor.org/stable/j.ctt7sb6w

1. Front Matter
(pp. i-iv)
2. Table of Contents
(pp. v-vi)
3. Introduction
(pp. vii-x)

The Econometric Institute Lecture Series deals with topics in econometrics that have important policy implications. The lectures cover a wide range of topics and are not confined to any one area or subdiscipline. Leading international scientists in the fields of econometrics in which applications play a major role are invited to give three-day lectures on a topic to which they have contributed significantly.

The topic of Robert Engle’s lectures deals with the dynamics of correlations between a large number of financial assets. In the present global financial world it is imperative both for asset management and for risk analysis to...

4. 1 Correlation Economics
(pp. 1-14)

Today there are almost three thousand stocks listed on the New York Stock Exchange. NASDAQ lists another three thousand. There is yet another collection of stocks that are unlisted and traded on the Bulletin Board or Pink Sheets. These U.S.-traded stocks are joined by thousands of companies listed on foreign stock exchanges to make up a universe of publicly traded equities. Added to these are the enormous number of government, corporate, and municipal bonds that are traded in the United States and around the world, as well as many short-term securities. Investors are now exploring a growing number of...

5. 2 Correlations in Theory
(pp. 15-28)

Correlations measure the linear relationship between two random variables. In this chapter we will discuss a variety of ways to measure correlations in particular settings and will then discuss more general measures of dependence. The standard definition, introduced by Pearson, emphasizes this linearity. If $x$ and $y$ are random variables, the correlation between them is simply

$\rho_{x,y} = \dfrac{E\left[(x - E(x))\,(y - E(y))\right]}{\sqrt{E(x - E(x))^{2}\,E(y - E(y))^{2}}}$. (2.1)

Any such correlation must lie between $-1$ and $1$. This measure of correlation is invariant to univariate linear transformations of the two variables. In particular, the correlation between $x^{*} = \alpha + \beta x$ and $y^{*} = \gamma + \delta y$ will be the same as between $y$ and $x$ (provided $\beta$ and $\delta$ are positive). The correlation...
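The sample analogue of equation (2.1), and its invariance under positive linear transformations, can be checked directly. The following is a minimal NumPy sketch (the function name and the simulated data are illustrative, not from the book):

```python
import numpy as np

def pearson_corr(x, y):
    """Sample analogue of equation (2.1): centered cross-moment divided
    by the square root of the product of centered second moments."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).mean() / np.sqrt((xc**2).mean() * (yc**2).mean())

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
y = 0.6 * x + 0.8 * rng.standard_normal(1000)

rho = pearson_corr(x, y)
# Invariance: x* = alpha + beta*x, y* = gamma + delta*y with beta, delta > 0
rho_star = pearson_corr(2.0 + 3.0 * x, -1.0 + 0.5 * y)
```

The invariance holds because centering removes the shifts $\alpha$ and $\gamma$, while the positive scales $\beta$ and $\delta$ cancel between numerator and denominator.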

6. 3 Models for Correlation
(pp. 29-42)

Since Engle (1982) introduced the idea that volatility could be modeled and forecast with univariate econometric time series methods, an enormous literature has developed that explores these methods to model multivariate covariance matrices. Probably the first attempts to estimate a multivariate GARCH model were by Engle et al. (1984) and by Bollerslev et al. (1988). The literature has been surveyed by Bollerslev et al. (1994) and by Engle and Mezrich (1996), and more recently by Bauwens et al. (2006) and by Silvennoinen and Teräsvirta (2008). However, even before these methods were introduced, the practitioner community had a range of models...

7. 4 Dynamic Conditional Correlation
(pp. 43-58)

There are three general steps in the specification and estimation of a DCC model. First, the volatilities must be estimated to construct standardized residuals or volatility-adjusted returns. This is often called “DEGARCHING” the data. Then the quasi-correlations must be estimated in a dynamic fashion based on these standardized residuals. Finally, the estimated quasi-correlation matrix must be rescaled to ensure that it is a proper correlation matrix. These three steps will be considered in turn in this chapter. The estimation strategy will then be considered as a whole.
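The three steps above can be sketched in a few lines. This is an illustrative implementation of a standard mean-reverting DCC filter with correlation targeting; the GARCH and DCC parameter values are placeholders (in the procedure described here they would be estimated by quasi-maximum likelihood), and the function name is mine:

```python
import numpy as np

def dcc_filter(r, a=0.05, b=0.93, alpha=0.05, beta=0.90):
    """Sketch of the three DCC steps for a T x n return matrix r:
    1. de-GARCH each series to get standardized residuals e_t,
    2. update the quasi-correlation matrix Q_t dynamically,
    3. rescale Q_t so each R_t is a proper correlation matrix."""
    T, n = r.shape
    v = r.var(axis=0)                     # unconditional variances (targeting)
    h = v.copy()
    e = np.empty_like(r)
    for t in range(T):                    # step 1: GARCH(1,1) de-garching
        e[t] = r[t] / np.sqrt(h)          # h uses information through t-1
        h = (1 - alpha - beta) * v + alpha * r[t]**2 + beta * h
    Qbar = np.corrcoef(e.T)               # correlation targeting
    Q = Qbar.copy()
    R = np.empty((T, n, n))
    for t in range(T):
        d = 1.0 / np.sqrt(np.diag(Q))     # step 3: rescale to unit diagonal
        R[t] = Q * np.outer(d, d)         # R_t = D_t^{-1/2} Q_t D_t^{-1/2}
        # step 2: quasi-correlation update using last period's residuals
        Q = (1 - a - b) * Qbar + a * np.outer(e[t], e[t]) + b * Q
    return R

rng = np.random.default_rng(1)
r = rng.standard_normal((500, 3))         # illustrative data
R = dcc_filter(r)
```

The rescaling in the last step guarantees unit diagonals, and with $a, b \geq 0$ and $a + b < 1$ each $Q_t$ (and hence $R_t$) remains positive semidefinite whenever $\bar{Q}$ is.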

The DCC model introduced in the previous chapter formulated the covariance matrix of...

8. 5 DCC Performance
(pp. 59-73)

Our first task is to discuss a Monte Carlo experiment reported in Engle (2002a). In this case the true correlation structure is known. Several estimators can compete to approximate it. The data generating processes are deterministic time series processes that can be approximated by the conditional covariance estimators. The simplest assumptions are that the correlations are constant or that they change only once during the sample period. More complex assumptions allow the correlations to follow sine waves or ramps so that they range from zero to one. These are mean-reverting specifications but they differ in the best approach to adapting...

9. 6 The MacGyver Method
(pp. 74-79)

The problem of estimating correlation matrices for large systems might appear to have been solved in previous chapters. However, there are three reasons to believe that we do not yet have a full solution. First, evaluating the log likelihood function requires inverting the correlation matrices $R_t$, which are full $n \times n$ matrices, for each observation. To maximize the likelihood function, it is necessary to evaluate the log likelihood at many parameter values and consequently to invert a great many $n \times n$ matrices. Convergence is not guaranteed; it sometimes fails or is sensitive to starting values. These numerical problems can surely be...
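The cost being described is visible in the correlation part of the quasi-log-likelihood, where every observation contributes a log determinant and a quadratic form in $R_t^{-1}$, each an $O(n^3)$ operation. A sketch (the function name and test setup are mine, not the book's):

```python
import numpy as np

def correlation_loglik(e, R):
    """Correlation part of the DCC quasi-log-likelihood (sketch):
    -1/2 * sum_t [ log|R_t| + e_t' R_t^{-1} e_t - e_t' e_t ],
    where e is T x n standardized residuals and R is T x n x n.
    Every observation requires the determinant and (effectively) the
    inverse of a full n x n matrix -- the repeated O(n^3) cost that
    motivates the MacGyver approach."""
    ll = 0.0
    for et, Rt in zip(e, R):
        sign, logdet = np.linalg.slogdet(Rt)
        ll -= 0.5 * (logdet + et @ np.linalg.solve(Rt, et) - et @ et)
    return ll

rng = np.random.default_rng(2)
e = rng.standard_normal((50, 4))
R_id = np.broadcast_to(np.eye(4), (50, 4, 4))
ll0 = correlation_loglik(e, R_id)   # every term vanishes when R_t = I
```

Because this whole sum must be recomputed at each trial parameter vector inside the optimizer, the per-observation inversions multiply quickly as $n$ grows.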

10. 7 Generalized DCC Models
(pp. 80-87)

The great advantage of DCC models is the parsimony of parameterization. For the simple mean-reverting DCC model with correlation targeting, there are only two unknown parameters in the correlation process, no matter how many variables are being modeled. The integrated DCC has only one, and the asymmetric DCC has three. This parsimony is in direct contrast to the general versions of multivariate GARCH, where the number of parameters tends to rise with the square or cube of the number of assets. It has long been believed that it would not be possible to estimate large covariance matrices with multivariate GARCH...

11. 8 FACTOR DCC
(pp. 88-102)

To model large numbers of asset returns, the profession has always turned to factor models. This is a natural development as the investigator seeks to discover the small number of important factors that influence a large number of returns. The Capital Asset Pricing Model (CAPM) of Sharpe (1964) is a one-factor model of all asset returns. It is the primary model behind most financial decision making because it delivers a powerful theory on the pricing and hedging of financial assets. It was then extended to the Arbitrage Pricing Theory (APT) by Ross (1976). These path-breaking frameworks have been the workhorses...

12. 9 Anticipating Correlations
(pp. 103-121)

The goal of this book is to develop methods to anticipate correlations. Thus far, the book has developed descriptions of correlations and why they change as well as models that can track the changes. The task of anticipating correlations seems far more formidable. However, every model that has been presented predicts correlations one period in the future. This short-horizon prediction will be sufficient for many financial applications. The measurement of Value-at-Risk and high-frequency hedging demands may rely on this forecast horizon. The most important feature of short-horizon forecasts is the updating. As new information becomes available, the models will systematically...

13. 10 Credit Risk and Correlations
(pp. 122-129)

One of the most interesting developments in the financial marketplace over the last decade has been the growth in the volume and diversity of credit derivatives. These are contracts that provide insurance against defaults and therefore allow investors to manage their credit risk. The insurance can be purchased on either a single name or on baskets of securities. These allow investors to buy and sell risks associated with defaults either on a particular entity or on a portfolio. As financial markets have globalized, credit derivatives can now be found in portfolios all over the world. The ability to share risks...

14. 11 Econometric Analysis of the DCC Model
(pp. 130-136)

In this chapter we turn to more rigorous econometric topics. The asymptotic properties of estimates of the DCC model are developed. The novel pieces are the analysis of correlation targeting and some alternatives, and the asymptotic distribution of DCC incorporating the two-step methodology and correlation targeting. Much of this material has previously been reported in Engle and Sheppard (2005b) and details of proofs will not be repeated here.

Most multivariate covariance models have a matrix of intercepts that must be estimated and which contains $\frac{1}{2}n(n - 1)$ unknown parameters. For example, consider the simplest scalar multivariate GARCH model given by

$H_t = \Omega + \alpha\,y_{t-1}y_{t-1}' + \beta H_{t-1}$. (11.1)...
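Equation (11.1) can be iterated directly once the intercept is pinned down. One common device is covariance targeting, setting $\Omega = (1 - \alpha - \beta)\bar{S}$ with $\bar{S}$ the sample covariance matrix, so the intercept's unknown parameters are supplied by a simple moment estimator. A sketch with illustrative parameter values (not estimates from the book):

```python
import numpy as np

def scalar_mgarch(y, alpha=0.05, beta=0.90):
    """Iterate the scalar multivariate GARCH recursion (11.1),
    H_t = Omega + alpha * y_{t-1} y_{t-1}' + beta * H_{t-1},
    with the intercept set by covariance targeting,
    Omega = (1 - alpha - beta) * Sbar."""
    T, n = y.shape
    Sbar = np.cov(y.T, bias=True)          # sample covariance matrix
    Omega = (1 - alpha - beta) * Sbar
    H = np.empty((T, n, n))
    H[0] = Sbar                            # initialize at the long-run level
    for t in range(1, T):
        H[t] = Omega + alpha * np.outer(y[t-1], y[t-1]) + beta * H[t-1]
    return H

rng = np.random.default_rng(3)
y = rng.standard_normal((300, 2))          # illustrative data
H = scalar_mgarch(y)
```

With $\alpha, \beta \geq 0$ and $\alpha + \beta < 1$, each $H_t$ is a positive-weight sum of positive semidefinite matrices, so the recursion preserves positive definiteness whenever $\bar{S}$ is positive definite.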

15. 12 Conclusions
(pp. 137-140)

Financial decision making and risk analysis require accurate and timely estimates of correlations today and of how they are likely to evolve in the future. New methods for analyzing correlations have been developed in this book and compared with existing methods. The DCC model and its extensions, such as the FACTOR DCC model, are promising tools. These models are simple and powerful and have been shown to provide good assessments of risk and temporal stability in the face of dramatic shifts in the investment environment.

We began by looking at the typical pattern of correlations across asset classes and countries. The...

16. References
(pp. 141-150)
17. Index
(pp. 151-154)