Existing strategies for econometric analysis related to macroeconomics are subject to a number of serious objections, some recently formulated, some old. These objections are summarized in this paper, and it is argued that, taken together, they make it unlikely that macroeconomic models are in fact overidentified, as the existing statistical theory usually assumes. The implications of this conclusion are explored, and an example of econometric work in a non-standard style, taking account of the objections to the standard style, is presented.

The study of the business cycle, fluctuations in aggregate measures of economic activity and prices over periods from one to ten years or so, constitutes or motivates a large part of what we call macroeconomics. Most economists would agree that there are many macroeconomic variables whose cyclical fluctuations are of interest, and would agree further that fluctuations in these series are interrelated. It would seem to follow almost tautologically that statistical models involving large numbers of macroeconomic variables ought to be the arena within which macroeconomic theories confront reality and thereby each other.

Instead, though large-scale statistical macroeconomic models exist and are by some criteria successful, a deep vein of skepticism about the value of these models runs through that part of the economics profession not actively engaged in constructing or using them. It is still rare for empirical research in macroeconomics to be planned and executed within the framework of one of the large models. In this lecture I intend to discuss some aspects of this situation, attempting both to offer some explanations and to suggest some means for improvement.

I will argue that the style in which their builders construct claims for a connection between these models and reality, the style in which "identification" is achieved for these models, is inappropriate, to the point at which claims for identification in these models cannot be taken seriously. This is a venerable assertion, and there are some good old reasons for believing it, but there are also some reasons which have been more recently put forth. After developing the conclusion that the identification claimed for existing large-scale models is incredible, I will discuss what ought to be done in consequence. The line of argument is: large-scale models do perform useful forecasting and policy-analysis functions despite their incredible identification; the restrictions imposed in the usual style of identification are neither essential to constructing a model which can perform these functions nor innocuous; and an alternative style of identification is available and practical.

Finally, we will look at some empirical work based on an alternative style of macroeconometrics. A six-variable dynamic system is estimated without using …

¹ Research for this paper was supported by NSF Grant SOC-76-02482. Lars Hansen executed the computations. The paper has benefited from comments by many people, especially Thomas J. Sargent and Finn Kydland.
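As a concrete illustration of the "alternative style" the lecture points toward, here is a minimal sketch of fitting an unrestricted vector autoregression, assuming the statsmodels library; the six series names and the simulated data are hypothetical stand-ins, not the paper's dataset or code (the paper's computations were done separately):

```python
# Minimal sketch (not the paper's computations): an unrestricted VAR,
# estimated with no a-priori exclusion restrictions on any equation.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical stand-in data: six macroeconomic series (random walks here).
rng = np.random.default_rng(0)
data = pd.DataFrame(
    rng.standard_normal((200, 6)).cumsum(axis=0),
    columns=["money", "gnp", "unemp", "wage", "price", "import_price"],
)

results = VAR(data).fit(maxlags=4)   # every variable enters every equation
irf = results.irf(12)                # impulse responses, 12 periods out
fevd = results.fevd(12)              # forecast-error variance decomposition
print(results.summary())
```

The point of the exercise is the one the abstract makes: forecasting and dynamic analysis can proceed without imposing the identifying restrictions of the standard style.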
Abstract. A constraint that actions can depend on observations only through a communication channel with finite Shannon capacity is shown to be able to play a role very similar to that of a signal extraction problem or an adjustment cost in standard control problems. The resulting theory looks enough like familiar dynamic rational expectations theories to suggest that it might be useful and practical, while the implications for policy are different enough to be interesting.
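In the notation standard in the rational-inattention literature (the capacity bound κ and the objective U below are conventional symbols, not taken from this abstract), the constraint can be stated as a bound on the mutual information between the state and the action:

```latex
% Rational-inattention control problem, stated schematically: choose the
% conditional distribution of the action y given the state x to maximize
% expected utility, subject to an information-flow (Shannon capacity) bound.
\max_{p(y \mid x)} \; \mathbb{E}\left[ U(x, y) \right]
\quad \text{s.t.} \quad
I(x; y) = H(x) - H(x \mid y) \le \kappa
```

This is what makes the constraint behave like a signal extraction problem or an adjustment cost: the action can track the state only as closely as the capacity κ allows.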
INFERENCE IN LINEAR TIME SERIES MODELS WITH SOME UNIT ROOTS
By Christopher A. Sims, James H. Stock, and Mark W. Watson

This paper considers estimation and hypothesis testing in linear time series models when some or all of the variables have unit roots. Our motivating example is a vector autoregression with some unit roots in the companion matrix, which might include polynomials in time as regressors. In the general formulation, the variables might be integrated or cointegrated of arbitrary orders, and might have drifts as well. We show that parameters that can be written as coefficients on mean-zero, nonintegrated regressors have jointly normal asymptotic distributions, converging at the rate T^{1/2}. In general, the other coefficients (including the coefficients on polynomials in time) will have nonnormal asymptotic distributions. The results provide a formal characterization of which t or F tests, such as Granger causality tests, will be asymptotically valid, and which will have nonstandard limiting distributions.
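Restated in symbols (the notation β₁, V₁ is ours, not the paper's), the central result is, roughly:

```latex
% If, after a suitable linear rearrangement of the model, the subvector
% \beta_1 appears as coefficients on mean-zero, nonintegrated (stationary)
% regressors, then
T^{1/2}\left(\hat{\beta}_1 - \beta_1\right) \xrightarrow{\;d\;} N(0, V_1)
% jointly, so t and F statistics on \beta_1 (e.g., Granger causality tests
% involving only such coefficients) retain their usual asymptotic
% distributions; other coefficients, including those on polynomials in
% time, generally have nonnormal limits.
```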
A multivariate regime-switching model for monetary policy is confronted with U.S. data. The best fit allows time variation in disturbance variances only. When coefficients are allowed to change, the best fit restricts the changes to the monetary policy rule, and there are three estimated regimes corresponding roughly to periods when most observers believe that monetary policy actually differed. But the differences among regimes are not large enough to account for the rise, and then decline, in inflation of the 1970s and 1980s. Our estimates imply that monetary targeting was central in the early 1980s, but also important sporadically in the 1970s.
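As an illustration of the mechanics only (the paper estimates a multivariate structural model, not this), here is a minimal univariate sketch of a three-regime Markov-switching fit in which, as in the best-fitting specification described above, only the disturbance variance changes across regimes; the data are simulated stand-ins:

```python
# Univariate Markov-switching sketch: 3 regimes, constant mean,
# regime switching in the disturbance variance only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
# Hypothetical series with three volatility regimes.
y = np.concatenate([rng.normal(0.0, s, 100) for s in (0.5, 2.0, 1.0)])

mod = sm.tsa.MarkovRegression(
    y, k_regimes=3, trend="c",
    switching_trend=False,      # coefficients fixed across regimes
    switching_variance=True,    # variances differ across regimes
)
res = mod.fit()
print(res.summary())
print(res.smoothed_marginal_probabilities)  # regime probabilities over time
```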