Macroeconomic analysis and empirical modelling have faced serious challenges since their failure to predict the Great Recession. A key difficulty is that current theories are now seen to be poor guides to how economies actually function: economies can experience sudden large shifts, like the Financial Crisis, yet standard macroeconomic theories do not allow for such major changes.
When poor theory is fitted to data that are not fully accurate, the resulting empirical models are unreliable. They can lead to economic policies that fail to have their expected effects, and to forecasts that go awry.
Indeed, the repeated inaccuracy of economic forecasts has led to that activity becoming the butt of many jokes, as well as questions by Queen Elizabeth II as to why no-one foresaw the crisis.
The accompanying figure shows what happens when an unexpected shift occurs.
The usual assumption in macroeconomic analyses is that today's expectation of tomorrow's outcome will be accurate, as indeed it would be if shifts did not happen. But shifts do happen, and when they do, today's expectation for tomorrow is sadly adrift, as shown. Any intelligent economic actor should revise their views and plans when faced with such a shift, yet most existing empirical macroeconomic models simply plough on as if the change had not occurred and, worse still, assume that economic actors also fail to adapt. The result is systematic mis-forecasting, and outright forecast failure when a sequence of large deviations between forecasts and outcomes occurs.
In my latest paper for the International Journal of Forecasting, building on several decades of research findings with many international collaborators, I propose a way to improve empirical macroeconomic modelling in the face of unanticipated change: replacing the current dominant paradigm, in which just one of the four main criteria for deciding between alternative models is used (theory analysis, empirical evidence, policy relevance, or forecast accuracy), with a new combined approach.
This new method commences with a much larger formulation that allows for shifts as well as additional explanatory variables that might matter when the macroeconomic theory provides an incomplete picture of the economy. While unanticipated shifts are by definition not foreseen, it is crucial to model them once they have happened, which this approach can do. At the same time, more adaptive behaviour of the economic actors can be allowed, as they try to correct their past mistakes.
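The idea of modelling shifts after they have happened can be illustrated with a toy sketch. This is not the algorithm in the paper: it simply adds a candidate step-shift indicator at each of a coarse grid of possible break dates to an otherwise constant model, retains only the candidates that are statistically significant, and refits. The grid, the significance threshold of 2.5, and the one-pass selection rule are all simplifying assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, break_at, shift = 100, 60, 3.0

# Simulated series: constant mean plus an unanticipated location shift
t = np.arange(n)
y = 1.0 + shift * (t >= break_at) + rng.normal(0.0, 1.0, n)

def ols(X, y):
    """Ordinary least squares: coefficients and t-statistics."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
    return beta, beta / se

# Candidate step indicators 1(t >= d) at a grid of possible break dates
dates = np.arange(10, n, 10)
steps = (t[:, None] >= dates[None, :]).astype(float)

# General model: intercept plus every candidate shift
X = np.column_stack([np.ones(n), steps])
_, tstats = ols(X, y)

# Retain only candidates whose t-statistic survives a tight threshold,
# then refit the simplified model on the survivors
keep = np.abs(tstats[1:]) > 2.5
X_final = np.column_stack([np.ones(n), steps[:, keep]])
beta_final, _ = ols(X_final, y)
detected = dates[keep]
print("detected break dates:", detected)
```

In this simulation the true break date lies on the candidate grid, so the corresponding step indicator is highly significant and survives selection, while the irrelevant candidates are almost always discarded.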
Importantly, the approach always retains the theory model by embedding it in the larger formulation. A technical adjustment to conventional econometric methods then ensures that when the theory is a good representation, it will be kept unaltered. Thus, compared to the existing paradigm, there are no costs to adopting this new method when the theory is excellent.
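The "no cost when the theory is excellent" property can be illustrated with another toy sketch, again a simplification rather than the paper's procedure: the theory-backed regressor is forced into the model and never subjected to selection, while a set of irrelevant candidate variables faces a tight significance threshold. All names and the threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)             # the theory-backed regressor
Z = rng.normal(size=(n, 20))       # candidate extra variables, irrelevant here
y = 2.0 * x + rng.normal(size=n)   # the theory model is correct

def ols(X, y):
    """Ordinary least squares: coefficients and t-statistics."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
    return beta, beta / se

# General model: the theory variable is always retained; only the
# extra candidates face the significance threshold
X = np.column_stack([x, Z])
_, tstats = ols(X, y)
keep = np.abs(tstats[1:]) > 2.5    # selection applies to Z only
X_final = np.column_stack([x, Z[:, keep]])
beta_final, _ = ols(X_final, y)
print("estimated theory coefficient:", beta_final[0])
```

Because the theory variable is never eligible for deletion, the irrelevant candidates are almost all discarded and the estimated theory coefficient stays close to its true value, so embedding the theory in the larger formulation costs essentially nothing here.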
However, when the initial theory is inadequate or incomplete—often the case given the complexity of an economy—the approach allows the theory to be evaluated simultaneously against a wide range of alternative specifications, including shifts and possible non-linear responses. When these additional influences actually matter, a better empirical model will be discovered.
Surprisingly, that result holds even though a theory model is always retained, even when that theory is seriously wrong, as the illustration in my response to the discussants demonstrates. Thus the combined approach of selecting over a very large set of potential explanations will not impugn a really good theory model, yet will protect against imposing a poor one, which is almost exactly the opposite of economists' current beliefs about model selection.
This opinion piece reflects the views of the author, and does not necessarily reflect the position of the Oxford Martin School or the University of Oxford. Any errors or omissions are those of the author.