
Nowcasting

Nowcasting is defined as the prediction of the very recent past, the present, and the very near future. The term, a contraction of “now” and “forecasting,” has long been used in meteorology. I introduced it into economics 20 years ago with a team of coauthors. Below is a short description of the scientific problem and of my journey through this fascinating topic.

In economics, we need to forecast the present since reliable and comprehensive measures of the state of the economy are often released with a substantial delay and considerable measurement error. The basic principle of nowcasting is the exploitation of timely information, possibly available at a higher frequency than the variable of interest, in order to obtain an “early estimate” before the official figures become available.
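Formally, the nowcast can be thought of as a projection of the target variable on the real-time information set. In stylized notation (mine, for illustration):

$$\hat{y}_{t|v} = \mathbb{E}\left[\, y_t \mid \Omega_v \,\right],$$

where $y_t$ is the variable of interest, say GDP growth in the current quarter, and $\Omega_v$ is the information set available at date $v$, which typically contains monthly indicators, surveys, and other timely releases.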

Obtaining a reliable measure of the state of the economy is pivotal to making policy and business decisions. Every day, industry analysts, policy institutions, and investment banks parse troves of economic data released by statistical agencies, private and public surveys, and other sources, to assess the health of the economy. Based on those findings, they create a narrative about where the economy is and where it is headed. The difficulty comes in separating meaningful signals from the noise.

I designed nowcasting with the objective of formalizing key features of how market participants and policymakers read data in real time. This involves monitoring data releases, forming expectations about them, and revising the assessment of the state of the economy whenever realizations diverge sizably from those expectations.
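Schematically, each release updates the nowcast through its surprise component, the “news” (again, stylized notation for illustration):

$$\hat{y}_{t|v^{+}} = \hat{y}_{t|v} + \sum_{j \in \mathcal{J}_{v^{+}}} b_j \left( x_{j} - \mathbb{E}\left[\, x_{j} \mid \Omega_v \,\right] \right),$$

where $\mathcal{J}_{v^{+}}$ indexes the newly released series, the term in parentheses is the news in release $j$, and the weight $b_j$ measures how informative that series is for the target variable. A release that comes in exactly as expected carries no news and leaves the nowcast unchanged.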

Twenty years ago, monitoring macroeconomic conditions in real time was more of an art than a science. The practice was to use a set of heuristic models and a good dose of judgment to make predictions about the state of the economy. The whole process is a big data effort: it involves analyzing the large amount of complex information that is continually released, often with multiple releases in a single day. Updating the information in real time with a procedure that is not entirely automated is costly and risky. Judgmental and simplified heuristic procedures are exposed to internal inconsistencies, with the constant risk of putting too much weight on outdated signals from high-quality releases, or on data releases that are timely but unreliable. Moreover, processes that are not scientific and do not use formal methods cannot be replicated and evaluated ex post.

My challenge was to design an entirely automated platform to track the state of the economy without relying on any judgment or subjective prior information. I developed a formal and internally coherent methodology, automating expert knowledge to process the multiple streams of data and to handle the specific characteristics of real-time forecasting. It was important to understand both the tedious aspects and the complexity of the problem before I could simplify the entire process through a scientific approach. To perform this task, I deployed an arsenal of tools and methods from econometrics, statistics, and data analysis, building upon nascent developments and insights in big data analytics, and taking advantage of improvements in scientific computing, data handling, and visualization.

In order to minimize human intervention and subjective choices, I designed a platform based on a unified and internally coherent econometric approach that is simple and transparent, and hence robust. This straitjacket has not created any disadvantage; the benefit, apart from robustness, is that it allows for a coherent analysis of the link between macroeconomic news and cyclical developments.

The engine of the platform is a dynamic factor model equipped with efficient filtering techniques. The modeling approach exploits an essential and robust feature of business cycle fluctuations: macroeconomic data, although numerous and complex, comove quite strongly so that a few common factors can capture their dynamics. In this context, a dynamic factor model provides a parsimonious yet suitable representation for the large set of macroeconomic time series.
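In its simplest form, the model can be sketched as follows (a stylized version for illustration; actual specifications allow for richer dynamics, mixed frequencies, and serially correlated idiosyncratic components):

$$x_t = \Lambda f_t + \varepsilon_t, \qquad f_t = A f_{t-1} + u_t,$$

where $x_t$ collects the observed series, $f_t$ is a small vector of common factors loaded through the matrix $\Lambda$, and $\varepsilon_t$ and $u_t$ are the idiosyncratic and common innovations.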

Dynamic factor models can be written as a system with two types of equations: measurement equations linking the observed series to a latent state process, and transition equations describing the dynamics of that process. The latent process is typically associated with the unobserved state of the economy. The state-space representation allows the use of Kalman filtering techniques to obtain projections for both the observed and the state variables. Most importantly, the Kalman filter can easily cope with challenging features of the nowcasting information set: different patterns of missing data across series at the end of the sample due to asynchronous data releases (the so-called ragged edge), missing data at the beginning of the sample because some sources started being collected only recently, and data observed at different frequencies, such as monthly indicators alongside quarterly GDP.
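To make the mechanics concrete, here is a minimal sketch of such a filter in Python. It is an illustration under simplifying assumptions, not the production system: the parameter matrices are taken as given, and the ragged edge is handled by selecting, at each date, only the rows of the measurement equation that are actually observed.

```python
# Minimal Kalman filter for the sketch  x_t = Lam f_t + eps_t,  f_t = A f_{t-1} + u_t,
# with missing observations (np.nan) handled by dropping the corresponding
# rows of the measurement equation at each date.
import numpy as np

def kalman_filter(X, Lam, A, R, Q, f0, P0):
    """X: (T, n) data with np.nan for missing entries.
    Lam: (n, r) loadings; A: (r, r) factor VAR matrix;
    R, Q: measurement and state innovation covariances;
    f0, P0: initial state mean and covariance.
    Returns the filtered factors (T, r)."""
    T, _ = X.shape
    r = f0.shape[0]
    f, P = f0.copy(), P0.copy()
    F = np.zeros((T, r))
    for t in range(T):
        # Prediction: propagate the factors one step ahead.
        f = A @ f
        P = A @ P @ A.T + Q
        # Ragged edge: keep only the series observed at date t.
        obs = ~np.isnan(X[t])
        if obs.any():
            L = Lam[obs]                           # loadings of observed series
            v = X[t, obs] - L @ f                  # the "news" at date t
            S = L @ P @ L.T + R[np.ix_(obs, obs)]  # innovation covariance
            K = np.linalg.solve(S, L @ P).T        # Kalman gain, P L' S^{-1}
            f = f + K @ v                          # update the state with the news
            P = P - K @ L @ P
        F[t] = f
    return F
```

Running such a filter through the latest data vintage yields the current factor estimate, from which the nowcast of the target variable follows via its measurement equation.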

The use of dynamic factor models coupled with the Kalman filter has a long tradition in econometrics. However, the approach was considered infeasible for high-dimensional data because it requires estimating too many parameters. With a team of coauthors, we challenged this view and showed that the approach is suitable for big data. We established its viability from a statistical perspective by studying the asymptotic properties of the maximum likelihood estimates as both the complexity of the model and the sample size increase. These results were confirmed by extensive backtesting and statistical evaluation of out-of-sample predictive accuracy. We also refined the estimation procedure to make the computation scalable to high-dimensional problems.
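As an illustration of why the computation can scale, here is a schematic two-step estimator in the spirit of this literature. It is a sketch under simplifying assumptions (a balanced panel for the first step and a single lag in the factor dynamics); the actual procedures are considerably more refined.

```python
# Schematic two-step estimation: principal components approximate the
# factors, then least squares recovers the factor dynamics.
import numpy as np

def two_step_estimates(X, r):
    """X: balanced panel (T, n); r: number of common factors.
    Returns loadings Lam, factor VAR matrix A, and covariances R, Q."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each series
    # Step 1: principal components as factor estimates.
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    F = U[:, :r] * s[:r]                       # factor estimates (T, r)
    Lam = Vt[:r].T                             # loading estimates (n, r)
    # Step 2: fit the factor dynamics by least squares (VAR(1) on F).
    A = np.linalg.lstsq(F[:-1], F[1:], rcond=None)[0].T
    E = F[1:] - F[:-1] @ A.T                   # VAR residuals
    Q = (E.T @ E) / (len(E) - 1)               # factor innovation covariance
    R = np.diag((Z - F @ Lam.T).var(axis=0))   # idiosyncratic variances
    return Lam, A, R, Q
```

The heavy lifting is a single singular value decomposition and a least-squares fit, both of which remain fast as the number of series grows; estimates of this kind can also serve as starting values for likelihood-based refinements.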

I developed the nowcasting model while working on my Ph.D. The model was first implemented at the Board of Governors of the Federal Reserve in a project started in 2003 and later at the European Central Bank. At every step of the process, I engaged with experts, talking with them about what they did and why. Their comments helped identify the most important data and relevant features that needed to be captured.

When I left the European Central Bank for academia, I teamed up with two economists and one person with experience in consulting and software development. We saw a need for nowcasting within the larger financial industry and founded our own company, now-casting.com. The platform we developed pushed the idea of automation to the limit: when new information becomes available, the data are processed within seconds of their release, and the results are immediately visualized on the webpage and sent by email to customers.

As the company started selling subscriptions for the service, it opened the door to talking directly with customers. We could focus on the specific needs of our users, constantly perfecting the model to provide transparent and intuitive interpretation. One of the most exciting developments was to exploit the completely automated nature of the nowcast and integrate it into an algorithmic trading platform.

By taking the best practices from experts and codifying this knowledge, I challenged the view that expert judgment could not be automated. More than a decade of accumulated experience has shown that the model provides outcomes that equal or exceed the accuracy of using expert judgment alone. What Moneyball did for baseball, nowcasting does for monitoring the state of the economy.

I am proud to have developed new solutions for monitoring economic conditions in real time, providing policymakers, investors, and businesses with a powerful framework to understand, process, and interpret the flow of macroeconomic data. Today, almost every central bank in the world has a nowcasting model. The New York Fed, the Atlanta Fed, and the Cleveland Fed periodically publish their nowcasting models’ results. These estimates are widely followed and discussed by analysts and the press. Hedge funds, investment banks, and large corporations are interested in nowcasting and have started listing it as an area of expertise in job postings. Nowcasting has also become an active area of academic research. In a recent survey article in the Journal of Economic Perspectives, James Stock and Mark Watson included nowcasting among the ten most important innovations in time series econometrics over the last 20 years.