This paper considers Bayesian regression with normal and double-exponential priors as forecasting methods based on large panels of time series. We show that, empirically, these forecasts are highly correlated with principal component forecasts and that they perform equally well for a wide range...
Persistent link: https://www.econbiz.de/10005816205
This paper considers quasi-maximum likelihood estimation of a dynamic approximate factor model when the panel of time series is large. Maximum likelihood is analyzed under different sources of misspecification: omitted serial correlation of the observations and cross-sectional correlation of...
Persistent link: https://www.econbiz.de/10005344907
The causes of the 2008 collapse and subsequent surge in global capital flows remain an open and highly controversial issue. Employing a factor model coupled with a dataset of high-frequency portfolio capital flows to 50 economies, the paper finds that common shocks – key crisis events as well...
Persistent link: https://www.econbiz.de/10009216681
Using the 2007-2009 financial crisis as a laboratory, we analyze the transmission of crises to country-industry equity portfolios in 55 countries. We use an asset pricing framework with global and local factors to predict crisis returns, defining unexplained increases in factor loadings as...
Persistent link: https://www.econbiz.de/10009293721
This paper uses a factor-augmented vector autoregressive model (FAVAR) estimated on U.S. data in order to analyze monetary transmission via private sector balance sheets, credit risk spreads and asset markets in an integrated setup and to explore the role of monetary policy in the three...
Persistent link: https://www.econbiz.de/10008476125
In this paper we compare alternative approaches for the construction of time series of macroeconomic variables for Unified Germany prior to 1991, and then use them for the construction of corresponding time series for the euro area. The resulting series for Germany and the euro area are compared...
Persistent link: https://www.econbiz.de/10005530861
Existing methods for data interpolation or backdating are either univariate or based on a very limited number of series, due to data and computing constraints that were binding until the recent past. Nowadays large datasets are readily available, and models with hundreds of parameters are quickly...
Persistent link: https://www.econbiz.de/10005530985
We define nowcasting as the prediction of the present, the very near future and the very recent past. Crucial in this process is the use of timely monthly information to nowcast key economic variables, such as GDP, that are typically collected at low frequency and published with long...
Persistent link: https://www.econbiz.de/10008752568
This paper formalizes the process of updating the nowcast and forecast on output and inflation as new releases of data become available. The marginal contribution of a particular release for the value of the signal and its precision is evaluated by computing "news" on the basis of an evolving...
Persistent link: https://www.econbiz.de/10005816232