Showing 31 - 40 of 700,986
We introduce a Combined Density Nowcasting (CDN) approach to Dynamic Factor Models (DFM) that coherently accounts for time-varying uncertainty in several model and data features in order to provide more accurate and complete density nowcasts. The combination weights are latent random...
Persistent link: https://www.econbiz.de/10013040417
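The entry above describes pooling model densities with latent random weights. As a rough, hypothetical illustration of the underlying density-combination step only (not the CDN weighting scheme itself), the Python sketch below pools two Gaussian nowcast densities with weights updated from past predictive likelihoods; all data and parameter values are invented.

```python
# Minimal sketch of a linear opinion pool for nowcast densities.
# Two Gaussian predictive densities are combined; the weights are
# updated recursively from each model's past predictive likelihood.
# This is NOT the CDN latent-random-weight scheme, only the basic idea.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
y = rng.normal(0.5, 1.0, size=40)                       # toy target series
means = np.column_stack([np.full_like(y, 0.4),          # model 1 nowcast means (hypothetical)
                         np.zeros_like(y)])             # model 2 nowcast means (hypothetical)
sds = np.array([1.0, 1.5])                              # fixed predictive std. devs (hypothetical)

w = np.array([0.5, 0.5])                                # initial combination weights
for t in range(len(y)):
    # score each model by its predictive likelihood for the realised value
    scores = norm.pdf(y[t], loc=means[t], scale=sds)
    # recursive update: better past performance raises a model's weight
    w = w * scores
    w = w / w.sum()

pooled_at_half = np.sum(w * norm.pdf(0.5, loc=means[-1], scale=sds))
print("final weights:", w.round(2), " pooled density at 0.5:", round(pooled_at_half, 3))
```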
Most macroeconomic indicators failed to capture the sharp economic fluctuations during the coronavirus crisis in a timely manner. Instead, alternative high-frequency data have been used to monitor the economic situation. However, these data are often only loosely related to the business cycle...
Persistent link: https://www.econbiz.de/10012395297
This study considers Bayesian variable selection in the Phillips curve context using the Bernoulli approach of Korobilis (2013a). The Bernoulli model, however, cannot account for model change, which is important if the set of relevant predictors changes over time. To tackle...
Persistent link: https://www.econbiz.de/10011720713
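As a hedged illustration of the Bayesian variable-selection idea in the entry above (not Korobilis's Bernoulli sampler), the sketch below enumerates predictor subsets for a toy Phillips-curve-style regression and approximates posterior inclusion probabilities with BIC-based model weights; the data and variable names are invented.

```python
# Toy Bayesian variable selection: enumerate all predictor subsets,
# weight each model by exp(-BIC/2) as a rough evidence approximation,
# and report posterior inclusion probabilities for each predictor.
import itertools
import numpy as np

rng = np.random.default_rng(1)
T, names = 200, ["unemp_gap", "infl_lag", "import_prices", "noise"]
X = rng.normal(size=(T, 4))
y = 0.8 * X[:, 1] - 0.5 * X[:, 0] + rng.normal(scale=0.5, size=T)   # toy "inflation" DGP

def bic(y, Xsub):
    # OLS fit with intercept; Xsub may have zero columns (intercept-only model)
    Xd = np.column_stack([np.ones(len(y)), Xsub]) if Xsub.size else np.ones((len(y), 1))
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    sigma2 = resid @ resid / len(y)
    return len(y) * np.log(sigma2) + Xd.shape[1] * np.log(len(y))

post = {}
for k in range(5):
    for subset in itertools.combinations(range(4), k):
        post[subset] = np.exp(-0.5 * bic(y, X[:, list(subset)]))
Z = sum(post.values())
for j, name in enumerate(names):
    pip = sum(wt for s, wt in post.items() if j in s) / Z
    print(f"P(include {name} | data) ~ {pip:.2f}")
```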
This paper compares the within-sample and out-of-sample fit of a DSGE model with rational expectations to that of a model with adaptive learning. The Galí, Smets and Wouters model is the chosen laboratory, using quarterly real-time euro area data vintages covering 2001Q1–2019Q4. The adaptive learning model...
Persistent link: https://www.econbiz.de/10013492913
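For readers unfamiliar with adaptive learning, the minimal sketch below shows the generic constant-gain recursive least-squares updating with which such models typically replace rational expectations; it is not the Galí, Smets and Wouters implementation, and the gain and data are hypothetical.

```python
# Constant-gain recursive least squares: agents re-estimate a simple
# AR(1) forecasting rule each period instead of knowing it in advance.
import numpy as np

rng = np.random.default_rng(2)
T, gain = 200, 0.02
x = np.zeros(T)
for t in range(1, T):                       # toy AR(1) state agents try to learn
    x[t] = 0.9 * x[t - 1] + rng.normal(scale=0.1)

phi, R = 0.0, 1.0                           # perceived AR coefficient and second moment
forecasts = np.zeros(T)
for t in range(1, T):
    forecasts[t] = phi * x[t - 1]           # expectation formed with current beliefs
    # constant-gain RLS update after observing x[t]
    R = R + gain * (x[t - 1] ** 2 - R)
    phi = phi + gain * x[t - 1] * (x[t] - phi * x[t - 1]) / R
print("learned AR coefficient:", round(phi, 3))   # should drift toward 0.9
```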
This paper compares the within-sample and out-of-sample fit of a DSGE model with rational expectations to that of a model with adaptive learning. The Galí, Smets and Wouters model is the chosen laboratory, using quarterly real-time euro area data vintages covering 2001Q1–2019Q4. The adaptive learning...
Persistent link: https://www.econbiz.de/10014258211
The severity function approach (abbreviated SFA) is a method of selecting adverse scenarios from a multivariate density. It requires the scenario user (e.g. an agency that runs banking sector stress tests) to specify a "severity function", which maps candidate scenarios into a scalar severity...
Persistent link: https://www.econbiz.de/10011755965
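The severity-function idea in the entry above can be conveyed with a small, hypothetical sketch: draw candidate scenarios from a multivariate density, discard implausible ones, and keep the draw that maximises a user-chosen scalar severity function. The severity function and plausibility cut-off below are invented for illustration and are not the paper's.

```python
# Select an adverse scenario: sample from a multivariate density, keep
# draws inside an elliptical plausibility region, and pick the draw with
# the highest scalar "severity".
import numpy as np

rng = np.random.default_rng(3)
mu = np.array([2.0, 1.5])                      # scenario variables: GDP growth, inflation (toy)
Sigma = np.array([[1.0, 0.3], [0.3, 0.5]])
draws = rng.multivariate_normal(mu, Sigma, size=10_000)

def severity(s):
    # e.g. a stress tester may care most about low growth and high inflation
    return -s[:, 0] + 0.5 * s[:, 1]

# plausibility: squared Mahalanobis distance within a chosen contour
inv = np.linalg.inv(Sigma)
d2 = np.einsum("ij,jk,ik->i", draws - mu, inv, draws - mu)
plausible = draws[d2 <= 5.99]                  # ~95% contour of a bivariate normal
adverse = plausible[np.argmax(severity(plausible))]
print("selected adverse scenario (growth, inflation):", adverse.round(2))
```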
This paper addresses the issue of improving the forecasting performance of vector autoregressions (VARs) when the set of available predictors is too large to handle with the methods and diagnostics used in traditional small-scale models. First, available information from a large dataset...
Persistent link: https://www.econbiz.de/10014215970
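One common way to exploit "available information from a large dataset" before fitting a small VAR is to compress the panel into principal-component factors. The sketch below illustrates that generic factor-augmented VAR step under invented data and dimensions; the paper's exact compression and diagnostics may differ.

```python
# Compress a large predictor panel into a few principal-component
# factors, then estimate a small VAR(1) on the target variables plus
# the factors by OLS and produce a one-step-ahead forecast.
import numpy as np

rng = np.random.default_rng(4)
T, N, n_factors, p = 200, 80, 2, 1
panel = rng.normal(size=(T, N))                    # large standardized predictor panel (toy)
y = rng.normal(size=(T, 2))                        # variables of interest, e.g. GDP and inflation

# principal components of the panel
panel_c = panel - panel.mean(axis=0)
_, _, Vt = np.linalg.svd(panel_c, full_matrices=False)
factors = panel_c @ Vt[:n_factors].T

# factor-augmented VAR(1) by OLS: z_t = c + A z_{t-1} + e_t
z = np.column_stack([y, factors])
Y, X = z[p:], np.column_stack([np.ones(T - p), z[:-p]])
A, *_ = np.linalg.lstsq(X, Y, rcond=None)
one_step_forecast = np.concatenate([[1.0], z[-1]]) @ A
print("one-step-ahead forecast of the VAR variables:", one_step_forecast[:2].round(2))
```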
We consider forecast combination and, indirectly, model selection for VAR models when there is uncertainty about which variables to include in the model in addition to the forecast variables. The key difference from traditional Bayesian variable selection is that we also allow for uncertainty...
Persistent link: https://www.econbiz.de/10014221496
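As a stylised illustration of forecast combination under variable-inclusion uncertainty (not the paper's Bayesian weighting), the sketch below combines two toy models that differ only in which extra predictor they include, with weights based on their recent forecast accuracy; all names and data are hypothetical.

```python
# Two rival forecasting models share a lag of the target but include
# different extra predictors; their forecasts are combined with
# inverse-MSE weights computed from an expanding-window evaluation.
import numpy as np

rng = np.random.default_rng(5)
T = 120
x1, x2 = rng.normal(size=T), rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.4 * x1[t - 1] + rng.normal(scale=0.3)

def ols_forecasts(y, extra):
    """One-step forecasts from y_t = a + b*y_{t-1} + c*extra_{t-1}, expanding window."""
    preds = np.full(len(y), np.nan)
    for t in range(40, len(y)):
        X = np.column_stack([np.ones(t - 1), y[:t - 1], extra[:t - 1]])
        beta, *_ = np.linalg.lstsq(X, y[1:t], rcond=None)
        preds[t] = np.array([1.0, y[t - 1], extra[t - 1]]) @ beta
    return preds

f1, f2 = ols_forecasts(y, x1), ols_forecasts(y, x2)
mse = np.array([np.nanmean((y - f1) ** 2), np.nanmean((y - f2) ** 2)])
w = (1 / mse) / (1 / mse).sum()
print("combination weights (model with x1, model with x2):", w.round(2))
```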
This paper uses a simple New Keynesian monetary DSGE model as a prior for a vector autoregression and shows that the resulting model is competitive with standard benchmarks in terms of forecasting and can be used for policy analysis.
Persistent link: https://www.econbiz.de/10014048878
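The DSGE-as-prior idea can be conveyed with a very stylised sketch: shrink OLS VAR coefficients toward theory-implied values, with a scalar governing how tightly the prior binds. An actual DSGE-VAR builds the prior from DSGE-implied moments (dummy observations); the "theory-implied" coefficients below are invented purely for illustration.

```python
# Ridge-style shrinkage of VAR(1) coefficients toward a prior mean that
# stands in for DSGE-implied values; lam controls the prior tightness.
import numpy as np

rng = np.random.default_rng(6)
T = 150
z = rng.normal(size=(T, 2))                        # toy data for (output gap, inflation)
Y, X = z[1:], np.column_stack([np.ones(T - 1), z[:-1]])

A_theory = np.array([[0.0, 0.0],                   # hypothetical theory-implied intercepts
                     [0.5, 0.1],                   # and VAR(1) coefficients
                     [0.2, 0.7]])
lam = 5.0                                          # prior tightness: larger = closer to theory

# conjugate-style posterior mean with prior mean A_theory and precision lam*I
XtX = X.T @ X
A_post = np.linalg.solve(XtX + lam * np.eye(3), X.T @ Y + lam * A_theory)
A_ols = np.linalg.solve(XtX, X.T @ Y)
print("max |shrunken - OLS| coefficient change:", np.abs(A_post - A_ols).max().round(3))
```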
This paper uses a simple New Keynesian monetary DSGE model as a prior for a VAR and shows that the resulting model is competitive with standard benchmarks in terms of forecasting and can be used for policy analysis.
Persistent link: https://www.econbiz.de/10014112365