Monetary policy and data uncertainty
One of the problems facing policymakers is that recent data releases are liable to subsequent revision. This paper discusses how to deal with this, and is in two parts. In the normative part, we study the design of monetary policy rules in a model in which data uncertainty varies with the vintage. We show how the coefficients on lagged variables in optimised simple rules for monetary policy increase as the relative measurement error in early vintages of data increases. We also explore scenarios in which policymakers are uncertain by how much measurement error in new data exceeds that in old data. An optimal policy can then be one that assumes the ratio of measurement error in new to old data is larger, rather than smaller. In the positive part, we show that the response of monetary policy to vintage-varying data uncertainty may generate evidence of apparent interest rate smoothing in interest rate reaction functions; but we suggest that it may not generate enough to account for what has been observed in the data.
Year of publication: 2005-11
Authors: Jääskelä, Jarkko; Yates, Tony
Institutions: Bank of England
Freely available
Similar items by person
- Misperceptions and monetary policy in a New Keynesian model — Jääskelä, Jarkko (2005)
- Monetary policy and private sector misperceptions about the natural level of output — Jääskelä, Jarkko (2005)
- Fundamental inflation uncertainty — Groth, Charlotta (2006)