We propose a new generic method ROPES (Regularized Optimization for Prediction and Estimation with Sparse data) for decomposing, smoothing and forecasting two-dimensional sparse data. In some ways, ROPES is similar to Ridge Regression, the LASSO, Principal Component Analysis (PCA) and...
Persistent link: https://www.econbiz.de/10010958945
A Bayesian approach to option pricing is presented, in which posterior inference about the underlying returns process …
Persistent link: https://www.econbiz.de/10005427634
A new approach to inference in state space models is proposed, based on approximate Bayesian computation (ABC). ABC … simulated from the true process; exact inference being feasible only if the statistics are sufficient. With finite sample … variable is driven by a continuous time process, with exact inference typically infeasible in this case as a result of …
Persistent link: https://www.econbiz.de/10010958938
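The abstract above concerns inference via approximate Bayesian computation (ABC). As general background only (not the paper's algorithm), a minimal rejection-ABC sketch is shown below; the toy AR(1)-style model, the summary statistic, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy process (an assumption, not the paper's model):
# an AR(1)-style series with unknown persistence parameter phi.
def simulate(phi, n=200):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

def summary(x):
    # Summary statistic: lag-1 autocorrelation (a simple, non-sufficient choice).
    return np.corrcoef(x[:-1], x[1:])[0, 1]

observed = simulate(phi=0.7)
s_obs = summary(observed)

# Rejection ABC: draw phi from the prior, simulate data, and keep draws whose
# summary statistic lies within a tolerance of the observed statistic.
draws, tol, kept = 5000, 0.05, []
for _ in range(draws):
    phi = rng.uniform(-1, 1)           # prior draw
    s_sim = summary(simulate(phi))     # summary of simulated data
    if abs(s_sim - s_obs) < tol:
        kept.append(phi)

print(f"approximate posterior mean of phi: {np.mean(kept):.3f} ({len(kept)} accepted)")
```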
Dynamic jumps in the price and volatility of an asset are modelled using a joint Hawkes process in conjunction with a … integrated volatility and price jumps, to the specified model components; with Bayesian inference conducted using a Markov chain …
Persistent link: https://www.econbiz.de/10011141014
The object of this paper is to produce distributional forecasts of physical volatility and its associated risk premia using a non-Gaussian, non-linear state space approach. Option and spot market information on the unobserved variance process is captured by using dual 'model-free' variance...
Persistent link: https://www.econbiz.de/10008763558
In this paper, we introduce a new class of bivariate threshold VAR cointegration models. In the models, outside a compact region, the processes are cointegrated, while in the compact region, we allow different kinds of possibilities. We show that the bivariate processes from a 1/2-null recurrent...
Persistent link: https://www.econbiz.de/10011193729
This paper is concerned with model selection based on a penalized maximized log-likelihood function. Its main emphasis is on how these penalties might be chosen in small samples to give good statistical properties.
Persistent link: https://www.econbiz.de/10005087604
The aim of this paper is to examine the measurement of persistence in a range of time series models nested in the framework of Cramer (1961). This framework is a generalization of the Wold (1938) decomposition for stationary time series which, in addition to accommodating the standard I(0) and...
Persistent link: https://www.econbiz.de/10005149028
A Bayesian approach is presented for nonparametric estimation of an additive regression model with autocorrelated errors.
Persistent link: https://www.econbiz.de/10005149033
This paper considers the construction of model selection procedures based on choosing the model with the largest maximised log-likelihood minus a penalty, when key parameters are restricted to be in a closed interval. The approach adopted is based on King et al.'s (1995) representative models...
Persistent link: https://www.econbiz.de/10005149039
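The two abstracts above both describe selection rules of the form "maximised log-likelihood minus a penalty." Purely as a generic illustration (not the authors' procedure), a minimal sketch comparing two nested Gaussian models under AIC- and BIC-style penalties might look like this; the toy data and penalty choices are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
y = rng.standard_normal(100) + 0.5   # toy data (illustrative assumption)
n = len(y)

def gaussian_loglik(y, mu, sigma):
    # Log-likelihood of an i.i.d. Gaussian model at the given parameters.
    return norm.logpdf(y, loc=mu, scale=sigma).sum()

# Candidate models: (number of free parameters, maximised log-likelihood).
models = {
    "zero-mean": (1, gaussian_loglik(y, 0.0, np.sqrt(np.mean(y**2)))),
    "free-mean": (2, gaussian_loglik(y, y.mean(), y.std(ddof=0))),
}

for name, (k, ll) in models.items():
    # Penalised criterion: maximised log-likelihood minus a penalty.
    # Two standard penalty choices (AIC, BIC) shown only as examples.
    aic_score = ll - k
    bic_score = ll - 0.5 * k * np.log(n)
    print(f"{name}: loglik={ll:.2f}  AIC-style={aic_score:.2f}  BIC-style={bic_score:.2f}")

# Select the model with the largest penalised criterion (AIC-style rule here).
best = max(models, key=lambda m: models[m][1] - models[m][0])
print("selected model:", best)
```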