The estimation of the holding periods of financial products has to be carried out as a dynamic process in which the size of the observation time interval influences the result: small intervals produce smaller average holding periods than larger ones. The approach developed in this paper offers the...
Persistent link: https://www.econbiz.de/10011890392
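The dependence of the measured holding period on the observation window can be illustrated with a toy simulation (an assumption for illustration only, not the paper's method): holding periods still open at the end of the window are recorded as truncated, so short windows bias the average downward.

```python
import random

def mean_observed_holding_period(window, true_mean=10.0, n=50000, seed=1):
    # Draw exponential holding periods and censor each at the length of the
    # observation window: a position still open when the window closes is
    # recorded with the window length as its (truncated) holding period.
    rng = random.Random(seed)
    return sum(min(rng.expovariate(1.0 / true_mean), window)
               for _ in range(n)) / n
```

With a true mean holding period of 10, a window of length 5 yields an average near 3.9, a window of 20 yields about 8.7, and a window of 100 recovers the true mean almost exactly, matching the snippet's claim that smaller intervals produce smaller averages.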
Persistent link: https://www.econbiz.de/10009581671
This paper introduces a new generalization of the Pareto distribution using the Marshall-Olkin generator and the method of alpha power transformation. This new model has several desirable properties appropriate for modelling right-skewed data. The authors demonstrate how the hazard rate function...
Persistent link: https://www.econbiz.de/10012655743
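For readers unfamiliar with the alpha power transformation, one common form maps a base CDF F(x) to (α^F(x) − 1)/(α − 1) for α > 0, α ≠ 1. A minimal sketch applying it to a classical Pareto CDF (parameter values and function names are illustrative assumptions, not the paper's specification):

```python
import numpy as np

def pareto_cdf(x, scale=1.0, shape=2.0):
    # Classical Pareto CDF: F(x) = 1 - (scale/x)**shape for x >= scale.
    x = np.asarray(x, dtype=float)
    return np.where(x >= scale, 1.0 - (scale / x) ** shape, 0.0)

def alpha_power_cdf(x, alpha=3.0, **kw):
    # Alpha power transformation of the base CDF F:
    # F_APT(x) = (alpha**F(x) - 1) / (alpha - 1), alpha > 0, alpha != 1.
    F = pareto_cdf(x, **kw)
    return (alpha ** F - 1.0) / (alpha - 1.0)
```

The transformed function is still a valid CDF (zero at the lower endpoint, increasing to one), which is why the generator yields a proper new distribution family.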
The estimation of the holding periods of financial products has to be carried out as a dynamic process in which the size of the observation time interval influences the result: small intervals produce smaller average holding periods than larger ones. The approach developed in this paper offers the...
Persistent link: https://www.econbiz.de/10011966824
The Anderson-Darling goodness-of-fit test has a highly skewed and non-standard limit distribution. Various attempts have been made to tabulate the associated critical points, using both theoretical approximations and simulation methods. We show that a standard saddlepoint approximation performs...
Persistent link: https://www.econbiz.de/10005839156
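The saddlepoint idea the abstract refers to can be shown on a case where the answer is known in closed form: the density of the mean of n Exp(1) variables (a Gamma). This is a generic textbook illustration, not the paper's Anderson-Darling computation.

```python
import math

def saddlepoint_density_mean_exp(x, n):
    # Saddlepoint approximation to the density of the mean of n i.i.d.
    # Exp(1) variables.  CGF of Exp(1): K(s) = -log(1 - s); the saddle
    # equation K'(s) = x gives s_hat = 1 - 1/x, and K''(s_hat) = x**2.
    s_hat = 1.0 - 1.0 / x
    K = -math.log(1.0 - s_hat)                 # equals log(x)
    K2 = x * x
    return math.sqrt(n / (2.0 * math.pi * K2)) * math.exp(n * (K - s_hat * x))

def exact_density_mean_exp(x, n):
    # The mean of n Exp(1) variables is Gamma(n, scale = 1/n).
    return n ** n * x ** (n - 1) * math.exp(-n * x) / math.gamma(n)
```

Even for n = 10 the approximation is within about one percent of the exact density at the mode, which is the kind of accuracy that makes saddlepoint methods attractive for tabulating critical points.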
Scanner data for fast moving consumer goods typically amount to panels of time series where both N and T are large. To reduce the number of parameters and to shrink parameters towards plausible and interpretable values, multi-level models turn out to be useful. Such models contain in the second...
Persistent link: https://www.econbiz.de/10010837954
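The shrinkage behaviour of multi-level models can be sketched with the simplest two-level Normal case (a generic illustration under assumed known variances, not the paper's scanner-data specification):

```python
def shrunk_means(unit_means, n_obs, sigma2, tau2):
    # Two-level Normal model: second-level unit effects theta_i ~ N(mu, tau2),
    # first-level data y_ij ~ N(theta_i, sigma2).  The posterior mean shrinks
    # each unit average toward the grand mean, more strongly for units with
    # fewer observations (larger sampling variance sigma2 / n_i).
    mu = sum(unit_means) / len(unit_means)
    out = []
    for ybar, n in zip(unit_means, n_obs):
        w = tau2 / (tau2 + sigma2 / n)
        out.append(w * ybar + (1.0 - w) * mu)
    return out
```

A unit observed 100 times keeps almost its own average, while a unit observed 4 times is pulled substantially toward the grand mean, which is how such models keep large-N panels at plausible, interpretable parameter values.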
In this paper, we consider a simple preliminary-test estimation problem where the analyst's loss structure is represented by a 'reflected Normal' penalty function. In particular we consider the estimation of the location parameter in a Normal sampling problem, where a preliminary test is...
Persistent link: https://www.econbiz.de/10005260593
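A minimal sketch of the two ingredients named in the abstract, under illustrative assumptions (known unit variance, a z-test at the 5% level; the paper's exact setup may differ): a bounded 'reflected Normal' penalty and a preliminary-test estimator of a Normal location.

```python
import math

def reflected_normal_loss(est, theta, gamma=1.0):
    # 'Reflected Normal' penalty: bounded loss rising from 0 toward 1 as the
    # estimation error grows; gamma controls how fast the loss saturates.
    return 1.0 - math.exp(-(est - theta) ** 2 / (2.0 * gamma ** 2))

def pretest_estimator(sample, theta0=0.0, z_crit=1.96):
    # Preliminary-test estimator of a Normal location with known unit
    # variance: keep theta0 if a z-test fails to reject H0: theta = theta0,
    # otherwise report the sample mean.
    n = len(sample)
    xbar = sum(sample) / n
    z = (xbar - theta0) * math.sqrt(n)
    return theta0 if abs(z) <= z_crit else xbar
```

The estimator's risk under the bounded loss differs from the usual quadratic-loss analysis, which is the comparison this literature studies.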
Scanner data for fast moving consumer goods typically amount to panels of time series where both N and T are large. To reduce the number of parameters and to shrink parameters towards plausible and interpretable values, multi-level models turn out to be useful. Such models contain in the second...
Persistent link: https://www.econbiz.de/10004991091
Stochastic volatility models present a natural way of working with time-varying volatility. However, the difficulty involved in estimating these types of models has prevented their widespread use in empirical applications. In this paper we exploit Gibbs sampling to provide a likelihood framework...
Persistent link: https://www.econbiz.de/10005730327
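The conditional-sampling idea behind Gibbs sampling can be shown on a toy target where both full conditionals are known: a bivariate Normal with correlation rho. This is a generic illustration of the sampler, not the paper's stochastic volatility algorithm.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter=20000, burn=2000, seed=42):
    # Gibbs sampler for a bivariate Normal with zero means, unit variances
    # and correlation rho.  Each full conditional is itself Normal:
    # x | y ~ N(rho * y, 1 - rho**2), and symmetrically for y | x.
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    draws = []
    for i in range(n_iter):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        if i >= burn:
            draws.append((x, y))
    return draws
```

After burn-in, the empirical correlation of the draws recovers rho; in stochastic volatility models the same mechanism is applied to the latent log-volatilities, whose full conditionals are tractable even though the joint likelihood is not.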
This paper provides methods for carrying out likelihood based inference for diffusion driven models, for example discretely observed multivariate diffusions, continuous time stochastic volatility models and counting process models. The diffusions can potentially be non-stationary. Although our...
Persistent link: https://www.econbiz.de/10005730357
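Simulation-based likelihood methods for diffusions rest on discretising the SDE between observation times. A minimal Euler-Maruyama sketch of dX_t = mu(X_t) dt + sigma(X_t) dW_t (a standard building block, not the paper's specific scheme):

```python
import math
import random

def euler_maruyama(mu, sigma, x0, T, n_steps, seed=0):
    # Euler-Maruyama discretisation of dX_t = mu(X_t) dt + sigma(X_t) dW_t:
    # advance in steps of dt using a Normal increment of variance dt.
    rng = random.Random(seed)
    dt = T / n_steps
    x = x0
    path = [x0]
    for _ in range(n_steps):
        x += mu(x) * dt + sigma(x) * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path
```

With the noise switched off (sigma = 0) the scheme reduces to explicit Euler for the drift ODE, e.g. dX_t = -X_t dt decays from 1 to roughly e^(-1) over the unit interval, a quick check that the discretisation is wired correctly.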