We propose new tools for visualizing large amounts of functional data in the form of smooth curves or surfaces. The proposed tools include functional versions of the bagplot and boxplot, and make use of the first two robust principal component scores, Tukey's data depth and highest density...
Persistent link: https://www.econbiz.de/10005427617
In this paper, we focus on expensive multiobjective optimization problems and propose a method to predict an approximation of the Pareto optimal set using classification of sampled decision vectors as dominated or nondominated. The performance of our method, called EPIC, is demonstrated on a set...
Persistent link: https://www.econbiz.de/10010958952
In the classical approach to statistical hypothesis testing, the roles of the null hypothesis H0 and the alternative H1 are very asymmetric. Power, calculated from the distribution of the test statistic under H1, is treated as a theoretical construct that can be used to guide the choice of an...
Persistent link: https://www.econbiz.de/10008495166
A Kalman filter, suitable for application to a stationary or a non-stationary time series, is proposed. It works on time series with missing values. It can be used on seasonal time series where the associated state space model may not satisfy the traditional observability condition. A new...
Persistent link: https://www.econbiz.de/10005581117
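The missing-value handling described in this entry can be illustrated with a minimal local-level Kalman filter sketch: when an observation is absent, the measurement update is skipped and the prediction is carried forward. This is a generic illustration, not the paper's state space model; the noise variances `q` and `r` and the diffuse initialisation are illustrative assumptions.

```python
import numpy as np

def kalman_local_level(y, q=0.1, r=1.0):
    """Local-level Kalman filter that skips the measurement update
    when an observation is missing (NaN).

    y -- 1-D array of observations; NaN marks a missing value
    q -- level (state) noise variance, illustrative default
    r -- observation noise variance, illustrative default
    Returns the filtered state means and variances.
    """
    n = len(y)
    a = np.zeros(n)              # filtered state mean
    p = np.zeros(n)              # filtered state variance
    a_prev, p_prev = 0.0, 1e6    # near-diffuse initialisation
    for t in range(n):
        # prediction step
        a_pred = a_prev
        p_pred = p_prev + q
        if np.isnan(y[t]):
            # missing observation: carry the prediction forward unchanged
            a[t], p[t] = a_pred, p_pred
        else:
            # measurement update
            k = p_pred / (p_pred + r)          # Kalman gain
            a[t] = a_pred + k * (y[t] - a_pred)
            p[t] = (1.0 - k) * p_pred
        a_prev, p_prev = a[t], p[t]
    return a, p
```

At a missing time point the filtered mean is unchanged while the filtered variance grows by `q`, which is exactly how uncertainty accumulates across gaps in the series.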
Damped trend exponential smoothing has previously been established as an important forecasting method. Here, it is shown to have close links to simple exponential smoothing with a smoothed error tracking signal. A special case of damped trend exponential smoothing emerges from our analysis, one...
Persistent link: https://www.econbiz.de/10005581148
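The damped trend method mentioned in this entry follows the standard Gardner-McKenzie recursions, sketched below. The parameter values are illustrative rather than estimated; with phi = 1 the method reduces to Holt's linear trend, and as the trend component vanishes it collapses toward simple exponential smoothing, which is the family of links the snippet alludes to.

```python
def damped_trend_forecast(y, alpha=0.5, beta=0.3, phi=0.9, h=3):
    """Damped trend exponential smoothing (Gardner-McKenzie) sketch.

    alpha, beta -- smoothing parameters for level and trend (illustrative)
    phi         -- damping parameter, 0 < phi < 1
    h           -- forecast horizon
    Returns the 1..h step-ahead forecasts.
    """
    level, trend = y[0], y[1] - y[0]   # simple initialisation from first two points
    for obs in y[2:]:
        prev_level = level
        # level update: blend the new observation with the damped projection
        level = alpha * obs + (1 - alpha) * (prev_level + phi * trend)
        # trend update: damp the carried-over slope by phi
        trend = beta * (level - prev_level) + (1 - beta) * phi * trend
    # h-step forecast: level + (phi + phi^2 + ... + phi^k) * trend
    return [level + sum(phi ** j for j in range(1, k + 1)) * trend
            for k in range(1, h + 1)]
```

Because the trend contribution is the geometric sum phi + phi^2 + ..., successive forecast increments shrink by a factor of phi, so the forecast path flattens out rather than extrapolating a straight line.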
We propose a new generic method ROPES (Regularized Optimization for Prediction and Estimation with Sparse data) for decomposing, smoothing and forecasting two-dimensional sparse data. In some ways, ROPES is similar to Ridge Regression, the LASSO, Principal Component Analysis (PCA) and...
Persistent link: https://www.econbiz.de/10010958945
This paper is concerned with model selection based on a penalized maximized log likelihood function. Its main emphasis is on how these penalties might be chosen in small samples to give good statistical properties.
Persistent link: https://www.econbiz.de/10005087604
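The selection rule in this entry, pick the model with the largest maximized log-likelihood minus a penalty, can be sketched with the two classical penalty choices. The AIC and BIC penalties below are standard textbook illustrations, not the small-sample penalties the paper develops.

```python
import math

def select_model(candidates, n, penalty="aic"):
    """Choose the model with the largest penalised maximised log-likelihood.

    candidates -- list of (name, loglik, k) tuples, k = number of parameters
    n          -- sample size (used by the BIC penalty)
    penalty    -- "aic" (penalty = k) or "bic" (penalty = k/2 * log n)
    """
    def penalised(loglik, k):
        if penalty == "aic":
            return loglik - k                      # Akaike's penalty
        if penalty == "bic":
            return loglik - 0.5 * k * math.log(n)  # Schwarz's penalty
        raise ValueError(f"unknown penalty: {penalty}")
    # max over candidates of (log-likelihood minus penalty)
    return max(candidates, key=lambda m: penalised(m[1], m[2]))[0]
```

With a moderate likelihood gain from extra parameters, AIC and BIC can disagree, which is precisely why the choice of penalty, especially in small samples, matters.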
The aim of this paper is to examine the measurement of persistence in a range of time series models nested in the framework of Cramer (1961). This framework is a generalization of the Wold (1938) decomposition for stationary time series which, in addition to accommodating the standard I(0) and...
Persistent link: https://www.econbiz.de/10005149028
A Bayesian approach is presented for nonparametric estimation of an additive regression model with autocorrelated errors.
Persistent link: https://www.econbiz.de/10005149033
This paper considers the construction of model selection procedures based on choosing the model with the largest maximised log-likelihood minus a penalty, when key parameters are restricted to be in a closed interval. The approach adopted is based on King et al.'s (1995) representative models...
Persistent link: https://www.econbiz.de/10005149039