Most dimension reduction methods based on nonparametric smoothing are highly sensitive to outliers and to data coming from heavy tailed distributions. We show that the recently proposed MAVE and OPG methods by Xia et al. (2002) allow us to make them robust in a relatively straightforward way...
Persistent link: https://www.econbiz.de/10010296438
Sliced inverse regression (SIR) is a clever technique for reducing the dimension of the predictor in regression problems, thus avoiding the curse of dimensionality. There exist many contributions on various aspects of the performance of SIR. Up to now, little attention has been paid to the problem...
Persistent link: https://www.econbiz.de/10010298194
Factor construction methods are widely used to summarize a large panel of variables by means of a relatively small number of representative factors. We propose a novel factor construction procedure that enjoys the properties of robustness to outliers and of sparsity; that is, having relatively...
Persistent link: https://www.econbiz.de/10010326490
This paper compares the goods and characteristics models of the consumer within a non-parametric revealed preference framework. Of primary interest is to make a comparison on the basis of predictive success that takes into account dimension reduction. This allows us to nonparametrically identify...
Persistent link: https://www.econbiz.de/10010331043
Random subspace methods are a novel approach to obtain accurate forecasts in high-dimensional regression settings. We provide a theoretical justification of the use of random subspace methods and show their usefulness when forecasting monthly macroeconomic variables. We focus on two approaches...
Persistent link: https://www.econbiz.de/10011586688
Equity basket correlation can be estimated both under the physical measure from stock prices and under the risk-neutral measure from option prices. The difference between the two estimates motivates a so-called "dispersion strategy". We study the performance of this strategy...
Persistent link: https://www.econbiz.de/10014621231
Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists in projecting the sample X_1, ..., X_n onto the first D eigenvectors of the Principal Component Analysis (PCA) associated with the empirical...
Persistent link: https://www.econbiz.de/10014622217
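The dimension reduction step described in the abstract above — projecting a sample onto the first D eigenvectors of the empirical covariance — can be sketched in a few lines of NumPy. This is a minimal illustration of the standard PCA projection, not the paper's specific estimator; the function name `pca_project` is ours.

```python
import numpy as np

def pca_project(x, d):
    """Project the rows of x onto the first d eigenvectors of the
    empirical covariance matrix (the usual PCA reduction step)."""
    x_c = x - x.mean(axis=0)                # center the sample
    cov = np.cov(x_c, rowvar=False)         # empirical covariance
    vals, vecs = np.linalg.eigh(cov)        # eigenvalues ascending
    top = vecs[:, ::-1][:, :d]              # first d eigenvectors
    return x_c @ top                        # n x d reduced sample
```

The first reduced coordinate carries the largest share of the sample variance, the second the next largest, and so on.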
In the context of binary classification with continuous predictors, we prove two properties concerning the connections between Partial Least Squares (PLS) dimension reduction and between-group PCA, and between linear discriminant analysis and between-group PCA. Such methods are of great...
Persistent link: https://www.econbiz.de/10010266208
In this paper we extend the standard approach of correlation structure analysis in order to reduce the dimension of high-dimensional statistical data. The classical assumption of a linear model for the distribution of a random vector is replaced by the weaker assumption of a model for the copula...
Persistent link: https://www.econbiz.de/10010266229
Sliced Inverse Regression is a method for reducing the dimension of the explanatory variables x in non-parametric regression problems. Li (1991) discussed a version of this method which begins with a partition of the range of y into slices so that the conditional covariance matrix of x given y...
Persistent link: https://www.econbiz.de/10009471478
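The slicing version of SIR described in the entry above admits a compact sketch: standardize x, partition the range of y into slices, and eigen-decompose the weighted covariance of the within-slice means of the standardized predictors. This is a generic illustration of Li's (1991) procedure under our own assumptions (equal-count slices, Cholesky standardization); the function name `sir_directions` is ours.

```python
import numpy as np

def sir_directions(x, y, n_slices=5, n_dirs=1):
    """Sketch of Sliced Inverse Regression (Li, 1991)."""
    n, p = x.shape
    # Standardize x so its covariance is the identity.
    x_c = x - x.mean(axis=0)
    cov = np.cov(x_c, rowvar=False)
    inv_sqrt = np.linalg.inv(np.linalg.cholesky(cov)).T
    z = x_c @ inv_sqrt
    # Partition the range of y into slices of (roughly) equal counts.
    slices = np.array_split(np.argsort(y), n_slices)
    # Weighted covariance of the within-slice means of z.
    m = np.zeros((p, p))
    for idx in slices:
        mean_z = z[idx].mean(axis=0)
        m += (len(idx) / n) * np.outer(mean_z, mean_z)
    # Leading eigenvectors, mapped back to the original x scale.
    vals, vecs = np.linalg.eigh(m)          # eigenvalues ascending
    dirs = inv_sqrt @ vecs[:, ::-1][:, :n_dirs]
    return dirs / np.linalg.norm(dirs, axis=0)
```

When the response depends on x only through a few linear combinations, the leading eigenvectors of this slice-mean covariance estimate those directions.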