Sliced Inverse Regression is a method for reducing the dimension of the explanatory variables x in non-parametric regression problems. Li (1991) discussed a version of this method which begins with a partition of the range of y into slices so that the conditional covariance matrix of x given y...
Persistent link: https://www.econbiz.de/10009471478
To overcome the curse of dimensionality, dimension reduction is important and necessary for understanding the underlying phenomena in a variety of fields. Dimension reduction is the transformation of high-dimensional data into a meaningful representation in the low-dimensional space. It can be...
Persistent link: https://www.econbiz.de/10009475737
Sparse non-Gaussian component analysis (SNGCA) is an unsupervised method of extracting a linear structure from high-dimensional data based on estimating a low-dimensional non-Gaussian data component. In this paper we discuss a new approach to direct estimation of the projector on the target...
Persistent link: https://www.econbiz.de/10010281511
Let a high-dimensional random vector X be represented as a sum of two components: a signal S, which belongs to some low-dimensional subspace S, and a noise component N. This paper presents a new approach for estimating the subspace S based on the ideas of the Non-Gaussian Component...
Persistent link: https://www.econbiz.de/10010281568
Modelling covariance structures is known to suffer from the curse of dimensionality. In order to avoid this problem for forecasting, the authors propose a new factor multivariate stochastic volatility (fMSV) model for realized covariance measures that accommodates asymmetry and long memory....
Persistent link: https://www.econbiz.de/10010377197
In the context of binary classification with continuous predictors, we prove two properties concerning the connections between Partial Least Squares (PLS) dimension reduction and between-group PCA, and between linear discriminant analysis and between-group PCA. Such methods are of great...
Persistent link: https://www.econbiz.de/10010266208
In this paper we extend the standard approach of correlation structure analysis in order to reduce the dimension of high-dimensional statistical data. The classical assumption of a linear model for the distribution of a random vector is replaced by the weaker assumption of a model for the copula....
Persistent link: https://www.econbiz.de/10010266229
The basic ideas of Desirability functions and indices are introduced and compared to other methods of multivariate optimisation. It is shown that gradient-based techniques are not in general appropriate for the numerical optimisation of Desirability indices. The problems are shown for...
Persistent link: https://www.econbiz.de/10010316514
Sliced Inverse Regression (SIR) is a promising technique for the purpose of dimension reduction. Several properties of this relatively new method have already been examined, but little attention has been paid to robustness aspects. We show that SIR is very sensitive to outliers in the data....
Persistent link: https://www.econbiz.de/10010316531
We propose multivariate classification as a statistical tool to describe business cycles. These cycles are often analyzed as a univariate phenomenon in terms of GNP or industrial net production, ignoring additional information in other economic variables. Multivariate classification overcomes...
Persistent link: https://www.econbiz.de/10010316572