PCA-kernel estimation
Gérard Biau
Abstract
Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists in projecting the sample $X_1, \dots, X_n$ onto the first $D$ eigenvectors of the Principal Component Analysis (PCA), associated with the empirical projector $\widehat{\Pi}_D$. Classical nonparametric inference methods, such as kernel density estimation or kernel regression analysis, are then performed in the (usually small) $D$-dimensional space. However, the mathematical analysis of this data-driven dimension reduction scheme raises technical problems, due to the fact that the random variables of the projected sample $(\widehat{\Pi}_D X_1, \dots, \widehat{\Pi}_D X_n)$ are no longer independent. As a reference for further studies, we offer in this paper several results showing the asymptotic equivalence between important kernel-related quantities based on the empirical projector and those based on its theoretical counterpart. As an illustration, we provide an in-depth analysis of the nonparametric kernel regression case.
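To make the two-step scheme described above concrete, here is a minimal sketch in Python: the sample is projected onto the first $D$ empirical principal components, and a Nadaraya–Watson kernel regression estimate is then computed in the projected space. The Gaussian kernel, the bandwidth $h$, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def empirical_projector(X, D):
    """Return the matrix V (p x D) whose columns span the first D
    empirical principal components of the centered sample X (n x p),
    so that projection amounts to right-multiplication by V."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(X)                     # empirical covariance
    eigval, eigvec = np.linalg.eigh(cov)         # ascending eigenvalues
    return eigvec[:, np.argsort(eigval)[::-1][:D]]

def pca_kernel_regression(X, Y, x0, D=2, h=0.5):
    """Nadaraya-Watson estimate of E[Y | X = x0], computed after
    projecting both the sample and the query point with the empirical
    PCA projector (Gaussian kernel, bandwidth h; both are assumed
    choices for illustration)."""
    V = empirical_projector(X, D)
    mu = X.mean(axis=0)
    Z = (X - mu) @ V                             # projected sample, n x D
    z0 = (x0 - mu) @ V                           # projected query point
    w = np.exp(-np.sum((Z - z0) ** 2, axis=1) / (2 * h ** 2))
    return np.sum(w * Y) / np.sum(w) if w.sum() > 0 else Y.mean()

# Toy usage: high-dimensional X whose signal lives in a low-dimensional subspace.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
Y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
print(pca_kernel_regression(X, Y, X[0], D=2, h=0.7))
```

Note that the projector $\widehat{\Pi}_D$ is estimated from the same sample that is subsequently smoothed, which is exactly the source of the dependence between the projected observations discussed in the abstract.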
© by Oldenbourg Wissenschaftsverlag