Abstract
The increase of the dimensionality of data sets often leads to problems during estimation, collectively known as the curse of dimensionality. One problem of Second Order Statistics (SOS) estimation in high-dimensional data is that the resulting covariance matrices are not of full rank, so their inversion, needed for example in verification systems based on the likelihood ratio, is an ill-posed problem, known as the singularity problem. A classical solution is to project the data onto a lower-dimensional subspace using Principal Component Analysis (PCA), under the assumption that any further estimation on the dimension-reduced data is free from the effects of high dimensionality. Using theory on SOS estimation in high-dimensional spaces, we show that the PCA-based solution is far from optimal in verification systems if the high dimensionality is the sole source of error. For moderate dimensionality it is already outperformed by solutions based on Euclidean distances, and it breaks down completely when the dimensionality becomes very high. We propose a new method, the fixed point eigenwise correction, which does not have these disadvantages and performs close to optimal.
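The singularity problem described above, and the classical PCA workaround, can be illustrated with a minimal NumPy sketch. The sample counts, the subspace dimension `k`, and all variable names below are illustrative assumptions and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fewer samples than dimensions: the regime in which the sample
# covariance matrix becomes singular.
n_samples, n_dims = 50, 200
X = rng.standard_normal((n_samples, n_dims))

# The sample covariance has rank at most n_samples - 1, far below n_dims,
# so it cannot be inverted directly (the "singularity problem").
S = np.cov(X, rowvar=False)
print(np.linalg.matrix_rank(S))          # ~49, far below 200

# Classical workaround: project onto the leading principal components
# and continue all estimation in the reduced subspace.
eigvals, eigvecs = np.linalg.eigh(S)     # eigenvalues in ascending order
k = 20                                   # illustrative subspace dimension
P = eigvecs[:, -k:]                      # top-k eigenvectors
X_red = (X - X.mean(axis=0)) @ P
S_red = np.cov(X_red, rowvar=False)
print(np.linalg.matrix_rank(S_red))      # full rank k, now invertible
```

The paper's argument is that this projection is not the end of the story: estimation on the reduced data is still affected by the high dimensionality (cf. the eigenvalue bias correction keyword), which is what the proposed fixed point eigenwise correction addresses.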
| Original language | Undefined |
| --- | --- |
| Pages (from-to) | 127-139 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Volume | 36 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Jan 2014 |
Keywords
- variance correction
- Applications
- Applications and Expert Knowledge-Intensive Systems
- Artificial Intelligence
- Computational models of vision
- Computer vision
- Computing Methodologies
- Face and gesture recognition
- Image Processing and Computer Vision
- Image Representation
- Learning
- Machine learning
- Mathematics of Computing
- Model Validation and Analysis
- Modeling
- Models
- Multidimensional
- Multivariate statistics
- Pattern Recognition
- Probability and Statistics
- Simulation
- Visualization
- High dimensional verification
- Eigenwise correction
- Principal Component Analysis
- Marčenko-Pastur equation
- Euclidean distance
- Fixed point eigenvalue correction
- eigenvalue bias correction