TY - GEN
T1 - A unified framework for probabilistic component analysis
AU - Nicolaou, Mihalis A.
AU - Zafeiriou, Stefanos
AU - Pantic, Maja
N1 - eemcs-eprint-25812
PY - 2014/9
Y1 - 2014/9
N2 - We present a unifying framework which reduces the construction of probabilistic component analysis techniques to the selection of a latent neighbourhood, thus providing an elegant and principled means for creating novel component analysis models as well as for constructing probabilistic equivalents of deterministic component analysis methods. Under our framework, we unify many popular and well-studied component analysis algorithms, such as Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Locality Preserving Projections (LPP) and Slow Feature Analysis (SFA), some of which have had no probabilistic equivalents in the literature thus far. We first define the Markov Random Fields (MRFs) which encapsulate the latent connectivity of the aforementioned component analysis techniques; subsequently, we show that the projection directions produced by PCA, LDA, LPP and SFA are also produced by the Maximum Likelihood (ML) solution of a single joint probability density function, obtained by selecting one of the defined MRF priors and combining it with a simple observation model. Furthermore, we propose novel Expectation Maximization (EM) algorithms that exploit the proposed joint PDF, and we generalize the proposed methodologies to arbitrary connectivities via parametrizable MRF products. Theoretical analysis and experiments on both simulated and real-world data demonstrate the usefulness of the proposed framework by deriving methods which outperform state-of-the-art equivalents.
KW - HMI-HF: Human Factors
KW - EC Grant Agreement nr.: FP7/288235
KW - EWI-25812
KW - EC Grant Agreement nr.: FP7/2007-2013
KW - Dimensionality Reduction
KW - METIS-309931
KW - Random Fields
KW - Probabilistic Methods
KW - IR-94679
KW - Component Analysis
KW - Unifying Framework
U2 - 10.1007/978-3-662-44851-9_30
DO - 10.1007/978-3-662-44851-9_30
M3 - Conference contribution
SN - 978-3-662-44850-2
T3 - Lecture Notes in Computer Science
SP - 469
EP - 484
BT - Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases, ECML PKDD 2014
A2 - Calders, Toon
A2 - Esposito, Floriana
A2 - Hüllermeier, Eyke
A2 - Meo, Rosa
PB - Springer
CY - Berlin
T2 - European Conference on Machine Learning and Knowledge Discovery in Databases, ECML PKDD 2014
Y2 - 15 September 2014 through 19 September 2014
ER -