2022 • Journal Article

Positive Semi-definite Embedding for Dimensionality Reduction and Out-of-Sample Extensions

Authors:
Michaël Fanuel (UCL), Antoine Aspeel, Jean-Charles Delvenne, Johan Suykens (KU Leuven, ESAT)
Published in:
SIAM Journal on Mathematics of Data Science

Volume: 4 • Number: 1 • Pages: 153-178

In machine learning and statistics, it is often desirable to reduce the dimensionality of a sample of data points in a high-dimensional space ℝ^d. This paper introduces a dimensionality reduction method in which the embedding coordinates are the eigenvectors of a positive semi-definite kernel obtained as the solution of an infinite-dimensional analogue of a semi-definite program. The resulting embedding is adaptive and non-linear. We study this problem under both weak and strong smoothness assumptions on the learned kernel. A main feature of our approach is that, in both cases, there is an out-of-sample extension formula for the embedding coordinates. This extrapolation formula extends the kernel matrix to a data-dependent Mercer kernel function. Our empirical results indicate that this embedding method is more robust to outliers than a spectral embedding method.
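The core construction can be illustrated in a simplified form. The sketch below is not the paper's algorithm: the paper obtains the kernel by solving an infinite-dimensional analogue of a semi-definite program, whereas here a plain Gaussian (RBF) Gram matrix stands in for the learned kernel. It only shows the final step the abstract describes, namely taking the leading eigenvectors of a positive semi-definite kernel matrix as embedding coordinates; the data, bandwidth, and target dimension are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))  # 50 sample points in R^10

# Stand-in for the learned kernel: an RBF Gram matrix, which is
# positive semi-definite by construction.
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-0.5 * sq_dists)

# Embedding coordinates = eigenvectors of K associated with the
# k largest eigenvalues (eigh returns eigenvalues in ascending order).
eigvals, eigvecs = np.linalg.eigh(K)
k = 2
embedding = eigvecs[:, -k:][:, ::-1] * np.sqrt(np.maximum(eigvals[-k:][::-1], 0.0))

print(embedding.shape)  # each row is the 2-D embedding of one data point
```

An out-of-sample extension, as discussed in the paper, would then evaluate the learned kernel between a new point and the sample to extrapolate these coordinates, rather than recomputing the eigendecomposition.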
