An Intuitive Introduction to Laplacian Eigenmaps for Dimensionality Reduction

By Weijie Zhang and Kaiwen Bian

June, 2024

Introduction to Dimension Reduction


Here we have 300 artwork images from the collection at the Metropolitan Museum of Art (MET) to illustrate the power of dimension reduction algorithms (e.g., Principal Component Analysis, spectral embeddings). Each image in this dataset lives in a very high-dimensional space (128 width x 128 height x 3 color channels), and for visualization purposes they are all positioned randomly on a 2D plane.

Are there any "principal" directions in these 49,152-length vectors (flattened images) that can give us a better understanding of the data, perhaps for a later task such as classification? This question has always been an important one to ask, because high-dimensional data are very noisy, and simple classifiers trained on high-dimensional data are often neither robust nor accurate in their predictions.
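To make the flattening step concrete, here is a minimal sketch using NumPy. The array below is a blank placeholder with the same shape as one image in the dataset, not an actual MET artwork:

```python
import numpy as np

# Placeholder with the same shape as one image in the dataset:
# 128 pixels wide, 128 pixels tall, 3 color channels (not a real MET image).
image = np.zeros((128, 128, 3))

# Flattening turns the image into a single long vector.
vector = image.reshape(-1)
print(vector.shape)  # (49152,)
```

Every image thus becomes a point in a 49,152-dimensional space, which is the input to any dimension reduction method.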

Modern representation learning approaches lend us a way of looking at this problem. In particular, for this visualization project we adopt a spectral embedding dimension reduction technique known as the Laplacian Eigenmap, which reduces the dimensionality of the dataset while preserving the relationships between data points. We will explain more in the following sections.
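As a rough sketch of what such an embedding looks like in code, scikit-learn's SpectralEmbedding implements Laplacian Eigenmaps. The random matrix below is only a stand-in for the real flattened image vectors, and a smaller dimensionality is used to keep the example quick to run:

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding

# Stand-in for 300 flattened images; the real vectors would be
# 49,152-dimensional, but a smaller dimension keeps this sketch fast.
rng = np.random.default_rng(seed=0)
X = rng.random((300, 768))

# SpectralEmbedding builds a nearest-neighbor graph over the points and
# embeds them using eigenvectors of the graph Laplacian.
embedder = SpectralEmbedding(n_components=2, n_neighbors=10, random_state=0)
X_2d = embedder.fit_transform(X)

print(X_2d.shape)  # each image now has a 2D position: (300, 2)
```

The resulting 2D coordinates are what would replace the random positions in the visualization above.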