A Study of the Classification of Low-Dimensional Data with Supervised Manifold Learning

Authors: Elif Vural, Christine Guillemot

JMLR 2017

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "The proposed analysis is supported by experiments on several real data sets." "Keywords: Manifold learning, dimensionality reduction, classification, out-of-sample extensions, RBF interpolation." "In Section 4, we evaluate our results with experiments on several real data sets."
Researcher Affiliation | Academia | "Elif Vural EMAIL Department of Electrical and Electronics Engineering, Middle East Technical University, Ankara, 06800, Turkey. Christine Guillemot EMAIL Centre de Recherche INRIA Bretagne Atlantique, Campus Universitaire de Beaulieu, 35042 Rennes, France."
Pseudocode | No | The paper does not contain any clearly labeled pseudocode or algorithm blocks; it provides mathematical formulations and proofs in the appendices, but no structured algorithm steps.
Open Source Code | No | The paper neither states that source code for the described methodology is released nor links to any code repository.
Open Datasets | Yes | "We evaluate the theoretical results of Section 2 on several real data sets: the COIL-20 object database (Nene et al., 1996), the Yale face database (Georghiades et al., 2001), the ETH-80 object database (Leibe and Schiele, 2003), and the MNIST handwritten digit database (Le Cun et al., 1998)."
Dataset Splits | Yes | "The plots in Figure 5 show the variation of the misclassification rate of test samples in percentage with the ratio of the number of training samples in the whole data set." "The results are the average of 5 repetitions of the experiment with different random choices for the training and test samples."
Hardware Specification | No | The paper does not describe the hardware (e.g., GPU models, CPU types, memory) used to run the experiments; it mentions using a "pretrained Alex Net convolutional neural network" but not the hardware it ran on.
Software Dependencies | No | The paper mentions the "Alex Net convolutional neural network proposed in (Krizhevsky et al., 2012)" as a component used for feature extraction, but it does not specify versions for any software dependencies or libraries used in the implementation.
Experiment Setup | Yes | "The number of neighbors is set as K = 1 for the K-NN algorithm in these experiments... the scale parameter of the kernel regression algorithm is optimized to get the best accuracy... The scale parameter σ of the RBF kernel is set to a reference value in each data set within the typical range [0.5, 1]... We have fixed the weight parameter as µ = 0.01 in all setups, and set the dimension of the embedding as equal to the number of classes."
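The split protocol quoted under Dataset Splits (averaging the test misclassification rate over 5 repetitions with different random train/test choices) can be sketched as below. This is a minimal NumPy sketch of the protocol only, not the authors' code; the function names and the `classify` callback interface are illustrative assumptions.

```python
import numpy as np

def misclassification_rate(y_true, y_pred):
    """Percentage of test samples assigned the wrong label."""
    return 100.0 * np.mean(np.asarray(y_true) != np.asarray(y_pred))

def average_over_splits(X, y, train_ratio, classify, n_repeats=5, seed=0):
    """Average test misclassification over repeated random train/test splits.

    Mirrors the paper's protocol: for a given ratio of training samples
    in the whole data set, repeat the experiment n_repeats times (5 in the
    paper) with different random splits and report the mean error.
    `classify(X_train, y_train, X_test)` must return predicted labels.
    """
    rng = np.random.default_rng(seed)
    rates = []
    for _ in range(n_repeats):
        idx = rng.permutation(len(y))          # fresh random split each repeat
        n_train = int(train_ratio * len(y))
        tr, te = idx[:n_train], idx[n_train:]
        y_pred = classify(X[tr], y[tr], X[te])
        rates.append(misclassification_rate(y[te], y_pred))
    return float(np.mean(rates))
```

Sweeping `train_ratio` over a grid would reproduce the kind of error-versus-training-ratio curves shown in the paper's Figure 5.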
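The two classifiers named in the Experiment Setup quote can also be sketched: K-NN with K = 1, and kernel regression with a Gaussian (RBF) kernel whose scale parameter σ lies in the quoted range [0.5, 1]. These are textbook implementations written to illustrate the quoted settings, assuming the paper's standard definitions of K-NN and Nadaraya-Watson kernel regression; they are not the authors' implementation.

```python
import numpy as np

def knn_classify(X_train, y_train, X_test, K=1):
    """K-NN classification; the paper sets K = 1 in these experiments."""
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    nn = np.argsort(d, axis=1)[:, :K]          # indices of the K nearest neighbors
    labels = y_train[nn]
    # Majority vote over the K neighbor labels (trivial for K = 1).
    return np.array([np.bincount(row).argmax() for row in labels])

def rbf_kernel_regression_classify(X_train, y_train, X_test, sigma=0.7):
    """Nadaraya-Watson regression of one-hot labels with a Gaussian kernel.

    sigma plays the role of the RBF scale parameter; the paper sets it
    to a reference value within the typical range [0.5, 1] per data set.
    """
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))       # Gaussian kernel weights
    n_classes = int(y_train.max()) + 1
    onehot = np.eye(n_classes)[y_train]
    # Kernel-weighted average of class indicators, then argmax.
    scores = w @ onehot / np.clip(w.sum(axis=1, keepdims=True), 1e-12, None)
    return scores.argmax(axis=1)
```

The paper additionally optimizes the kernel-regression scale for best accuracy and fixes a weight parameter µ = 0.01 with the embedding dimension equal to the number of classes; those settings concern the manifold-learning objective itself and are not modeled in this sketch.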