Efficient Structure-preserving Support Tensor Train Machine

Authors: Kirandeep Kour, Sergey Dolgov, Martin Stoll, Peter Benner

JMLR 2023

For each reproducibility variable below, the assessed result is followed by the LLM response (supporting quotes from the paper).
Research Type: Experimental
Evidence: "In Section 4 we benchmark the different steps of the proposed algorithm and compare it to a variety of competing methods, using two data sets each from two different fields with a limited amount of training data, which are known to be challenging for classification."; "All numerical experiments have been done in MATLAB 2016b."; "Table 2: Average classification accuracy in percentage for different methods and data sets."
Researcher Affiliation: Academia
Evidence: Max Planck Institute for Dynamics of Complex Technical Systems, Magdeburg, D-39106, Germany; Department of Mathematical Sciences, University of Bath, Bath BA2 7AY, United Kingdom; Faculty of Mathematics, Technische Universität Chemnitz, Chemnitz, D-09107, Germany.
Pseudocode: Yes
Evidence: "Algorithm 1: Uniqueness Enforcing TT-SVD"; "Algorithm 2: TT-CP approximation of the STM Kernel"
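To make the first algorithm's role concrete, here is a minimal sketch of a plain TT-SVD in Python/NumPy: it sweeps over the tensor modes, truncating an SVD at each step to produce the train of three-way cores. This is a generic TT-SVD, not the paper's uniqueness-enforcing variant (Algorithm 1), which additionally fixes the sign/ordering ambiguities of the SVD factors; the function name and rank-capping policy are illustrative assumptions.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Sketch of a plain TT-SVD (not the paper's uniqueness-enforcing variant).

    Sweeps left to right, reshaping the remainder into a matrix, taking a
    truncated SVD, and storing the left factor as a 3-way TT core of shape
    (r_prev, n_k, r_k). All ranks are capped at `max_rank`.
    """
    dims = tensor.shape
    d = len(dims)
    cores = []
    r_prev = 1
    mat = tensor.reshape(dims[0], -1)
    for k in range(d - 1):
        # Fold the previous rank into the row dimension before the SVD.
        mat = mat.reshape(r_prev * dims[k], -1)
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        # Carry the (truncated) singular values into the remainder.
        mat = s[:r, None] * Vt[:r]
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores
```

With `max_rank` large enough the truncation is exact, so contracting the cores back together recovers the original tensor to machine precision.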
Open Source Code: Yes
Evidence: "The codes are available publicly on GitHub": https://github.com/mpimd-csc/Structure-preserving_STTM
Open Datasets: Yes
Evidence: "Alzheimer's Disease (ADNI): ADNI stands for Alzheimer's Disease Neuroimaging Initiative. ..." (http://adni.loni.usc.edu/); "Attention Deficit Hyperactivity Disorder (ADHD): The ADHD data set is collected from the ADHD-200 global competition data set. ..." (http://neurobureau.projects.nitrc.org/ADHD200/Data.html); "Hyperspectral Image (HSI) Datasets: We have taken the mat file for both the datasets and their corresponding labels. ..." (http://www.ehu.eus/ccwintco/index.php/Hyperspectral_Remote_Sensing_Scenes); "The HSI images were collected via the AVIRIS sensor over the Indian Pines test site." (https://aviris.jpl.nasa.gov/)
Dataset Splits: Yes
Evidence: "Since the precise magnitude of the noise is unknown, we carry out a k-fold cross-validation test (k = 5) to find the optimal TT rank."; "For tuning R, σ and C to the best classification accuracy, we use k-fold cross-validation with k = 5."
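The rank-selection protocol quoted above can be sketched as a standard 5-fold loop: shuffle the samples, hold out each fold in turn, and keep the TT rank with the best mean validation score. The `train_and_score` callback below is a hypothetical stand-in for training the paper's model at a given rank; only the fold mechanics reflect the quoted procedure.

```python
import numpy as np

def kfold_indices(n_samples, k=5, seed=0):
    """Shuffle sample indices and split them into k disjoint folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n_samples), k)

def cross_validate_rank(X, y, ranks, train_and_score, k=5):
    """Pick the TT rank R with the best mean k-fold validation score.

    `train_and_score(X_tr, y_tr, X_va, y_va, R)` is a hypothetical
    callback standing in for training the classifier at rank R and
    returning its validation accuracy.
    """
    folds = kfold_indices(len(y), k)
    best_rank, best_score = None, -np.inf
    for R in ranks:
        scores = []
        for i in range(k):
            va = folds[i]
            tr = np.concatenate([folds[j] for j in range(k) if j != i])
            scores.append(train_and_score(X[tr], y[tr], X[va], y[va], R))
        mean_score = float(np.mean(scores))
        if mean_score > best_score:
            best_rank, best_score = R, mean_score
    return best_rank, best_score
```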
Hardware Specification: Yes
Evidence: "We have run all experiments on a machine equipped with Ubuntu release 16.04.6 LTS 64-bit, 7.7 GiB of memory, and an Intel Core i5-6600 CPU @ 3.30 GHz (4 cores)."
Software Dependencies: Yes
Evidence: "All numerical experiments have been done in MATLAB 2016b."; "In the first step, we compute the TT format of an input tensor using the TT-Toolbox, where we modified the function @tt_tensor/round.m to enforce the uniqueness-enforcing TT-SVD (Section 3.1). Moreover, we have implemented the TT-CP conversion, together with the norm equilibration. For the training of the TT-MMK model, we have used the svmtrain function available in the LIBSVM library."
Experiment Setup: Yes
Evidence: "The entire TT-SVM model depends on three parameters. First, to simplify the selection of TT ranks, we take all TT ranks equal to the same value R ∈ {1, 2, ..., 10}. Another parameter is the width σ of the Gaussian kernel. Finally, the third parameter is a trade-off constant C for the KSTM optimization technique (6). Both σ and C are chosen from {2^{-8}, 2^{-7}, ..., 2^{7}, 2^{8}}. For tuning R, σ, and C to the best classification accuracy, we use k-fold cross-validation with k = 5. Along with this, we repeat all computations 20 times and compute statistics (average, standard deviation, and numerical quantiles) over these runs."
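The quoted search space and repetition protocol can be sketched as an exhaustive grid search plus a statistics pass. The grids mirror the paper's ranges (R in 1..10, σ and C over powers of two from 2^{-8} to 2^{8}); the `evaluate` and `run_once` callbacks are hypothetical stand-ins for a cross-validated accuracy measurement and a single repeated experiment.

```python
import itertools
import numpy as np

# Grids following the paper's stated search space.
RANKS = range(1, 11)                          # R in {1, ..., 10}
SIGMAS = [2.0 ** p for p in range(-8, 9)]     # sigma in {2^-8, ..., 2^8}
CS = [2.0 ** p for p in range(-8, 9)]         # C in {2^-8, ..., 2^8}

def grid_search(evaluate):
    """Return the (R, sigma, C) triple maximizing a hypothetical
    cross-validated accuracy callback `evaluate(R, sigma, C)`."""
    return max(itertools.product(RANKS, SIGMAS, CS),
               key=lambda params: evaluate(*params))

def repeat_stats(run_once, n_repeats=20):
    """Repeat an experiment and report mean, standard deviation, and
    quartiles, mirroring the statistics computed over the paper's 20 runs."""
    acc = np.array([run_once(seed) for seed in range(n_repeats)])
    return {"mean": float(acc.mean()),
            "std": float(acc.std(ddof=1)),
            "q25": float(np.quantile(acc, 0.25)),
            "median": float(np.quantile(acc, 0.5)),
            "q75": float(np.quantile(acc, 0.75))}
```

Note that the full grid has 10 × 17 × 17 = 2890 triples, and each evaluation itself runs a 5-fold cross-validation, which is why the paper reports repeated-run statistics rather than a single number.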