Ramp Loss Linear Programming Support Vector Machine

Authors: Xiaolin Huang, Lei Shi, Johan A.K. Suykens

JMLR 2014

Reproducibility
Variable | Result | LLM Response
Research Type | Experimental | In the numerical experiments, we evaluate the performance of ramp-LPSVM (6) and its problem-solving algorithms. We first report the optimization performance and then discuss the robustness and the sparsity compared with C-SVM, LPSVM (5), and ramp-SVM (11).
Researcher Affiliation | Academia | Xiaolin Huang (Department of Electrical Engineering, ESAT-STADIUS, KU Leuven, Kasteelpark Arenberg 10, Leuven, B-3001, Belgium); Lei Shi (Department of Electrical Engineering, ESAT-STADIUS, KU Leuven; School of Mathematical Sciences, Fudan University, Shanghai, 200433, P.R. China); Johan A.K. Suykens (Department of Electrical Engineering, ESAT-STADIUS, KU Leuven, Kasteelpark Arenberg 10, Leuven, B-3001, Belgium)
Pseudocode | Yes | Algorithm 1: DC programming for ramp-LPSVM from α̂, b̂ [...] Algorithm 2: Global Search for ramp-LPSVM
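The DC (difference-of-convex) programming in Algorithm 1 rests on the standard decomposition of the ramp loss into a difference of two shifted hinge functions. A minimal numerical sketch of that decomposition (the function names here are ours, not the paper's):

```python
import numpy as np

def hinge(t, s):
    # Shifted hinge H_s(t) = max(0, s - t), convex in t.
    return np.maximum(0.0, s - t)

def ramp(t):
    # Ramp loss: the hinge loss clipped at 1.
    return np.minimum(1.0, np.maximum(0.0, 1.0 - t))

# DC decomposition: ramp(t) = H_1(t) - H_0(t), i.e. a difference
# of two convex functions, which is what DC programming exploits.
t = np.linspace(-3.0, 3.0, 601)
assert np.allclose(ramp(t), hinge(t, 1.0) - hinge(t, 0.0))
```

Because each DC iteration linearizes the concave part (−H_0) and solves the remaining convex problem, the scheme converges to a local optimum, which is why the paper pairs it with a global search (Algorithm 2).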
Open Source Code | No | The paper mentions using 'Matlab R2011a' and a 'Genetic Algorithm (GA) toolbox' developed by Chipperfield et al. (1994) for experiments, but it does not provide any statement or link to open-source code for the authors' own proposed methodology (ramp-LPSVM).
Open Datasets | Yes | The data are downloaded from the UCI Machine Learning Repository given by Frank and Asuncion (2010).
Dataset Splits | Yes | In data sets Spect, Monk1, Monk2, and Monk3, the training and the testing sets are provided. For the others, we randomly partition the data into two parts: half of the data are used for training and the remaining data for testing.
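The random half/half partition described for the data sets without predefined splits can be sketched as follows (a hypothetical helper mirroring the paper's protocol, not the authors' code):

```python
import numpy as np

def half_split(n_samples, seed=0):
    # Randomly permute sample indices and assign half to training,
    # the rest to testing, as in the paper's evaluation protocol.
    rng = np.random.default_rng(seed)
    perm = rng.permutation(n_samples)
    half = n_samples // 2
    return perm[:half], perm[half:]

train_idx, test_idx = half_split(100)
```

Repeating this split over several seeds and averaging test error is the usual way such randomized partitions are reported.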
Hardware Specification | Yes | The experiments are done in Matlab R2011a in Core 2-2.83 GHz, 2.96G RAM.
Software Dependencies | Yes | The experiments are done in Matlab R2011a.
Experiment Setup | Yes | In our experiments, we apply a Gaussian kernel K(x_i, x_j) = exp(-||x_i - x_j||^2 / σ^2). The training data are normalized to [0, 1]^n and then the regularization coefficient µ and the kernel parameter σ are tuned by 10-fold cross-validation for each method. In the tuning phase, grid search using a logarithmic scale is applied. The range of possible µ values is [10^-2, 10^3] and the range of σ values is between 10^-3 and 10^2. For ramp-LPSVM, since the global search needs more computation time, parameter tuning by cross-validation is conducted based on Algorithm 1. [...] Set δ (the threshold of convergence for DC programming), ε (the difference value in hill detouring), Kstep (the maximal number of hill detouring steps).
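The described tuning procedure (logarithmic grids for µ in [10^-2, 10^3] and σ in [10^-3, 10^2], selected by 10-fold cross-validation) can be sketched with scikit-learn. Since the authors' ramp-LPSVM code is not released, a standard RBF-kernel SVC serves as a stand-in classifier here; note that sklearn's C plays a role analogous, but not identical, to the paper's µ, and its `gamma` parameterizes the Gaussian kernel as gamma = 1/σ²:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Logarithmic grids matching the reported ranges:
# mu (regularization) in [1e-2, 1e3], sigma (kernel width) in [1e-3, 1e2].
mu_grid = np.logspace(-2, 3, 6)
sigma_grid = np.logspace(-3, 2, 6)
param_grid = {"C": mu_grid, "gamma": 1.0 / sigma_grid**2}

# Toy data normalized to [0, 1]^n, as in the paper's setup.
rng = np.random.default_rng(0)
X = rng.random((60, 2))
y = np.where(X[:, 0] + X[:, 1] > 1.0, 1, -1)

# Exhaustive grid search with 10-fold cross-validation.
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=10)
search.fit(X, y)
```

After fitting, `search.best_params_` holds the selected (C, gamma) pair and `search.best_estimator_` is refit on the full training set with those values.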