A Convex Parametrization of a New Class of Universal Kernel Functions
Authors: Brendon K. Colbert, Matthew M. Peet
JMLR 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Numerical testing on soft margin Support Vector Machine (SVM) problems shows that algorithms using TK kernels outperform other kernel learning algorithms and neural networks. Furthermore, our results show that when the ratio of the number of training data to features is high, the improvement of TK over MKL increases significantly. |
| Researcher Affiliation | Academia | Brendon K. Colbert EMAIL Department of Mechanical and Aerospace Engineering Arizona State University Tempe, AZ 85281-4322, USA |
| Pseudocode | No | The paper describes mathematical formulations and optimization problems, such as Optimization Problem (24) and (25), but does not present any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any explicit statements about the release of source code or links to a code repository for the methodology described. |
| Open Datasets | Yes | To evaluate the accuracy, we applied 5 variations of the kernel learning problem to 5 randomly selected benchmark data sets from the UCI Machine Learning Data Repository: Liver, Cancer, Heart, Pima, and Ionosphere. |
| Dataset Splits | Yes | In all evaluations of Test Set Accuracy (TSA), the data is partitioned into 80% training data and 20% testing and this partition is repeated 30 times to obtain 30 sets of training and testing data. For all numerical tests we use the soft-margin problem with regularization parameter C, where C is selected from a set of values picked a priori by 5-fold cross-validation. To perform 5-fold cross-validation we split the training data set into five groups, solve the optimization problem using each potential value of C on four of the five groups and test the optimal classifier performance on the remaining group. |
| Hardware Specification | No | The paper mentions using MATLAB's patternnet implementation for neural networks, but does not provide any specific details about the hardware used for running experiments. |
| Software Dependencies | Yes | This problem can now be solved using well-developed interior-point methods as in Alizadeh et al. (1998) with implementations such as MOSEK (ApS, 2015). |
| Experiment Setup | Yes | For all numerical tests we use the soft-margin problem with regularization parameter C, where C is selected from a set of values picked a priori by 5-fold cross-validation. To determine the integral in (24), we first scaled the data so that x_i ∈ [0, 1]^n, and then set X := [0 − ϵ, 1 + ϵ]^n, where ϵ > 0 was chosen by 5-fold cross-validation. We use a 3-layer neural network with 50 hidden nodes using MATLAB's patternnet implementation. |
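The evaluation protocol quoted above (30 random 80%/20% train/test partitions, with the regularization parameter C chosen from an a-priori grid by 5-fold cross-validation on each training set) can be sketched as follows. This is a minimal illustration of the splitting and model-selection loop only, not the authors' TK-kernel solver: `C_GRID` is an assumed placeholder grid, and `score_fn` stands in for training and scoring a soft-margin SVM.

```python
# Sketch of the paper's reported evaluation protocol: 30 repeated 80/20
# partitions, with C picked by 5-fold cross-validation on the training set.
# The SVM itself is stubbed out: score_fn(train_idx, eval_idx, c) is a
# hypothetical callback that trains on train_idx and returns accuracy on
# eval_idx. C_GRID is illustrative, not the grid used in the paper.
import random

C_GRID = [0.1, 1.0, 10.0, 100.0]

def five_fold_cv(train_idx, score_fn):
    """Return the C from C_GRID with the best mean validation score."""
    # Split the training indices into five disjoint folds.
    folds = [train_idx[i::5] for i in range(5)]
    best_c, best_score = None, float("-inf")
    for c in C_GRID:
        scores = []
        for k in range(5):
            val = folds[k]
            fit = [i for j, f in enumerate(folds) if j != k for i in f]
            scores.append(score_fn(fit, val, c))
        mean = sum(scores) / len(scores)
        if mean > best_score:
            best_c, best_score = c, mean
    return best_c

def evaluate(n_samples, score_fn, n_repeats=30, seed=0):
    """Repeat an 80/20 partition n_repeats times; return the test scores."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_repeats):
        idx = list(range(n_samples))
        rng.shuffle(idx)
        cut = int(0.8 * n_samples)
        train, test = idx[:cut], idx[cut:]
        c = five_fold_cv(train, score_fn)          # model selection on train only
        results.append(score_fn(train, test, c))   # final test-set accuracy
    return results
```

In a real run, `score_fn` would solve the soft-margin SVM problem for the given C and kernel and return test-set accuracy; here the structure only shows that the test partition never influences the choice of C.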