Learning Using Anti-Training with Sacrificial Data

Authors: Michael L. Valenzuela, Jerzy W. Rozenblit

JMLR 2016

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "An extensive case study is presented along with simulated annealing results to demonstrate the efficacy of the ATSD method. We extensively discuss several experiments in Section 5 and the results in Section 6."
Researcher Affiliation | Academia | "Michael L. Valenzuela EMAIL, Jerzy W. Rozenblit EMAIL, Electrical and Computer Engineering Department, University of Arizona, 1230 E. Speedway Blvd., Tucson, AZ 85721, UNITED STATES"
Pseudocode | Yes | "Algorithm 1: Anti-Training Combined with Meta-Learning"
Open Source Code | Yes | "All 3.1 GB of the data is available, but we also offer just the Matlab scripts (only 100 KB) that can generate statistically similar data."
Open Datasets | Yes | "This data, referred to as the 'adult' data, is made publicly available and can be found in the UCI Machine Learning Repository (Bache and Lichman, 2013)."
Dataset Splits | Yes | "We partition the adult_data.csv data into 1/4 for training and 3/4 for validation for each f ∈ F+."
Hardware Specification | Yes | "We ran the experiments using a cluster of three standard-grade (as of the year 2008) computers. One computer is a Gateway E6610Q model, using the Intel QX6700 quad-core processor. Two other computers are custom built, one using the Intel Q6600 quad-core processor and the third using the Intel i7-3770K quad-core processor. All computers have 4 GB of RAM (DDR2 800, DDR3 1033, DDR3 1600)."
Software Dependencies | No | "Two of the three computers run Matlab on Windows 7. The third computer runs Matlab in Ubuntu. ... Since the core kernel of the simulation is already optimized and Matlab's ODE45 was still the bottleneck (taking over 98% of the time), we copied the ODE45 code and removed all the code for options." The paper names its software stack (Matlab, Windows 7, Ubuntu, ODE45) but gives no version numbers, which are needed for a reproducible description of ancillary software.
Experiment Setup | Yes | "The hyper-parameters we investigate are: step length function (two choices), initial temperatures (four real numbers), cooling function (three choices), reannealing time (one integer), upper bounds (four real numbers), and lower bounds (four real numbers). Specifically, the SVM's kernel, box constraint (training error versus complexity), and kernel parameters (e.g., Gaussian kernel width)."
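The 1/4-training / 3/4-validation partition quoted under Dataset Splits can be sketched as follows. The authors' code is in Matlab, so this Python version over toy data is only a minimal illustration of the split ratio, not their implementation.

```python
import random

def split_quarter_train(data, seed=0):
    # Shuffle indices, then take the first 1/4 of the rows for
    # training and the remaining 3/4 for validation.
    rng = random.Random(seed)
    idx = list(range(len(data)))
    rng.shuffle(idx)
    cut = len(data) // 4
    train = [data[i] for i in idx[:cut]]
    valid = [data[i] for i in idx[cut:]]
    return train, valid

rows = [(x, x % 2) for x in range(20)]  # toy (feature, label) pairs
train, valid = split_quarter_train(rows)
print(len(train), len(valid))  # 5 15
```

Per the quote, this partition is re-drawn for each candidate f in F+, which the `seed` argument would control.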
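The hyper-parameter space quoted under Experiment Setup can be written down as a configuration sketch. Every concrete name and numeric value below is a hypothetical placeholder; the quote specifies only the count and kind of each hyper-parameter.

```python
# Illustrative encoding of the simulated-annealing hyper-parameter space.
# Names and values are placeholders, not taken from the paper.
sa_space = {
    "step_length_fn": ["fixed", "adaptive"],                 # two choices
    "initial_temps": [100.0, 10.0, 1.0, 0.1],                # four real numbers
    "cooling_fn": ["exponential", "logarithmic", "linear"],  # three choices
    "reannealing_time": 50,                                  # one integer
    "upper_bounds": [10.0, 10.0, 10.0, 10.0],                # four real numbers
    "lower_bounds": [0.1, 0.1, 0.1, 0.1],                    # four real numbers
}

# The SVM-side quantities being tuned, per the quote.
svm_space = {
    "kernel": "gaussian",   # e.g. a Gaussian (RBF) kernel
    "box_constraint": 1.0,  # training error vs. complexity trade-off
    "kernel_width": 0.5,    # Gaussian kernel width
}

# Number of purely categorical combinations (step length x cooling function):
combos = len(sa_space["step_length_fn"]) * len(sa_space["cooling_fn"])
print(combos)  # 6
```

A search over this space would sweep the categorical entries exhaustively and tune the real-valued entries within the stated bounds.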