An Effective Manifold-based Optimization Method for Distributionally Robust Classification

Authors: Jiawei Huang, Hu Ding

ICLR 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We conduct a set of experiments on several popular benchmark datasets, where the results demonstrate our advantages in terms of accuracy and robustness. (...) We conduct a series of experiments across several scenarios to evaluate our proposed method. Specifically, we test it on three typical distribution shift tasks: dealing with noisy data, attacked data, and imbalanced data. All experiments are implemented with PyTorch on a single NVIDIA RTX 6000 Ada."
Researcher Affiliation | Academia | Jiawei Huang (1,2), Hu Ding (1). (1) School of Computer Science and Technology, University of Science and Technology of China; (2) Department of Computer Science, City University of Hong Kong. EMAIL, EMAIL
Pseudocode | Yes | "Algorithm 1 Distributional Robust in Data Manifold M" (...) "Algorithm 2 Evolve(Ξ, g(·), ν)"
Open Source Code | No | The paper does not explicitly provide a link to source code, nor does it state that code is available in supplementary materials or will be released.
Open Datasets | Yes | "We use CIFAR-10/100 (Krizhevsky et al., 2012), Tiny ImageNet-200 (Krizhevsky et al., 2017), and a medical imaging dataset BUS (Mo et al., 2023) for image classification. (...) Figure 10 illustrates the perturbation trajectories on the MNIST dataset."
Dataset Splits | No | The paper evaluates on the long-tailed benchmarks CIFAR-10-LT and CIFAR-100-LT with imbalance factors of 10, 50, and 100 for training, and refers to a validation set for early stopping. However, it does not give explicit split percentages or sample counts for general training/validation/test splits across all datasets, nor detailed citations to predefined splits.
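The imbalance factors reported above follow the usual long-tailed-benchmark convention: the ratio between the largest and smallest per-class sample counts. As an illustration only (the paper does not publish its construction code), here is a minimal sketch of the standard exponential decay profile commonly used to build CIFAR-10-LT-style subsets; the function name and profile choice are assumptions, not taken from the paper:

```python
def long_tail_counts(n_max, num_classes, imbalance_factor):
    """Per-class sample counts under an exponential long-tail profile:
    class i keeps n_max * imbalance_factor^(-i / (num_classes - 1)) samples,
    so class 0 keeps n_max and the last class keeps n_max / imbalance_factor."""
    return [int(n_max * imbalance_factor ** (-i / (num_classes - 1)))
            for i in range(num_classes)]

# CIFAR-10 has 5000 training images per class; with imbalance factor 100,
# the head class keeps all 5000 and the tail class keeps 50.
counts = long_tail_counts(5000, 10, 100)
```

A subset with these per-class counts is then drawn from the full training set, while the test set is typically left balanced.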
Hardware Specification | Yes | "All experiments are implemented with PyTorch on a single NVIDIA RTX 6000 Ada."
Software Dependencies | No | The paper mentions "implemented with PyTorch" and refers to "PyTorch (Paszke et al., 2019)" but does not provide specific version numbers for PyTorch or any other software libraries used.
Experiment Setup | Yes | "In our experimental setup, we set the hyperparameters λ1 = 0.01 and λ2 = 1 in Eq.(8) and Eq.(12) respectively to configure the algorithm, unless otherwise specified. (...) We set ν0 = 0.5 across all experiments. (...) We train each method with 200 epochs."
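For quick reference, the hyperparameters quoted above can be collected in one place. This is a sketch for reproduction attempts, not the authors' code; the variable names are my own:

```python
# Hyperparameters as stated in the paper's experimental setup
# (key names are assumed; values are the reported defaults).
config = {
    "lambda1": 0.01,  # λ1, weight in Eq.(8), unless otherwise specified
    "lambda2": 1.0,   # λ2, weight in Eq.(12), unless otherwise specified
    "nu0": 0.5,       # ν0, initial value used across all experiments
    "epochs": 200,    # training epochs for each method
}
```

Anyone reimplementing the method would still need to recover unreported details (optimizer, learning rate schedule, batch size) from the paper's appendix or by contacting the authors.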