Continual Learning with Filter Atom Swapping

Authors: Zichen Miao, Ze Wang, Wei Chen, Qiang Qiu

ICLR 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Validated on multiple benchmark datasets with different convolutional network structures, the proposed method outperforms state-of-the-art methods in both accuracy and scalability.
Researcher Affiliation | Academia | Zichen Miao, Ze Wang, Wei Chen, Qiang Qiu, Department of ECE, Purdue University
Pseudocode | Yes | We provide the algorithm of the proposed method in Alg. 1. (A hedged sketch of the filter-atom decomposition behind this algorithm appears after the table.)
Open Source Code | No | The paper does not provide explicit statements or links for open-source code availability.
Open Datasets | Yes | We validate our method under the Class-Incremental (CI) setting with CIFAR100 and ImageNet-Subset, which contains 100 classes selected from ImageNet (with random seed 1993).
Dataset Splits | Yes | Details of each dataset are provided in Appendix C. Table A (statistics of 10-Split CIFAR100): 4500 training samples/task, 500 validation samples/task, 100 test samples/task. (A sketch of how such a class-incremental split can be constructed is given after the table.)
Hardware Specification | Yes | All methods are tested on a single RTX 2080 Ti GPU under the class-incremental setting.
Software Dependencies | No | The paper names methods and architectures such as SGD, ResNet, and AlexNet-like networks, but does not specify software libraries or version numbers.
Experiment Setup | Yes | For CIFAR100, we choose SGD with a batch size of 128, learning rate of 0.01, momentum of 0.9, and weight decay of 1e-3. The model is trained for 250 epochs, with the learning rate dropped by a factor of 0.1 at the 100th and 200th epochs. (A PyTorch sketch of this recipe follows the table.)
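For a concrete picture of the method behind the Pseudocode row: below is a minimal PyTorch sketch of a filter-atom convolution, assuming the paper's decomposition of each convolutional filter into a linear combination of a small bank of k×k filter atoms, with task-specific atoms swapped in at task boundaries while the mixing coefficients are shared. The class name `AtomConv2d` and all hyperparameter values are hypothetical illustrations, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AtomConv2d(nn.Module):
    """Convolution whose filters are linear combinations of a small
    bank of k x k filter atoms (hypothetical sketch of the decomposition)."""

    def __init__(self, in_ch, out_ch, kernel_size=3, num_atoms=6, padding=1):
        super().__init__()
        # Shared coefficients: one mixing weight per (out, in, atom) triple.
        self.coeffs = nn.Parameter(torch.randn(out_ch, in_ch, num_atoms) * 0.1)
        # Task-specific atoms: num_atoms small k x k filters.
        self.atoms = nn.Parameter(torch.randn(num_atoms, kernel_size, kernel_size) * 0.1)
        self.padding = padding

    def forward(self, x):
        # Compose full filters (out, in, k, k) from coefficients and atoms.
        w = torch.einsum('oim,mkl->oikl', self.coeffs, self.atoms)
        return F.conv2d(x, w, padding=self.padding)

    def swap_atoms(self, task_atoms):
        # Install the atoms learned for the current task; the much larger
        # coefficient tensor stays fixed and is shared across tasks.
        with torch.no_grad():
            self.atoms.copy_(task_atoms)


layer = AtomConv2d(3, 16)
y = layer(torch.randn(2, 3, 32, 32))  # -> (2, 16, 32, 32)
```

Under this reading, per-task storage is only `num_atoms` small k×k tensors per layer, which is consistent with the scalability claim quoted in the Research Type row.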
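The Open Datasets and Dataset Splits rows pin down a class order (random seed 1993) and per-task counts (4500 train / 500 validation). Here is a minimal sketch of building such a 10-task class-incremental split of CIFAR100 with torchvision, assuming a NumPy `RandomState` permutation for the class order and a random 500-image validation hold-out per task; the paper's exact split protocol may differ.

```python
import numpy as np
from torchvision.datasets import CIFAR100


def class_incremental_splits(root, num_tasks=10, seed=1993):
    # Shuffle the 100 class labels once with a fixed seed, then carve
    # them into num_tasks disjoint groups of 10 classes each.
    class_order = np.random.RandomState(seed).permutation(100)
    tasks = np.array_split(class_order, num_tasks)

    train = CIFAR100(root, train=True, download=True)
    test = CIFAR100(root, train=False, download=True)
    train_targets = np.array(train.targets)
    test_targets = np.array(test.targets)

    splits = []
    for classes in tasks:
        tr_idx = np.where(np.isin(train_targets, classes))[0]
        te_idx = np.where(np.isin(test_targets, classes))[0]
        # Hold out 500 of the 5000 training images per task for validation,
        # matching the 4500/500 counts quoted above.
        rng = np.random.RandomState(seed)
        rng.shuffle(tr_idx)
        splits.append({'train': tr_idx[500:], 'val': tr_idx[:500], 'test': te_idx})
    return splits
```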
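The Experiment Setup row translates directly into a standard PyTorch optimizer and milestone schedule; a minimal sketch, where `net` is a placeholder for the actual network and the training loop itself is omitted:

```python
import torch

net = torch.nn.Conv2d(3, 16, 3)  # placeholder for the actual model

optimizer = torch.optim.SGD(net.parameters(), lr=0.01,
                            momentum=0.9, weight_decay=1e-3)
# Drop the learning rate by a factor of 0.1 at epochs 100 and 200,
# over 250 epochs in total.
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[100, 200], gamma=0.1)

for epoch in range(250):
    # ... one training pass over the task's loader (batch size 128) ...
    scheduler.step()
```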