Generalizable Sensor-Based Activity Recognition via Categorical Concept Invariant Learning

Authors: Di Xiong, Shuoyuan Wang, Lei Zhang, Wenbo Huang, Chaolei Han

AAAI 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Extensive experiments on four public HAR benchmarks demonstrate that our CCIL substantially outperforms the state-of-the-art approaches under cross-person, cross-dataset, cross-position, and one-person-to-another settings."
Researcher Affiliation | Academia | "1 Nanjing Normal University, Nanjing 210023, Jiangsu, China; 2 Southern University of Science and Technology, Shenzhen 518055, Guangdong, China; 3 Southeast University, Nanjing 211189, Jiangsu, China. EMAIL, EMAIL, EMAIL"
Pseudocode | No | The paper describes the proposed method and learning objective using mathematical formulas (Eqs. 2–6) and textual descriptions in sections such as 'Categorical Concept Invariant Learning' and 'Learning Objective', but it does not include a clearly labeled pseudocode block or algorithm.
Open Source Code | No | The paper does not contain any explicit statements about releasing source code, nor does it provide links to a code repository.
Open Datasets | Yes | "We evaluate our method on four public sensor-based HAR benchmarks: DSADS (Altun, Barshan, and Tunçel 2010), PAMAP2 (Reiss and Stricker 2012), USC-HAD (Zhang and Sawchuk 2012) and UCI-HAR (Anguita et al. 2013)."
Dataset Splits | Yes | "In the DSADS dataset, there are a total of 8 subjects. We divide the 8 subjects into 4 domains, each of which contains two subjects. We use the sliding window technique with a window size of 125 and an overlap rate of 50%. In the USC-HAD dataset, there are a total of 14 subjects. We roughly divide them into four domains, where three of the four domains, each containing four subjects, are used as source domains, while the remaining domain, containing two subjects, is used as the target domain. We use the sliding window technique with a window size of 200 and an overlap rate of 50%. In the PAMAP2 dataset, there are a total of 9 subjects with subject IDs 0–8. We divide them into four domains: (2, 3, 8), (1, 5), (0, 7), (4, 6). We use the sliding window technique with a window size of 200 and an overlap rate of 50%. Following the generalization setup of HAR in DIVERSIFY (Lu et al. 2024), we employed a source-domain validation strategy. The source-domain data was split into training and validation sets with a ratio of 8:2."
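The sliding-window segmentation quoted above can be sketched as follows. This is a minimal illustration, not code from the paper; `sliding_windows` is a hypothetical helper, and the stream length and channel count are example values.

```python
import numpy as np

def sliding_windows(signal, window_size, overlap=0.5):
    """Segment a (T, C) multichannel sensor stream into overlapping windows.

    Illustrates the paper's preprocessing: window size 125 (DSADS)
    or 200 (USC-HAD, PAMAP2) with a 50% overlap rate.
    """
    step = int(window_size * (1 - overlap))
    windows = [
        signal[start:start + window_size]
        for start in range(0, len(signal) - window_size + 1, step)
    ]
    return np.stack(windows)  # shape: (num_windows, window_size, C)

# Example: a DSADS-style stream of 1000 timesteps with 45 channels,
# segmented with window size 125 and 50% overlap.
stream = np.zeros((1000, 45))
wins = sliding_windows(stream, window_size=125)
print(wins.shape)  # (15, 125, 45)
```

Each resulting window becomes one training sample; the 8:2 train/validation split is then applied to the windowed source-domain data.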
Hardware Specification | Yes | "The experiments were conducted on a server equipped with a GeForce 3090 GPU."
Software Dependencies | No | The paper mentions using an "Adam optimizer" but does not specify its version or any other software dependencies with version numbers.
Experiment Setup | Yes | "The maximum training period was set to 150 epochs and an Adam optimizer with a weight decay of 5×10⁻⁴ was used. All methods utilized a learning rate of 10⁻² or 10⁻³. In all experiments, the batch size was set to 32."
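The reported hyperparameters can be sketched in PyTorch as below. This is a hedged reconstruction: the optimizer settings come from the paper, but the model is a placeholder stand-in (the CCIL network itself is not released), and the input/output dimensions are illustrative assumptions.

```python
import torch
from torch import nn

# Placeholder model; the paper's CCIL architecture is not publicly available.
# Input shape assumed: 200 timesteps x 27 channels (e.g. a PAMAP2-style window).
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(200 * 27, 128),
    nn.ReLU(),
    nn.Linear(128, 12),  # number of activity classes is dataset-dependent
)

# Reported setup: Adam, weight decay 5e-4, learning rate 1e-2 or 1e-3,
# batch size 32, at most 150 training epochs.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=5e-4)
MAX_EPOCHS = 150
BATCH_SIZE = 32
```

Since the paper reports two candidate learning rates (10⁻² and 10⁻³), the value here would presumably be selected on the source-domain validation split.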