MDFG: Multi-Dimensional Fine-Grained Modeling for Fatigue Detection

Authors: Mei Wang, Xiaojie Zhu, Ruimin Hu, Dongliang Zhu, Liang Liao, Mang Ye

AAAI 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | MDFG achieves an average accuracy improvement of 10.0% and 12.1% on two real datasets compared to methods that do not consider fine-grained information. Extensive experiments demonstrate that MDFG exhibits superior robustness and stability among current fatigue detection methods.
Researcher Affiliation | Academia | 1 National Engineering Research Center for Multimedia Software, School of Computer Science, Wuhan University; 2 Hubei Key Laboratory of Multimedia and Network Communication Engineering, Wuhan University; 3 School of Cyber Science and Engineering, Wuhan University, Wuhan, China; 4 Cyberspace Security Laboratory, School of Network and Information Security, Xidian University, Xi'an, China; 5 School of Computer Science and Engineering, Nanyang Technological University, Singapore
Pseudocode | No | The paper describes the methodology in prose and mathematical equations but does not include any structured pseudocode or algorithm blocks.
Open Source Code | Yes | The code and checkpoints are released at https://github.com/MeiWang003/MDFG.
Open Datasets | Yes | This paper conducts experiments on the re-RLDD and re-DROZY datasets (Wang et al. 2024a).
Dataset Splits | Yes | Both datasets are trained using data from the first eight individuals, while the remaining individuals are used for testing.
Hardware Specification | Yes | The experiment is implemented using PyTorch 1.13.1 on two RTX 3090 GPUs.
Software Dependencies | Yes | The experiment is implemented using PyTorch 1.13.1 on two RTX 3090 GPUs.
Experiment Setup | Yes | The ϕF network is trained with Adam, using a weight decay of 1e-3, a learning rate of 0.001, for 200 epochs. In the experiment, each instance is 30 seconds, with a 1-second sliding window and 0.5-second step size, all sampled at the video's frame rate. [...] Dropout (0.5).
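The subject-wise dataset split reported above (train on the first eight individuals, test on the rest) can be sketched as follows. This is a minimal illustration, not the paper's code: the `subject_id` field and the sample-record layout are assumptions.

```python
def split_by_subject(samples, n_train_subjects=8):
    """Partition samples by subject identity: the first eight
    individuals form the training set, the rest the test set.
    Each sample is assumed to be a dict with a 'subject_id' key."""
    train, test = [], []
    for sample in samples:
        if sample["subject_id"] < n_train_subjects:  # subjects 0..7 -> train
            train.append(sample)
        else:
            test.append(sample)
    return train, test

# Dummy example: 10 subjects, one sample each.
samples = [{"subject_id": i} for i in range(10)]
train, test = split_by_subject(samples)
print(len(train), len(test))  # -> 8 2
```

Splitting by subject rather than by sample avoids leaking frames of the same person into both sets, which matters for a fatigue-detection benchmark.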
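The windowing arithmetic in the experiment-setup row (30-second instances, 1-second windows, 0.5-second step) can be sketched as below. The 30 fps value is an assumption for illustration, since the table only says windows are sampled at the video's frame rate; the reported Adam hyperparameters are noted as comments.

```python
def sliding_windows(num_frames, fps, win_sec=1.0, step_sec=0.5):
    """Return (start, end) frame-index pairs for overlapping windows
    over one instance sampled at `fps` frames per second."""
    win = int(win_sec * fps)    # frames per window
    step = int(step_sec * fps)  # frames per step
    return [(s, s + win) for s in range(0, num_frames - win + 1, step)]

# A 30-second instance at an assumed 30 fps -> 900 frames.
windows = sliding_windows(num_frames=30 * 30, fps=30)
print(len(windows))  # -> 59 overlapping 30-frame windows

# Training settings as reported for the phi_F network (PyTorch):
#   torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-3)
#   200 epochs, Dropout(p=0.5)
```

With a 0.5-second step, consecutive windows overlap by half their length, so each second of video contributes to two windows.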