FuseUNet: A Multi-Scale Feature Fusion Method for U-like Networks

Authors: Quansong He, Xiangde Min, Kaishen Wang, Tao He

ICML 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments on ACDC, KiTS2023, MSD brain tumor, and ISIC2017/2018 skin lesion segmentation datasets demonstrate improved feature utilization, reduced network parameters, and maintained high performance. ... Table 3 presents the performance of the proposed method on 3D tasks. ... Table 4 shows the performance of the proposed method on 2D tasks. ... 4.3. Ablation Experiments: the impact of the number of feature fusion steps ... the impact of memory capacity.
Researcher Affiliation | Academia | 1College of Computer Science, Sichuan University, Chengdu, China; 2Tongji Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China; 3Department of Computer Science, University of Maryland, College Park, USA. Correspondence to: Tao He <tao EMAIL>.
Pseudocode | Yes | Algorithm 1: Adaptive discrete method for U-Nets. ... This algorithm is outlined in Algorithm 1, and the detailed computational process is provided in Appendix A. ... Table 6. Workflow and Results. Take a network with 6 stages as an example for demonstration. P, C, Cal, F stand for Predictor, Corrector, Calculator, nm ODEs block, respectively.
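The quoted pseudocode names Predictor and Corrector blocks, which suggests a linear multistep predictor-corrector scheme. As a hedged illustration (not the paper's algorithm; the function `predictor_corrector`, the scalar state, and the step counts are assumptions for demonstration), here is a minimal two-step Adams-Bashforth predictor with an Adams-Moulton (trapezoidal) corrector of the kind such blocks typically implement:

```python
def predictor_corrector(f, t0, y0, h, steps):
    """Integrate y' = f(t, y) with an AB2 predictor and an
    AM2 (trapezoidal) corrector. Illustrative sketch only."""
    ts, ys = [t0], [y0]
    fs = [f(t0, y0)]
    # bootstrap the two-step method with a single Euler step
    y1 = y0 + h * fs[0]
    ts.append(t0 + h)
    ys.append(y1)
    fs.append(f(t0 + h, y1))
    for _ in range(1, steps):
        t_next = ts[-1] + h
        # predictor: AB2 extrapolates from the two stored slopes
        y_pred = ys[-1] + h * (1.5 * fs[-1] - 0.5 * fs[-2])
        # corrector: trapezoidal rule re-evaluates at the predicted point
        y_corr = ys[-1] + 0.5 * h * (fs[-1] + f(t_next, y_pred))
        ts.append(t_next)
        ys.append(y_corr)
        fs.append(f(t_next, y_corr))
    return ts, ys
```

In a U-like network, the stored slopes would correspond to feature maps from earlier decoder stages rather than scalar derivatives, but the predict-then-correct bookkeeping is the same.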
Open Source Code | Yes | The code is available at https://github.com/nayutayuki/FuseUNet.
Open Datasets | Yes | Table 2 presents the details of the datasets used in this paper. ... ACDC (Bernard et al., 2018) ... KiTS23 (Heller et al., 2023) ... MSD (Antonelli et al., 2022) ... ISIC2017 (Codella et al., 2018a) ... ISIC2018 (Codella et al., 2018b)
Dataset Splits | Yes | All 3D tasks report Dice scores (%) using fivefold cross-validation, following the protocol of the backbone network.
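The fivefold protocol above amounts to partitioning the case IDs into five disjoint folds and rotating the validation fold. A minimal sketch of that bookkeeping (the function name `kfold_indices`, the shuffle seed, and random fold assignment are assumptions; the paper defers to the backbone's own split protocol):

```python
import random

def kfold_indices(n_samples, k=5, seed=0):
    """Return k (train_idx, val_idx) pairs covering all samples,
    with each sample appearing in exactly one validation fold."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    # deal the shuffled indices round-robin into k folds
    folds = [idx[i::k] for i in range(k)]
    splits = []
    for i in range(k):
        val = folds[i]
        train = [j for g in folds if g is not folds[i] for j in g]
        splits.append((train, val))
    return splits
```

Reporting the mean Dice over the five validation folds then gives the cross-validated score quoted in the table.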
Hardware Specification | Yes | All experiments were conducted on a single RTX 4090.
Software Dependencies | No | The paper mentions the MONAI framework for UNETR but does not provide specific version numbers for software dependencies.
Experiment Setup | Yes | Except for the learning rate, all experimental settings were consistent with those of the backbone networks used for comparison. Because the number of parameters is significantly reduced, the learning rate was set to 2 or 3 times the backbone network's setting. The detailed hyperparameter settings are provided in Appendix B.