Neural Representational Consistency Emerges from Probabilistic Neural-Behavioral Representation Alignment
Authors: Yu Zhu, Chunfeng Song, Wanli Ouyang, Shan Yu, Tiejun Huang
ICML 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We first validated PNBA's alignment capability using neural recordings from motor and sensory cortices across different species (Safaie et al., 2023; Turishcheva et al., 2024). Quantitative analyses demonstrate that PNBA achieves superior neural-behavioral representation alignment across multiple cortical areas and subjects (Table 1). Ablation Studies. We conducted systematic analyses to evaluate the architectural design of PNBA. Our systematic analysis reveals a hierarchical preservation of neural representations in motor cortex (Fig. 5). Zero-shot validation in mouse V1 calcium imaging data revealed that neural representation preservation extends beyond motor cortex. Zero-shot V1-guided Movement Decoding |
| Researcher Affiliation | Collaboration | 1Institute of Automation, Chinese Academy of Sciences 2School of Artificial Intelligence, University of Chinese Academy of Sciences 3Shanghai Artificial Intelligence Laboratory 4Beijing Academy of Artificial Intelligence. Correspondence to: Chunfeng Song <EMAIL>, Shan Yu <EMAIL>. |
| Pseudocode | No | The paper includes mathematical equations and derivations, but no explicitly labeled pseudocode blocks or algorithms are present. |
| Open Source Code | Yes | Codes are available at https://github.com/zhuyu-cs/PNBA. |
| Open Datasets | Yes | The primary dataset consists of neural recordings from non-human primates during center-out reaching tasks (Safaie et al., 2023). To examine the framework's applicability beyond motor cortices, we conducted analyses on calcium imaging data from mouse V1 (Turishcheva et al., 2024). The V1 dataset contains recordings from 10 mice presented with identical dynamic visual stimuli within paired experiments (5 pairs total), provided by the Sensorium 2023 Competition. |
| Dataset Splits | Yes | For M1 evaluation, we utilized neural data from 12 training sessions per subject for two subjects (C, M), two validation sessions per subject, and three test sessions each from two held-out subjects (J, T). We used 8 mice (4 pairs) for training and validation, with 2 mice (1 pair) for held-out testing to assess cross-subject generalization. As shown in Table 4, we employed an 8/2 train-validation split, with a key experimental design feature: each mouse was paired with another subject that viewed identical video sequences. |
| Hardware Specification | Yes | GPU Configuration 1 NVIDIA A100 (40GB) for Monkey Motor Cortex (M1 and PMd) and 4 NVIDIA A100 (40GB) for Mouse Visual Cortex (V1) (Table 5). |
| Software Dependencies | No | The paper mentions optimizers like "AdamW" and model architectures like "UNet" but does not specify version numbers for programming languages, libraries, or frameworks (e.g., Python, PyTorch, TensorFlow, CUDA). |
| Experiment Setup | Yes | Optimization settings (Table 5): Optimizer AdamW (β1 = 0.9, β2 = 0.999); Weight Decay 0.05; Batch Configuration 32 samples, 16 time-bins per batch; Initial Learning Rate 1×10⁻⁴; Warm-up Period first 600 iterations; Peak LR Duration 50 epochs (M1/PMd) or 200 epochs (V1); Decay Strategy cosine annealing to 1×10⁻⁷ over the last 25 epochs (M1/PMd) or last 200 epochs (V1); Total Epochs 100 (M1/PMd) or 400 (V1). Network architecture specifications (Table 6) also provide details such as Base Channels (ch) 32/64, Resolution Levels 4, Residual Blocks per Level 2, Attention Resolutions [2, 16], Dropout Rate 0. |
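The learning-rate schedule reported in Table 5 (linear warm-up over 600 iterations to a peak of 1×10⁻⁴, then cosine annealing down to 1×10⁻⁷ over the final epochs) can be sketched in plain Python. This is a hypothetical reconstruction for illustration: the helper name `lr_at` and the hold-at-peak phase between warm-up and decay are assumptions, since the paper's table only lists endpoint values, not the exact schedule code.

```python
import math

def lr_at(step, *, total_steps, warmup_steps=600,
          peak_lr=1e-4, min_lr=1e-7, decay_steps=None):
    """Hypothetical sketch of the schedule in Table 5:
    linear warm-up -> hold at peak -> cosine anneal to min_lr
    over the final decay_steps iterations."""
    if decay_steps is None:
        # Assumption: decay occupies the tail of training
        # (last 25 of 100 epochs for M1/PMd in the paper).
        decay_steps = total_steps // 4
    if step < warmup_steps:
        # Linear warm-up from ~0 to peak_lr.
        return peak_lr * (step + 1) / warmup_steps
    decay_start = total_steps - decay_steps
    if step < decay_start:
        # Hold at peak learning rate.
        return peak_lr
    # Cosine annealing from peak_lr down to min_lr.
    progress = (step - decay_start) / decay_steps
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))
```

In a PyTorch setup, the same shape could instead be composed from `torch.optim.lr_scheduler` primitives (e.g., `LinearLR` followed by `CosineAnnealingLR` via `SequentialLR`); the standalone function above just makes the three phases explicit.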