Identifying Neural Dynamics Using Interventional State Space Models
Authors: Amin Nejatbakhsh, Yixin Wang
ICML 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In simulations of the motor cortex, we show that iSSM can recover the true latents and the underlying dynamics. In addition, we illustrate two applications of iSSM in biological datasets. First, we applied iSSM to a dataset of calcium recordings from ALM neurons in mice during photostimulation. Second, we applied iSSM to a dataset of electrophysiological recordings from macaque dlPFC during micro-stimulation. In both cases, we show that iSSM outperforms SSM and results in identifiable parameters. |
| Researcher Affiliation | Collaboration | 1Center for Computational Neuroscience, Flatiron Institute, New York, US 2University of Michigan, Michigan, US. Correspondence to: Amin Nejatbakhsh <EMAIL>. |
| Pseudocode | No | The paper includes mathematical equations for the models and inference, but it does not contain any clearly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code is available at https://github.com/amin-nejat/issm. |
| Open Datasets | Yes | We applied iSSM to a public dataset of targeted photostimulation in the anterior lateral motor cortex (ALM) of mice during a short-term memory task (Daie et al., 2021). The dataset consisted of electrophysiological recordings using electrode arrays implanted on the prefrontal cortex of macaque monkeys during quiet wakefulness (resting) while the animals were sitting awake in the dark. The electrode array included 96 electrodes that were also used for delivering micro-circuit electrical stimulations (Nejatbakhsh et al., 2023). |
| Dataset Splits | Yes | For the primate dataset, the training data is the first half of each session and the test data is the second half. For the mouse dataset, the training data is the lick-right trials, while the test data is the lick-left trials. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models or memory used for running the experiments. |
| Software Dependencies | No | The paper describes the inference methodology and models used but does not specify any software libraries or frameworks with their version numbers. |
| Experiment Setup | Yes | Table B.2 contains the parameters for initializing our generative process as well as inference and optimization parameters. The emission model is a fully connected (FC) neural network with H = 100 hidden units. For the Poisson observation model, we include an additional softplus transformation to map the emission outputs to positive values. ... The optimization parameters include the learning rate, denoted Optim LR; the number of iterations, denoted Optim (Iter); and the number of hidden units for the LSTM that parameterizes the mean and variance of the variational posterior, denoted LSTM (H). Furthermore, we include the initialization values in Table B.2. The initialization parameters include the noise covariance of the dynamics initial step, denoted Init (x0 noise); the covariance of the LDS, denoted Init (LDS σ); the matrices A, B; and the covariance scaling of the likelihood model, referred to as Init (LL σ). |
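
The emission model quoted above (an FC network with H = 100 hidden units, plus a softplus to keep Poisson rates positive) can be sketched as follows. This is a minimal illustrative reconstruction, not the authors' code: the layer count, `tanh` nonlinearity, weight scales, and all dimensions other than the 100 hidden units are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)).
    return np.logaddexp(0.0, x)

class FCEmission:
    """Hypothetical sketch of the emission model: a fully connected network
    with H = 100 hidden units whose outputs pass through a softplus so they
    can serve as positive Poisson rates. Architecture details beyond the
    hidden-unit count are illustrative guesses."""

    def __init__(self, latent_dim, obs_dim, hidden=100):
        self.W1 = rng.normal(scale=0.1, size=(latent_dim, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(scale=0.1, size=(hidden, obs_dim))
        self.b2 = np.zeros(obs_dim)

    def __call__(self, x):
        h = np.tanh(x @ self.W1 + self.b1)      # hidden layer
        return softplus(h @ self.W2 + self.b2)  # positive Poisson rates

# Usage: map 10 latent states (dim 5) to rates for 20 observed neurons.
emission = FCEmission(latent_dim=5, obs_dim=20)
rates = emission(rng.normal(size=(10, 5)))
```

The softplus is the standard choice for Poisson observation models because it is smooth and strictly positive, so the rate parameter of the likelihood is always well defined.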