Convergence of Mean-Field Langevin Stochastic Descent-Ascent for Distributional Minimax Optimization
Authors: Zhangyi Liu, Feng Liu, Rui Gao, Shuang Li
ICML 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We study convergence properties of the discrete-time Mean-Field Langevin Stochastic Descent-Ascent (MFL-SDA) algorithm for solving distributional minimax optimization. ... To address this gap, we establish a last-iterate convergence rate of O(1/ϵ) for MFL-SDA. |
| Researcher Affiliation | Academia | 1 Department of Mathematical Sciences, Tsinghua University, Beijing, China; 2 School of Economics and Management, University of the Chinese Academy of Sciences, Beijing, China; 3 Department of Information, Risk, and Operations Management, University of Texas at Austin, Austin, USA; 4 School of Data Science, The Chinese University of Hong Kong (Shenzhen), Shenzhen, China. |
| Pseudocode | Yes | Algorithm 1 Mean field Langevin Stochastic Descent Ascent (MFL-SDA) |
| Open Source Code | No | The paper does not provide any explicit statements about open-source code availability, specific repository links, or mentions of code in supplementary materials for the described methodology. |
| Open Datasets | No | The paper discusses applications like zero-sum games, generative adversarial networks (GANs), and mean-field neural networks as examples for its theoretical framework, but it does not present any empirical studies using specific datasets or provide access information for any open datasets. |
| Dataset Splits | No | The paper is theoretical and describes no experiments involving datasets, so no dataset splits are mentioned. |
| Hardware Specification | No | The paper is theoretical and focuses on convergence analysis. It does not describe any computational experiments or their hardware requirements. |
| Software Dependencies | No | The paper focuses on theoretical analysis and does not mention any specific software, libraries, or programming languages with version numbers required for reproducibility. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details, hyperparameters, or training configurations. |
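The paper provides only pseudocode (Algorithm 1) and no implementation. As an illustration of the generic particle scheme behind mean-field Langevin descent-ascent, the sketch below runs coupled noisy gradient steps on a toy strongly-convex-concave objective. The objective, step size, regularization strength, and particle counts here are assumptions chosen for illustration, not the paper's actual setup or convergence-rate setting.

```python
import math
import random

def mfl_sda(n_particles=1000, steps=1000, eta=0.05, lam=0.05, seed=0):
    """Toy particle discretization of mean-field Langevin descent-ascent.

    Illustrative objective (NOT from the paper):
        F(mu, nu) = E_mu[x^2]/2 - E_nu[y^2]/2 + E_mu[x] * E_nu[y],
    with entropic regularization of strength `lam` on both players,
    implemented as Gaussian Langevin noise on each particle update.
    """
    rng = random.Random(seed)
    xs = [1.0] * n_particles   # min-player particles (empirical measure mu)
    ys = [1.0] * n_particles   # max-player particles (empirical measure nu)
    noise = math.sqrt(2.0 * eta * lam)
    for _ in range(steps):
        # Each particle interacts with the opponent only through the
        # opponent's empirical mean -- the mean-field interaction.
        mx = sum(xs) / n_particles
        my = sum(ys) / n_particles
        # Simultaneous descent step for mu and ascent step for nu,
        # each perturbed by Langevin noise.
        xs = [x - eta * (x + my) + noise * rng.gauss(0.0, 1.0) for x in xs]
        ys = [y + eta * (-y + mx) + noise * rng.gauss(0.0, 1.0) for y in ys]
    return xs, ys
```

On this toy problem the unique mixed equilibrium centers both measures at zero, so the particle means drift toward 0 while the Langevin noise keeps each empirical measure spread out with variance on the order of `lam`.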