Improved Approximation Algorithms for $k$-Submodular Maximization via Multilinear Extension

Authors: Huanjian Zhou, Lingxiao Huang, Baoxiang Wang

ICLR 2025

Reproducibility Checklist (Variable: Result. LLM Response)
Research Type: Theoretical. We investigate a generalized form of submodular maximization, referred to as k-submodular maximization, with applications in social networks and machine learning. In this work, we propose the multilinear extension of k-submodular functions and unified Frank-Wolfe-type frameworks built upon it. This continuous framework accommodates 1) monotone or non-monotone functions, and 2) various constraint types, including matroid constraints, knapsack constraints, and their combinations. Notably, we attain an asymptotically optimal 1/2-approximation for monotone k-submodular maximization under knapsack constraints, surpassing the previous 1/3-approximation (Ha et al., 2024), and a 1/3-approximation for non-monotone k-submodular maximization under knapsack and matroid constraints, outperforming the previous 0.245-approximation (Yu et al., 2023). The foundation of our analysis stems from new insights into specific linear and monotone properties of the multilinear extension.
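The multilinear extension mentioned in the abstract can be read, informally, as the expected value of f under independent randomized rounding of a fractional point: each element receives type i with probability given by its fractional coordinate, or stays unassigned with the leftover mass. A minimal Monte-Carlo sketch of that reading, assuming a k-submodular function supplied as a Python callable over partial assignments (the function name, the dict-based encoding of x, and the sample count are illustrative choices, not the paper's exact construction):

```python
import random

def multilinear_extension(f, x, samples=2000, seed=0):
    """Monte-Carlo estimate of the multilinear extension F(x).

    x maps each element e to a length-k vector of marginal probabilities
    (summing to at most 1): element e independently receives type i with
    probability x[e][i], and stays unassigned with the leftover mass.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        assignment = {}
        for e, probs in x.items():
            r, acc = rng.random(), 0.0
            for i, p in enumerate(probs):
                acc += p
                if r < acc:
                    assignment[e] = i  # e takes type i in this sample
                    break
        total += f(assignment)
    return total / samples

# Sanity check with f = "number of assigned elements" (k = 2):
# E[f] = (0.5 + 0.5) + (0.3 + 0.2) = 1.5.
est = multilinear_extension(len, {"a": [0.5, 0.5], "b": [0.3, 0.2]})
```

For a modular f like the sanity check, the estimate concentrates around the exact expectation; for general k-submodular f only sampled values of F (and of its gradient) are available, which is why Frank-Wolfe-type methods in this setting work with gradient estimates.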
Researcher Affiliation: Academia. Huanjian Zhou (1, 2), Lingxiao Huang (3), Baoxiang Wang (4). 1: The University of Tokyo; 2: RIKEN AIP; 3: State Key Laboratory of Novel Software Technology, New Cornerstone Science Laboratory, Nanjing University; 4: The Chinese University of Hong Kong, Shenzhen.
Pseudocode: Yes. Algorithm 1: Frank-Wolfe algorithm for the monotone case, FW(f, P, ε, η) ... Algorithm 3: Frank-Wolfe algorithm for the non-monotone case, NFW(f, P, ε, η).
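The FW(f, P, ε, η) template named above follows the classical Frank-Wolfe pattern: compute (or estimate) a gradient of the continuous extension, call a linear maximization oracle over the polytope P, and take a small convex-combination step. A generic sketch of that loop on a toy smooth objective, assuming the unit box as P (the quadratic objective, the box polytope, and the step schedule are illustrative assumptions; the paper's algorithms instead use sampled gradients of the multilinear extension):

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, eta=0.05, steps=200):
    """Generic Frank-Wolfe loop mirroring the FW(f, P, eps, eta) template."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)            # gradient (or a sampled estimate) at x
        v = lmo(g)             # linear maximization oracle: argmax over P of <g, v>
        x = x + eta * (v - x)  # convex-combination step keeps x inside P
    return x

# Toy instance: maximize f(x) = -||x - c||^2 over the unit box [0, 1]^3.
c = np.array([0.7, 0.2, 0.9])
grad = lambda x: -2.0 * (x - c)
lmo = lambda g: (g > 0).astype(float)  # box vertex maximizing <g, v>
x_star = frank_wolfe(grad, lmo, np.zeros(3))  # iterates settle near c
```

The only problem-specific ingredients are the gradient oracle and the linear maximization oracle over P, which is what lets one framework cover matroid constraints, knapsack constraints, and their combinations.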
Open Source Code: No. The paper does not contain any explicit statements or links indicating that source code for the methodology is provided.
Open Datasets: No. The paper discusses applications of k-submodular maximization, such as identifying influential individuals in social networks, partitioning features for regression, and selecting sensors, but it does not refer to any specific datasets used for empirical evaluation or made publicly available.
Dataset Splits: No. This is a theoretical paper focused on approximation algorithms and their analysis. It does not involve empirical experiments with datasets, so no dataset splits are provided.
Hardware Specification: No. This is a theoretical paper presenting algorithms and approximation guarantees. It does not describe an experimental setup or mention specific hardware used for computations.
Software Dependencies: No. This is a theoretical paper. It introduces algorithms and analyzes their approximation ratios but does not mention specific software, libraries, or versions needed to implement or run them.
Experiment Setup: No. This is a theoretical paper focused on algorithm design and analysis. It does not describe any experimental setup, including hyperparameters, training configurations, or system-level settings.