Online and Streaming Algorithms for Constrained k-Submodular Maximization
Authors: Fabian Christian Spaeh, Alina Ene, Huy Nguyen
AAAI 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We experimentally evaluate our algorithms on instances for ad allocation and other applications, where we observe that our algorithms are practical and scalable, and construct solutions that are comparable in value even to offline greedy algorithms. |
| Researcher Affiliation | Academia | (1) Department of Computer Science, Boston University; (2) Khoury College of Computer Sciences, Northeastern University |
| Pseudocode | Yes | Algorithm 1: Monotone k-submodular maximization. Parameters: coefficients {g_a(i)} for a ∈ [k], i ∈ [n_a]. Input: a monotone k-submodular function f and the budgets {n_a} for a ∈ [k]. Initialize S = (S_1, …, S_k) ← (∅, …, ∅) and β_a ← 0 for all a ∈ [k]. For t = 1, 2, …, \|V\|: let w_{t,a} = Δ_{t,a} f(S) for all a ∈ [k]; let a* = argmax_{a ∈ [k]} {w_{t,a} − β_a}; if w_{t,a*} − β_{a*} ≥ 0: if \|S_{a*}\| < n_{a*}, set S_{a*} ← S_{a*} ∪ {t}; else let t′ = argmin_{i ∈ S_{a*}} w_{i,a*} and set S_{a*} ← (S_{a*} \ {t′}) ∪ {t}; let w_{a*}(i) be the i-th largest weight in {w_{t,a*} : t ∈ S_{a*}} and w_{a*}(i) = 0 for i > \|S_{a*}\|; set β_{a*} ← Σ_{i=1}^{n_{a*}} w_{a*}(i)·g_{a*}(i). Return S. |
| Open Source Code | No | The paper does not provide an explicit statement about open-sourcing code or a link to a code repository. |
| Open Datasets | Yes | We use data from the iPinYou ad exchange (Zhang, Yuan, and Wang 2014) and a Yahoo dataset (Yahoo 2011)... Our results on the Email network (Leskovec and Krevl 2014) are in Figure 2. |
| Dataset Splits | No | The paper discusses varying uniform budgets and reporting mean and standard deviation over days or runs, but does not provide specific information on training/test/validation dataset splits, such as percentages, sample counts, or detailed splitting methodology. |
| Hardware Specification | Yes | Our implementation is in Python, and all experiments were run on a machine with an Intel Core i7-2600 CPU, 3.40 GHz, 16GB RAM. |
| Software Dependencies | No | The paper states "Our implementation is in Python," but does not provide specific version numbers for Python or any libraries, solvers, or specialized packages used. |
| Experiment Setup | Yes | Parameters: coefficients {g_a(i)} for a ∈ [k], i ∈ [n_a]. Input: a monotone k-submodular function f and the budgets {n_a} for a ∈ [k]... We use two parameter choices for the online algorithms: First, we set {d_a} and {c_a} for a ∈ [k] to the optimal theoretical choice as the minimizer of Q_a in Lemma 5. Second, we modify these parameters by reducing each c_a to 1 of the previous choice to make the algorithms less conservative. |
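The streaming procedure in the Pseudocode cell can be sketched in Python. This is a minimal illustration, not the authors' implementation: the marginal-gain oracle `marginal(t, a, S)` and the coefficient table `g` are hypothetical interfaces standing in for Δ_{t,a} f(S) and the g_a(i) parameters from the paper.

```python
from typing import Callable, Dict, List, Set

def stream_k_submodular(
    items: List[int],
    k: int,
    budgets: List[int],
    marginal: Callable[[int, int, List[Set[int]]], float],
    g: List[List[float]],
) -> List[Set[int]]:
    """Sketch of the streaming monotone k-submodular maximization algorithm.

    marginal(t, a, S) returns the marginal gain of placing item t into part a
    given the current solution S (oracle access; hypothetical interface).
    g[a][i] are the coefficients g_a(i) used to update the threshold beta_a.
    """
    S: List[Set[int]] = [set() for _ in range(k)]
    beta = [0.0] * k
    weight: Dict[int, float] = {}  # weight w_{t,a} recorded when t enters part a

    for t in items:
        # Marginal gain of t with respect to each of the k parts.
        w = [marginal(t, a, S) for a in range(k)]
        # Part maximizing the threshold-adjusted gain.
        a = max(range(k), key=lambda b: w[b] - beta[b])
        if w[a] - beta[a] >= 0:
            if len(S[a]) < budgets[a]:
                S[a].add(t)
            else:
                # Budget full: evict the element with the smallest recorded weight.
                t_min = min(S[a], key=lambda i: weight[i])
                S[a].remove(t_min)
                S[a].add(t)
            weight[t] = w[a]
            # Recompute beta_a from the sorted weights of part a,
            # padding with zeros beyond |S_a|.
            sorted_w = sorted((weight[i] for i in S[a]), reverse=True)
            sorted_w += [0.0] * (budgets[a] - len(sorted_w))
            beta[a] = sum(sorted_w[i] * g[a][i] for i in range(budgets[a]))
    return S
```

With a modular objective (each item has a fixed value per part) and g_a(i) = 1, beta_a tracks the weight already held in part a, so a new item is only admitted if it beats the current threshold.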