Algorithm Configuration for Structured Pfaffian Settings
Authors: Maria Florina Balcan, Anh Tuan Nguyen, Dravyansh Sharma
TMLR 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this work, we present new frameworks for providing learning guarantees for parameterized data-driven algorithm design problems in both statistical and online learning settings. For the statistical learning setting, we introduce the Pfaffian GJ framework... For the online learning setting, we provide a new tool for verifying the dispersion property... We use our framework to provide novel learning guarantees for many challenging data-driven design problems of interest... |
| Researcher Affiliation | Academia | Maria Florina Balcan EMAIL School of Computer Science Carnegie Mellon University Anh Tuan Nguyen EMAIL Machine Learning Department Carnegie Mellon University Dravyansh Sharma EMAIL Toyota Technological Institute at Chicago |
| Pseudocode | Yes | Algorithm 1: Approximate incremental quadratic algorithm for RLR with ℓ1 penalty (Rosset, 2004) ... Algorithm 2: Approximate incremental quadratic algorithm for RLR with ℓ2 penalty (Rosset, 2004) |
| Open Source Code | No | The paper provides no access to source code for the methodology described: there are no repository links, no explicit statements about code release, and no mentions of code in supplementary materials. |
| Open Datasets | No | The paper refers to 'problem instances' and an 'underlying problem distribution' in its theoretical analysis, but it names no publicly available datasets and provides no access information (links, DOIs, or citations). As a theoretical work, it describes no empirical evaluation on specific datasets. |
| Dataset Splits | No | The paper is theoretical, focusing on frameworks and guarantees for data-driven algorithm design. It describes no empirical experiments on specific datasets and therefore reports no training/validation/test splits. |
| Hardware Specification | No | The paper is theoretical, proposing frameworks and proving learning guarantees. It describes no experiments that would require specific hardware, and no hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical, presenting mathematical frameworks and proofs for learning guarantees. It describes no software implementations and lists no software dependencies or version numbers. |
| Experiment Setup | No | The paper is theoretical, focusing on frameworks and learning guarantees. It describes no experimental setup, such as hyperparameters, training configurations, or system-level settings for empirical evaluation. |