Derandomizing Multi-Distribution Learning
Authors: Kasper Green Larsen, Omar Montasser, Nikita Zhivotovskiy
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Researcher Affiliation | Academia | Kasper Green Larsen, Department of Computer Science, Aarhus University, EMAIL; Omar Montasser, Department of Statistics and Data Science, Yale University, EMAIL; Nikita Zhivotovskiy, Department of Statistics, University of California, Berkeley, EMAIL |
| Pseudocode | Yes | Algorithm 1: DETERMINISTICLEARNER(P, ε, δ, A) |
| Open Source Code | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Open Datasets | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Dataset Splits | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Hardware Specification | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Software Dependencies | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Experiment Setup | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
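The pseudocode row names Algorithm 1, DETERMINISTICLEARNER(P, ε, δ, A), a deterministic wrapper around a (possibly randomized) base learner A over a set of distributions P. The paper's actual procedure is not reproduced here; the sketch below is only a hypothetical illustration of one generic derandomization pattern matching that signature: enumerate fixed seeds for A and keep the candidate hypothesis with the smallest worst-case empirical error across all distributions. The helper names (`deterministic_learner`, `threshold_learner`) and the seed-enumeration strategy are assumptions for illustration, not the paper's method.

```python
def deterministic_learner(samples, A, n_seeds=20):
    """Hypothetical sketch (not the paper's Algorithm 1).

    samples: list of k datasets, one per distribution, each a list of (x, y).
    A: a randomized base learner, called as A(samples, seed) -> hypothesis h(x).
    Enumerating fixed seeds makes the whole procedure deterministic; we keep
    the candidate with the smallest worst-case empirical error over all k sets.
    """
    candidates = [A(samples, seed=s) for s in range(n_seeds)]

    def worst_error(h):
        # Max empirical 0/1 error over the k distributions (multi-distribution
        # objective: do well on the hardest distribution).
        return max(sum(h(x) != y for x, y in S) / len(S) for S in samples)

    return min(candidates, key=worst_error)


# Toy usage: a "randomized" threshold learner on 1-D data, two distributions.
import random

def threshold_learner(samples, seed):
    rng = random.Random(seed)          # randomness fully determined by the seed
    t = rng.uniform(0.0, 1.0)          # ignores the data; picks a random threshold
    return lambda x, t=t: int(x >= t)

S1 = [(x / 100, int(x / 100 >= 0.5)) for x in range(100)]
S2 = [(x / 100, int(x / 100 >= 0.5)) for x in range(30, 100)]
h = deterministic_learner([S1, S2], threshold_learner, n_seeds=20)
```

Because the seeds are fixed, rerunning the wrapper on the same inputs always returns the same hypothesis, which is the sense in which this pattern trades the base learner's randomness for a deterministic enumeration.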