Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Refined approachability algorithms and application to regret minimization with global costs

Authors: Joon Kwon

JMLR 2021 | Venue PDF | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We construct and analyze a class of Follow the Regularized Leader algorithms (FTRL) for Blackwell's approachability which are able to minimize not only the Euclidean distance to the target set (as it is often the case in the context of Blackwell's approachability) but a wide range of distance-like quantities. This flexibility enables us to apply these algorithms to closely minimize the quantity of interest in various online learning problems. In particular, for regret minimization with ℓp global costs, we obtain the first bounds with explicit dependence in p and the dimension d.
Researcher Affiliation | Academia | Joon Kwon, MIA Paris, INRAE & AgroParisTech, 16 rue Claude Bernard, 75231 Paris, France
Pseudocode | No | The associated algorithm is then defined for t ≥ 1 as: compute x_t = argmax_{x ∈ X} [ η_{t−1} ⟨Σ_{s=1}^{t−1} r_s, x⟩ − h(x) ]; compute a_t = a*(x_t); observe r_t = r(a_t, b_t), where the first line is well-defined thanks to the basic properties of regularizers gathered in Proposition 20.
Open Source Code | No | No explicit statement about source code availability was found in the paper.
Open Datasets | No | The paper discusses theoretical models and algorithms (e.g., the problem of regret minimization with global costs, online combinatorial optimization). It does not mention any specific publicly available datasets used for empirical evaluation. For example, in Section 4.1, it describes 'the Decision Maker chooses distribution a_t ∈ Δ_d; the Environment chooses loss vector ℓ_t ∈ [0, 1]^d' as part of the theoretical problem setup, not as a dataset being used.
Dataset Splits | No | The paper is theoretical and does not involve empirical experiments with datasets; therefore there is no mention of dataset splits.
Hardware Specification | No | The paper is theoretical and focuses on algorithm design and analysis, without conducting empirical experiments. Therefore, no hardware specifications are mentioned.
Software Dependencies | No | The paper is theoretical and focuses on algorithm design and analysis, without conducting empirical experiments that would require specific software dependencies or versions.
Experiment Setup | No | The paper is theoretical and focuses on algorithm design and analysis. It does not describe any empirical experiments, and therefore no experimental setup details like hyperparameters or training configurations are provided.
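The FTRL step quoted in the Pseudocode row, x_t = argmax_{x ∈ X} [ η_{t−1} ⟨Σ_{s<t} r_s, x⟩ − h(x) ], has a closed form for one particular choice of regularizer. The sketch below is a minimal illustration, not the paper's general algorithm: it assumes X is the probability simplex and h is the entropic regularizer h(x) = Σ_i x_i log x_i, in which case the argmax is a softmax of the scaled cumulative payoff. The names `ftrl_step`, `cumulative_payoff`, and `eta` are hypothetical.

```python
import numpy as np

def ftrl_step(cumulative_payoff, eta):
    """One FTRL step on the simplex with the entropic regularizer.

    With h(x) = sum_i x_i log x_i, the maximizer of
    eta * <R, x> - h(x) over the simplex is softmax(eta * R).
    """
    z = eta * np.asarray(cumulative_payoff, dtype=float)
    z -= z.max()          # shift for numerical stability; softmax is shift-invariant
    w = np.exp(z)
    return w / w.sum()    # normalize to a probability vector
```

At round t one would pass R = Σ_{s=1}^{t−1} r_s as `cumulative_payoff` and η_{t−1} as `eta`; other regularizers h generally require solving the argmax numerically rather than in closed form.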