Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
A Closer Look at Adaptive Regret
Authors: Dmitry Adamskiy, Wouter M. Koolen, Alexey Chernov, Vladimir Vovk
JMLR 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We compute the exact worst-case adaptive regret of Fixed Share. We re-derive the tracking regret bounds from these adaptive regret bounds, showing that the latter are in fact more fundamental. We prove that Fixed Share is optimal for adaptive regret: the worst-case adaptive regret of any algorithm is at least that of an instance of Fixed Share. |
| Researcher Affiliation | Academia | Dmitry Adamskiy EMAIL, Computer Learning Research Centre and Department of Computer Science, Royal Holloway, University of London, Egham, Surrey, TW20 0EX, UK; Wouter M. Koolen EMAIL, Centrum Wiskunde & Informatica, Science Park 123, 1098XG Amsterdam, The Netherlands; Alexey Chernov EMAIL, School of Computing, Engineering and Mathematics, University of Brighton, Moulsecoomb, Brighton, BN2 4GJ, UK; Vladimir Vovk EMAIL, Computer Learning Research Centre and Department of Computer Science, Royal Holloway, University of London, Egham, Surrey, TW20 0EX, UK |
| Pseudocode | Yes | Algorithm 1 (Adaptive Aggregating Algorithm). Input: prior nonnegative weights p(t), t = 1, 2, …, with p(1) > 0. Initialize v^n_1 := p(1) for n = 1, …, N. For t = 1, 2, …: play weights u^n_t := v^n_t / Σ_{j=1}^N v^j_t; read the experts' losses ℓ^n_t, n = 1, …, N; set v^n_{t+1} := p(t + 1) + v^n_t e^{−ℓ^n_t} / Σ_{j=1}^N u^j_t e^{−ℓ^j_t}, n = 1, …, N. |
| Open Source Code | No | The paper does not provide any statement about releasing code, nor does it include links to a code repository. |
| Open Datasets | No | The paper is theoretical and does not describe experiments using specific datasets. Therefore, there is no mention of publicly available datasets or access information. |
| Dataset Splits | No | As the paper describes theoretical work and does not perform experiments on specific datasets, there is no discussion of dataset splits. |
| Hardware Specification | No | The paper describes theoretical research and does not detail any experimental setup requiring specific hardware. |
| Software Dependencies | No | The paper is theoretical and focuses on algorithm design and analysis, without detailing implementation or specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical, presenting algorithm analysis and bounds rather than experimental results with specific setup details or hyperparameters. |
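The pseudocode row above quotes the paper's Adaptive Aggregating Algorithm, whose per-round update can be sketched as follows. This is an illustrative reading of Algorithm 1, not the authors' code; the function name and variable names (`v`, `losses`, `p_next`) are my own, and losses are assumed already scaled so that the mixable loss update `e^{-ℓ}` applies directly.

```python
import math

def adaptive_aggregating_step(v, losses, p_next):
    """One round of the Adaptive Aggregating Algorithm (sketch of Algorithm 1).

    v      -- current unnormalized weights v^n_t, one per expert
    losses -- this round's expert losses l^n_t
    p_next -- prior weight p(t+1) injected into every expert's weight
    Returns (u, v_next): the weights played this round and the updated weights.
    """
    total = sum(v)
    # Play normalized weights u^n_t = v^n_t / sum_j v^j_t
    u = [vn / total for vn in v]
    # Mixture loss factor sum_j u^j_t * e^{-l^j_t}
    mix = sum(uj * math.exp(-lj) for uj, lj in zip(u, losses))
    # Update: v^n_{t+1} = p(t+1) + v^n_t * e^{-l^n_t} / mix
    v_next = [p_next + vn * math.exp(-ln) / mix for vn, ln in zip(v, losses)]
    return u, v_next
```

To run the algorithm, initialize `v = [p1] * N` with `p1 = p(1) > 0` and call the step once per round; the additive `p_next` term is what lets a recently well-performing expert recover weight quickly, which drives the adaptive regret guarantee discussed in the paper.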