Online Quantile Regression
Authors: Yinan Shen, Dong Xia, Wen-Xin Zhou
JMLR 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive simulation studies corroborate our theoretical findings. (Section 3, Numerical Experiments:) In this section, we present experiments to empirically validate our theoretical findings. We first generate synthetic data for simulation studies, followed by an application of our proposed methods to a real-data example. |
| Researcher Affiliation | Academia | Yinan Shen EMAIL Department of Mathematics University of Southern California Los Angeles, CA 90089, USA Dong Xia EMAIL Department of Mathematics Hong Kong University of Science and Technology Hong Kong SAR, China Wen-Xin Zhou EMAIL Department of Information and Decision Sciences University of Illinois Chicago Chicago, IL 60607, USA |
| Pseudocode | No | More specifically, the algorithm proceeds as follows. We begin with an arbitrary initialization β0. At each time step t, given the current estimate βt and a newly arrived observation (Xt, Yt), the update is performed as βt+1 = βt − ηt (−τ · I{Yt > ⟨Xt, βt⟩} + (1 − τ) · I{Yt < ⟨Xt, βt⟩} + δ · I{Yt = ⟨Xt, βt⟩}) Xt, where ηt > 0 denotes the stepsize. This text describes the algorithm's update rule but is not presented in a structured pseudocode or algorithm block. |
| Open Source Code | No | No explicit statement or link providing access to the source code for the methodology described in this paper is found. |
| Open Datasets | Yes | In this section, we analyze a real-world dataset on news popularity provided by Moniz and Torgo (2018). Nuno Moniz and Luis Torgo. Multi-source social feedback of online news feeds. arXiv preprint arXiv:1801.07055, 2018. |
| Dataset Splits | Yes | We randomly split the dataset into a training set with n1 = 20000 observations and a test set with n2 = 9928 observations. |
| Hardware Specification | Yes | For each simulation, the online batch learning algorithm and one-sample learning algorithm take approximately 2 and 10 seconds, respectively, on a MacBook Pro (2020). On the same MacBook Pro, the online one-sample, batch, and offline learning methods take approximately 30, 5, and 520 seconds, respectively. |
| Software Dependencies | No | The paper mentions using the 'quantreg package' but does not specify its version number. No other specific software dependencies with version numbers are provided. |
| Experiment Setup | Yes | For simplicity, the initialization β0 is set to be 0 throughout our numerical studies. In all the experiments, the stepsize for the first phase is scheduled as ηt = (1 − 0.5/d)^t η0, where η0 is the initial stepsize. The choice of stepsize in the second phase involves specific parameters, where we set Ca = 20 and Cb = 30. We fix the dimension at d = 100, the unknown horizon T = 10^5, and sample t_ν-distributed noise with ν = 1.1 degrees of freedom. The performance of the proposed stepsize scheme is compared with those of two existing alternative stepsize schemes: the ηt = O(1/t) decaying scheme and a constant stepsize scheme ηt ≡ const. |
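The update rule and first-phase stepsize schedule quoted in the table can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the subgradient form of the check loss, the choice δ = 0 for the tie case Yt = ⟨Xt, βt⟩, and the function name are all assumptions made here for illustration.

```python
import numpy as np

def online_quantile_sgd(data_stream, d, tau=0.5, eta0=0.1):
    """Hypothetical sketch of the online subgradient update for quantile
    regression described in the paper:

        beta_{t+1} = beta_t - eta_t * ( -tau * 1{Y_t > <X_t, beta_t>}
                                        + (1 - tau) * 1{Y_t < <X_t, beta_t>}
                                        + delta * 1{Y_t = <X_t, beta_t>} ) * X_t

    with the first-phase geometric stepsize eta_t = (1 - 0.5/d)^t * eta0.
    The tie term delta is taken as 0 here (an assumption).
    """
    beta = np.zeros(d)  # beta_0 = 0, as in the paper's numerical studies
    for t, (x, y) in enumerate(data_stream):
        eta_t = (1 - 0.5 / d) ** t * eta0
        resid = y - x @ beta
        # Subgradient of the check loss rho_tau(y - <x, beta>) w.r.t. beta
        g = (-tau * (resid > 0) + (1 - tau) * (resid < 0)) * x
        beta = beta - eta_t * g
    return beta
```

One pass over the stream costs O(d) per observation, which is what makes the method online: no past data is stored, matching the one-sample update quoted above.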