Online Differentially Private Conformal Prediction for Uncertainty Quantification

Authors: Qiangqiang Zhang, Ting Li, Xinwei Feng, Xiaodong Yan, Jinhan Xie

ICML 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We validate the effectiveness and applicability of the proposed method through comprehensive simulations and real-world studies on the ELEC2 and PAMAP2 datasets.
Researcher Affiliation | Academia | 1Zhongtai Securities Institute for Financial Studies, Shandong University, Jinan, China 2School of Statistics and Data Science, Shanghai University of Finance and Economics, Shanghai, China 3School of Mathematics and Statistics, Xi'an Jiaotong University, Xi'an, China 4Yunnan Key Laboratory of Statistical Modeling and Data Analysis, Yunnan University, Kunming, China. Correspondence to: Jinhan Xie <EMAIL>.
Pseudocode | Yes | We present the pseudocode of our proposed method in Algorithms 1 and 2. Algorithm 1 computes the non-conformity scores, while Algorithm 2 dynamically updates the quantile estimates in a privacy-preserving manner.
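The role the report ascribes to Algorithm 1 — computing non-conformity scores for an online stream — can be illustrated with a minimal sketch. The absolute-residual score |y − ŷ| used below is a common default choice and an assumption here; the paper's exact score function is not reproduced.

```python
def nonconformity_scores(y_true, y_pred):
    """Absolute-residual non-conformity score |y - y_hat| per observation.

    A standard choice for regression-style conformal prediction; the
    paper's own score may differ.
    """
    return [abs(y - p) for y, p in zip(y_true, y_pred)]

# Toy stream: observed values vs. point predictions
scores = nonconformity_scores([1.0, 2.0, 3.5], [1.2, 1.8, 3.0])
```

In an online setting these scores would be computed one at a time as each observation arrives and then fed to the quantile-update step.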
Open Source Code | No | The paper does not contain an unambiguous statement of code release or a link to a code repository for the methodology described in this paper.
Open Datasets | Yes | We validate the effectiveness and applicability of the proposed method through comprehensive simulations and real-world studies on the ELEC2 and PAMAP2 datasets. ... Reiss, A. PAMAP2 Physical Activity Monitoring. UCI Machine Learning Repository, 2012. DOI: https://doi.org/10.24432/C5NW2H. ... Harries, M. Splice-2 comparative evaluation: Electricity pricing. Technical report, University of New South Wales, School of Computer Science and Engineering, Sydney, 1999. Accessed: 09 January 2025.
Dataset Splits | No | The paper describes online processing of data streams and mentions excluding early data points for stability (e.g., 'To minimize the impact of early-stage noise and the initial instability of the algorithm, the first 100 data points are excluded from the analysis.'), but does not provide traditional training/test/validation dataset splits.
Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models, processor types, or memory amounts used for running experiments.
Software Dependencies | No | The paper mentions using an 'XGBoost model' and a 'third-order autoregressive (AR(3)) model' but does not provide specific software names with version numbers for reproducibility.
Experiment Setup | Yes | To ensure stability in the learning process under the influence of noise, we introduce a constant c as a lower bound for the parameter W_t. ... Based on these findings, we recommend choosing c within the range of 30 to 50... In our implementation, we instantiate ALFR using a third-order autoregressive (AR(3)) model... A rolling window of size 200 is employed to compute dynamic coverage rates and prediction interval widths at each time step. ... Algorithm 2 Input: Privacy budget parameter ε_{t−1} > 0; Miscoverage level α ∈ (0, 1); S_{t−1}; c > 0 // Initialize parameters (only used on the first call, i.e., when t = 2) Initialize: W_0 = 1, λ_1 = 0, q̂^{1−α}_1 = 0.
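The quoted setup — a per-step privacy budget ε, a miscoverage level α, and a constant c lower-bounding the weight W_t for stability — can be sketched with a generic privatized online quantile update in the adaptive-conformal-inference style. Everything below (the Laplace mechanism on the miscoverage indicator, the function names, the parameter roles) is an illustrative assumption, not the paper's actual Algorithm 2.

```python
import math
import random

def laplace_noise(scale, rng):
    """Sample Laplace(0, scale) via the inverse-CDF transform."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_quantile_update(q, score, alpha, lr, eps, c, W, rng):
    """One privatized online quantile step (hypothetical sketch).

    An ACI-style gradient update with Laplace noise added to the
    miscoverage indicator; max(W, c) mirrors the quoted role of c as a
    lower bound keeping step sizes stable under noise.
    """
    err = 1.0 if score > q else 0.0                  # miscoverage indicator
    noisy_err = err + laplace_noise(1.0 / eps, rng)  # privatize the indicator
    step = lr / max(W, c)                            # c lower-bounds the weight
    return q + step * (noisy_err - alpha)

# Toy stream of non-conformity scores, c chosen in the recommended 30-50 range
rng = random.Random(0)
q = 0.0
for s in [1.2, 0.8, 1.5, 0.3, 0.9]:
    q = dp_quantile_update(q, s, alpha=0.1, lr=1.0, eps=1.0, c=30.0, W=10.0, rng=rng)
```

At each step the current quantile estimate q defines the prediction interval; the rolling window of size 200 mentioned in the setup would then be used to track empirical coverage and interval width over time.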