Beyond ordinary Lipschitz constraints: Differentially Private optimization with TNC

Authors: Difei Xu, Meng Ding, Zihang Xiang, Jinhui Xu, Di Wang

TMLR 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In this section, we conduct a series of numerical experiments with three different datasets to show the performance of our algorithms. ... Figures 1, 2, and 3 present the results of ℓ4-norm linear regression for our proposed methods (LNC-GM and PNCA-SGD) compared to the baseline DP-SGD algorithm.
Researcher Affiliation | Academia | Difei Xu EMAIL, Department of Statistics, King Abdullah University of Science and Technology; Meng Ding EMAIL, Department of Computer Science, State University of New York at Buffalo; Zihang Xiang EMAIL, Department of Computer Science, King Abdullah University of Science and Technology; Jinhui Xu EMAIL, School of Information Science and Technology, University of Science and Technology of China; Di Wang EMAIL, Department of Computer Science, King Abdullah University of Science and Technology
Pseudocode | Yes | Algorithm 1 Clipped Mean({z_i}_{i=1}^n, n, C) ... Algorithm 2 Clipped Regularized Gradient Method ... Algorithm 3 Localized Noisy Clipped Gradient Method for DP-SCO (LNC-GM)(w_0, η, n, W) ... Algorithm 4 Private Stochastic Approximation(w_1, n, R_0) ... Algorithm 5 Iterated Localized Noisy Clipped Gradient Method ... Algorithm 6 Permuted Noisy Clipped Accelerated SGD for Heavy-Tailed DP-SCO (PNCA-SGD) ... Algorithm 7 Iterated PNCA-SGD(w_0, n, W, θ)
Open Source Code | No | The paper does not provide any statement regarding the availability of source code or a link to a code repository.
Open Datasets | Yes | We will implement the LNC-GM algorithm on three real-world datasets from the libsvm website, namely a8a (n = 22,696, d = 123 for training, and n = 9,865 for testing), a9a (n = 32,561, d = 123 for training, and n = 16,281 for testing), and w7a (n = 24,692, d = 300 for training, and n = 25,057 for testing).
Dataset Splits | Yes | Dataset and Parameter Settings: We will implement the LNC-GM algorithm on three real-world datasets from the libsvm website, namely a8a (n = 22,696, d = 123 for training, and n = 9,865 for testing), a9a (n = 32,561, d = 123 for training, and n = 16,281 for testing), and w7a (n = 24,692, d = 300 for training, and n = 25,057 for testing).
Hardware Specification | No | The paper does not provide specific details about the hardware used for the experiments.
Software Dependencies | No | The paper does not provide specific details about software dependencies, such as libraries or frameworks with version numbers.
Experiment Setup | Yes | Our experimental framework incorporates systematic hyperparameter tuning to optimize results, with findings reported using some selected parameters. ... We study the above-mentioned TNC problem and their corresponding testing errors with various sample sizes and privacy budgets ε. When performing the results for different sample sizes, we will fix ε = 8 and consider different sample sizes n that are at most 3.5 × 10^4. When performing the results for different privacy budgets ε, we will use n = 10^4 samples and choose ε = {0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0} respectively. We will fix δ = 1/n^{1.1} for all experiments.
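The Research Type row quotes ℓ4-norm linear regression as the test problem. The paper's exact objective is not reproduced here; a minimal sketch, assuming the common form f(w; x, y) = (1/4)(⟨w, x⟩ − y)^4, whose gradient grows polynomially rather than being Lipschitz:

```python
import numpy as np

def l4_loss(w, X, y):
    """Average l4 regression loss over a sample (X: n x d, y: n).
    Assumed form f(w; x, y) = (1/4) * (<w, x> - y)^4; the paper's
    exact objective may differ."""
    r = X @ w - y
    return np.mean(r ** 4) / 4.0

def l4_grad(w, X, y):
    """Gradient of the average l4 loss: mean of r_i^3 * x_i."""
    r = X @ w - y
    return (X.T @ (r ** 3)) / len(y)
```

The cubic growth of the residual term in the gradient is what places this objective outside the ordinary Lipschitz setting that standard DP-SGD analyses assume.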
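The Pseudocode row lists Algorithm 1, Clipped Mean({z_i}_{i=1}^n, n, C). A plausible reading, assuming per-vector ℓ2 clipping followed by averaging (the paper may instead use coordinate-wise or shrinkage-style clipping):

```python
import numpy as np

def clipped_mean(z, C):
    """Clip each row z_i of z (shape n x d) to l2 norm at most C,
    then average the clipped vectors.
    An assumed reading of the quoted Algorithm 1, not the paper's
    verbatim procedure."""
    norms = np.linalg.norm(z, axis=1, keepdims=True)
    scale = np.minimum(1.0, C / np.maximum(norms, 1e-12))
    return np.mean(z * scale, axis=0)
```

Clipping bounds each summand's influence by C/n, which is what makes the subsequent Gaussian-noise addition in the noisy-gradient algorithms give a differential-privacy guarantee even for heavy-tailed gradients.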
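The sweep settings quoted in the Experiment Setup row can be written down directly (values are taken from the quote; variable names are illustrative):

```python
# Privacy-budget sweep: n fixed at 10^4, epsilon varied.
eps_grid = [0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0]
n_fixed = 10 ** 4
delta = 1.0 / n_fixed ** 1.1   # delta = n^{-1.1}, fixed in all experiments

# Sample-size sweep: epsilon fixed at 8, n varied up to 3.5 x 10^4.
eps_fixed = 8.0
n_max = int(3.5e4)
```

Note that δ = n^{−1.1} ≈ 3.98 × 10^{−5} at n = 10^4, which keeps δ smaller than 1/n, the usual requirement for (ε, δ)-DP guarantees to be meaningful.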