Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Gradient Hard Thresholding Pursuit
Authors: Xiao-Tong Yuan, Ping Li, Tong Zhang
JMLR 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive numerical results confirm our theoretical predictions and demonstrate the superiority of our method to the state-of-the-art greedy selection methods in sparse linear regression, sparse logistic regression and sparse precision matrix estimation problems. |
| Researcher Affiliation | Collaboration | Xiao-Tong Yuan, B-DAT Lab, Nanjing University of Information Science and Technology, Nanjing 210044, China; Ping Li, Baidu Research USA, Bellevue, WA 98004, USA; Tong Zhang, Tencent AI Lab, Shenzhen 518057, China |
| Pseudocode | Yes | Algorithm 1: Gradient Hard Thresholding Pursuit (GraHTP). |
| Open Source Code | No | The paper does not provide concrete access to source code (e.g., a specific repository link, an explicit code release statement, or code in supplementary materials) for the methodology it describes. |
| Open Datasets | Yes | The data used for evaluation include two dense data sets gisette (Guyon et al., 2005) and breast cancer (Hess et al., 2006), and two sparse data sets rcv1.binary (Lewis et al., 2005) and news20.binary (Keerthi & De Coste, 2005). |
| Dataset Splits | Yes | The data are randomly divided into the training and test sets. In each random division, 5 pCR subjects and 16 RD subjects are randomly selected to constitute the test data, and the remaining subjects form the training set with size n = 112. |
| Hardware Specification | Yes | Our algorithms are implemented in Matlab 7.12 running on a desktop with Intel Core i7 3.2G CPU and 16G RAM. |
| Software Dependencies | Yes | Our algorithms are implemented in Matlab 7.12 running on a desktop with Intel Core i7 3.2G CPU and 16G RAM. |
| Experiment Setup | Yes | For each data set, we test with sparsity parameters k ∈ {100, 200, ..., 1000} and fix the regularization parameter λ = 10⁻⁵. We initialize w(0) = 0 and set the stopping criterion as ‖w(t) − w(t−1)‖ / ‖w(t−1)‖ ≤ 10⁻⁴. |
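The method the table refers to (Algorithm 1, GraHTP) alternates a gradient step, hard thresholding to the k largest entries, and a debiasing refit on the selected support, stopping when the relative change in the iterate falls below the paper's 10⁻⁴ threshold. The following is a minimal sketch for the sparse least-squares case; the step size, iteration cap, and the specific debiasing solve are illustrative assumptions, not the paper's exact settings (the paper's Algorithm 1 also covers general smooth losses such as logistic regression).

```python
import numpy as np

def gra_htp(X, y, k, step=0.5, lam=1e-5, tol=1e-4, max_iter=500):
    """Sketch of Gradient Hard Thresholding Pursuit for sparse
    ridge-regularized least squares. `step` and `max_iter` are
    hypothetical defaults chosen for illustration."""
    n, d = X.shape
    w = np.zeros(d)  # w(0) = 0, as in the reported setup
    for _ in range(max_iter):
        # Gradient of (1/2n)||Xw - y||^2 + (lam/2)||w||^2.
        grad = X.T @ (X @ w - y) / n + lam * w
        # Gradient descent step, then keep the k largest-magnitude entries.
        w_tilde = w - step * grad
        support = np.argsort(np.abs(w_tilde))[-k:]
        # Debias: refit the loss restricted to the selected support.
        w_new = np.zeros(d)
        A = X[:, support]
        w_new[support] = np.linalg.solve(
            A.T @ A / n + lam * np.eye(k), A.T @ y / n
        )
        # Stopping rule from the paper: relative change below tol.
        if np.linalg.norm(w_new - w) <= tol * max(np.linalg.norm(w), 1e-12):
            w = w_new
            break
        w = w_new
    return w
```

The debiasing refit is what distinguishes HTP-style methods from plain iterative hard thresholding: the support comes from the thresholded gradient step, but the coefficients on that support are re-estimated exactly.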