Iteration Complexity of Feasible Descent Methods for Convex Optimization
Authors: Po-Wei Wang, Chih-Jen Lin
JMLR 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we proved the global linear convergence on a wide range of algorithms when they are applied to some non-strongly convex problems. In particular, we are the first to prove O(log(1/ϵ)) time complexity of cyclic coordinate descent methods on dual problems of support vector classification and regression. ... Our main contribution is a global error bound for the non-strongly convex problem (1), which ensures the global linear convergence of feasible descent methods. The main theorems are presented in Section 2, followed by examples in Section 3. The global error bound is discussed in Section 4, and the proof of global linear convergence of feasible descent methods is given in Section 5. |
| Researcher Affiliation | Academia | Po-Wei Wang and Chih-Jen Lin, Department of Computer Science, National Taiwan University, Taipei 106, Taiwan |
| Pseudocode | No | The paper describes algorithms like cyclic coordinate descent and Gauss-Seidel methods (Section 3) through mathematical definitions and textual descriptions, but it does not include explicitly labeled 'Pseudocode' or 'Algorithm' blocks with structured, code-like formatting. |
| Open Source Code | No | The paper does not contain any statements or links indicating that source code for the described methodology is publicly available. |
| Open Datasets | No | The paper discusses theoretical applications to problems such as the 'dual form of L1-loss linear SVM' and 'dual form of L1-loss ε-insensitive SVR,' but it does not describe or use any specific datasets in an experimental context, nor does it provide access information for any publicly available or open datasets. |
| Dataset Splits | No | The paper is theoretical and does not involve experimental evaluation on datasets, therefore it does not provide any information about dataset splits. |
| Hardware Specification | No | The paper is entirely theoretical, focusing on mathematical proofs and convergence analysis of algorithms. It does not describe any experimental setup or specify any hardware used for running experiments. |
| Software Dependencies | No | The paper is theoretical and focuses on mathematical proofs and convergence analysis, thus it does not specify any software dependencies or versions used for empirical evaluations. |
| Experiment Setup | No | The paper is theoretical and focuses on algorithm analysis and proofs, without presenting any experimental results. Therefore, it does not include details on experimental setup, hyperparameters, or training configurations. |
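The paper analyzes cyclic coordinate descent on the dual of the L1-loss linear SVM as one of its motivating examples. As a rough illustration of the method being analyzed (not the paper's own code, which is not released), the following is a minimal sketch of that solver, assuming the standard dual formulation min_a 0.5·aᵀQa − eᵀa subject to 0 ≤ a_i ≤ C with Q_ij = y_i y_j x_iᵀx_j; the function name and default parameters are hypothetical.

```python
import numpy as np

def dual_cd_l1_svm(X, y, C=1.0, n_epochs=50):
    """Cyclic coordinate descent on the dual of the L1-loss linear SVM:
        min_a  0.5 * a' Q a - e' a   s.t.  0 <= a_i <= C,
    where Q_ij = y_i y_j x_i' x_j.  Maintains w = sum_i a_i y_i x_i so each
    partial gradient costs O(d)."""
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    Qii = np.einsum('ij,ij->i', X, X)  # diagonal of Q (y_i^2 = 1)
    for _ in range(n_epochs):
        for i in range(n):             # fixed cyclic order of coordinates
            if Qii[i] == 0:
                continue
            g = y[i] * w.dot(X[i]) - 1.0                 # partial gradient
            a_new = np.clip(alpha[i] - g / Qii[i], 0.0, C)  # projected step
            w += (a_new - alpha[i]) * y[i] * X[i]        # rank-1 update of w
            alpha[i] = a_new
    return w, alpha
```

Each inner update is a one-dimensional projected Newton step, so a full pass over the coordinates is one "iteration" in the sense whose count the paper bounds by O(log(1/ε)) via the global error bound.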