Structured Semidefinite Programming for Recovering Structured Preconditioners
Authors: Arun Jambulapati, Jerry Li, Christopher Musco, Kirankumar Shiragur, Aaron Sidford, Kevin Tian
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We develop a general framework for finding approximately-optimal preconditioners for solving linear systems. Leveraging this framework we obtain improved runtimes for fundamental preconditioning and linear system solving problems, including: Diagonal preconditioning. We give an algorithm which, given positive definite K ∈ R^(d×d) with nnz(K) nonzero entries, computes an ϵ-optimal diagonal preconditioner in time Õ(nnz(K) · poly(κ⋆, ϵ⁻¹)), where κ⋆ is the optimal condition number of the rescaled matrix. Structured linear systems. We give an algorithm which, given M ∈ R^(d×d) that is either the pseudoinverse of a graph Laplacian matrix or a constant spectral approximation of one, solves linear systems in M in Õ(d²) time. Our diagonal preconditioning results improve state-of-the-art runtimes of Ω(d^3.5) attained by general-purpose semidefinite programming, and our solvers improve state-of-the-art runtimes of Ω(d^ω), where ω > 2.3 is the current matrix multiplication constant. We attain our results via new algorithms for a class of semidefinite programs (SDPs) we call matrix-dictionary approximation SDPs, which we leverage to solve an associated problem we call matrix-dictionary recovery. |
| Researcher Affiliation | Collaboration | Arun Jambulapati, Simons Institute; Christopher Musco, New York University; Jerry Li, Microsoft Research; Kirankumar Shiragur, Broad Institute of MIT and Harvard; Aaron Sidford, Stanford University; Kevin Tian, University of Texas at Austin |
| Pseudocode | Yes | Algorithm 1: DecideStructuredMPC({M_i}_{i∈[n]}, κ, A_pack, δ, ϵ) |
| Open Source Code | No | The paper does not provide any concrete access information (e.g., specific repository links, explicit statements of code release, or mention of code in supplementary materials) for open-source code. |
| Open Datasets | No | The paper is theoretical and focuses on algorithm design and analysis. It does not mention the use of any specific dataset for training, validation, or testing. |
| Dataset Splits | No | The paper is theoretical and focuses on algorithm design and analysis. It does not mention any training, validation, or test dataset splits. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, processor types) used for running experiments, as the paper is theoretical and does not conduct empirical experiments requiring specific hardware. |
| Software Dependencies | No | The paper does not provide specific ancillary software details, such as library or solver names with version numbers, needed to replicate experiments. The paper is theoretical. |
| Experiment Setup | No | The paper is theoretical and focuses on algorithm design and analysis. It does not describe an experimental setup with specific hyperparameters, training configurations, or system-level settings. |
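Since the paper provides no code, the diagonal preconditioning objective it studies can be illustrated with a minimal NumPy sketch. This is not the paper's algorithm (which achieves an ϵ-optimal preconditioner in Õ(nnz(K) · poly(κ⋆, ϵ⁻¹)) time via matrix-dictionary approximation SDPs); it is the classical Jacobi heuristic, which simply rescales K by the inverse square roots of its diagonal entries and serves only as a baseline for the quantity being optimized, the condition number of the rescaled matrix.

```python
import numpy as np

def jacobi_rescaled_condition(K):
    """Condition number of D K D with D = diag(K)^(-1/2).

    This is the Jacobi (diagonal) scaling heuristic: a cheap baseline
    for the diagonal preconditioning problem, NOT the paper's
    SDP-based algorithm. K must be symmetric positive definite.
    """
    d_inv_sqrt = 1.0 / np.sqrt(np.diag(K))
    # Form D K D without building the dense diagonal matrix D.
    K_rescaled = (d_inv_sqrt[:, None] * K) * d_inv_sqrt[None, :]
    return np.linalg.cond(K_rescaled)

# Example: a well-conditioned SPD matrix hidden behind a badly
# skewed diagonal scaling. Jacobi rescaling undoes the skew.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
M = A @ A.T + 50.0 * np.eye(50)          # well-conditioned SPD core
s = np.logspace(0, 3, 50)                # scales spanning 1 to 1000
K = (s[:, None] * M) * s[None, :]        # K = S M S, badly scaled

print(np.linalg.cond(K))                 # large
print(jacobi_rescaled_condition(K))      # much smaller
```

On such synthetically mis-scaled inputs the rescaled condition number drops by orders of magnitude; how close any diagonal rescaling can get to the optimum κ⋆ is exactly what the paper's ϵ-optimal algorithm controls.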