Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty, so scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Accelerated Alternating Projections for Robust Principal Component Analysis
Authors: HanQin Cai, Jian-Feng Cai, Ke Wei
JMLR 2019 | Venue PDF | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Theoretical recovery guarantee has been established which shows linear convergence of the proposed algorithm. Empirical performance evaluations establish the advantage of our algorithm over other state-of-the-art algorithms for robust PCA. |
| Researcher Affiliation | Academia | HanQin Cai (Department of Mathematics, University of California, Los Angeles, Los Angeles, California, USA); Jian-Feng Cai (Department of Mathematics, Hong Kong University of Science and Technology, Hong Kong SAR, China); Ke Wei (School of Data Science, Fudan University, Shanghai, China) |
| Pseudocode | Yes | Algorithm 1: Robust PCA by Accelerated Alternating Projections (AccAltProj); Algorithm 2: Trim; Algorithm 3: Initialization by Two Steps of AltProj |
| Open Source Code | Yes | The codes for AccAltProj can be found online: https://github.com/caesarcai/AccAltProj_for_RPCA. |
| Open Datasets | Yes | The two videos we have used for this test are Shoppingmall and Restaurant, which can be found online (website: perception.i2r.a-star.edu.sg/bk_model/bk_index.html). |
| Dataset Splits | No | The paper describes generating synthetic datasets and using videos for background subtraction, which involve decomposing a matrix into low-rank and sparse components. It defines how the synthetic data is generated (e.g., L = PQᵀ, sparse entries sampled uniformly) and applies algorithms to the full input matrix D. However, it does not specify traditional training, testing, or validation splits for these datasets. The objective is reconstruction from the given D, not a supervised learning task with explicit data splits. |
| Hardware Specification | Yes | The tests are conducted on a laptop equipped with 64-bit Windows 7, Intel i7-4712HQ (4 Cores at 2.3 GHz) and 16GB DDR3L-1600 RAM, and executed from MATLAB R2017a. |
| Software Dependencies | Yes | The tests are conducted on a laptop equipped with 64-bit Windows 7, Intel i7-4712HQ (4 Cores at 2.3 GHz) and 16GB DDR3L-1600 RAM, and executed from MATLAB R2017a. ... we used the PROPACK library for this task when the size of D is large and r is relatively small. |
| Experiment Setup | Yes | The test algorithms are terminated when either the relative computing error is smaller than a tolerance, err_k < tol, or a maximum number of 100 iterations is reached. ... We set tol = 10⁻⁶ in the stopping condition for all the test algorithms. The backtracking line search has been used in GD ... The parameters β and β_init are set to be β = 1.1µr/(2√(mn)) and β_init = 1.1µr/√(mn) in our experiments, and γ = 0.5 is used when α < 0.55 and γ = 0.65 is used when α ≥ 0.55. We set tol = 10⁻⁴ in the stopping criteria. |
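The synthetic setup quoted in the Dataset Splits row (L = PQᵀ with a uniformly sampled sparse corruption) can be sketched as below. This is a minimal illustration, not the paper's exact protocol: the Gaussian factor distribution, the corruption magnitude `scale`, and the function name `synthetic_rpca_data` are assumptions made here for the example.

```python
import numpy as np

def synthetic_rpca_data(m, n, r, alpha, seed=0):
    """Build D = L + S: L = P @ Q.T has rank r, and S has (roughly) an
    alpha fraction of nonzero entries sampled uniformly.
    Distributions and magnitudes are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    P = rng.standard_normal((m, r))   # assumed Gaussian factors
    Q = rng.standard_normal((n, r))
    L = P @ Q.T
    S = np.zeros((m, n))
    mask = rng.random((m, n)) < alpha           # support of the sparse part
    scale = np.abs(L).mean()                    # corruption range (an assumption)
    S[mask] = rng.uniform(-scale, scale, mask.sum())
    return L + S, L, S

# Example: a 500 x 500 problem with rank 5 and 10% corruption.
D, L, S = synthetic_rpca_data(500, 500, 5, 0.1)
```

The recovery task then takes only D as input and asks for the L/S decomposition, which is why no train/test split applies.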
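The stopping rule quoted in the Experiment Setup row (terminate when the relative error err_k = ‖D − L_k − S_k‖_F / ‖D‖_F drops below tol, or after 100 iterations) can be illustrated with a plain alternating-projections loop. This is a simplified sketch, not the paper's AccAltProj: it omits the tangent-space projection, trimming, and the β/γ threshold schedule, and the hard-threshold level used below is an assumption.

```python
import numpy as np

def alt_proj_sketch(D, r, tol=1e-6, max_iter=100):
    """Alternate a rank-r SVD truncation with hard-thresholding of the
    residual; stop when err_k < tol or after max_iter iterations
    (mirroring the paper's reported stopping criteria)."""
    normD = np.linalg.norm(D, 'fro')
    S = np.zeros_like(D)
    err = np.inf
    for _ in range(max_iter):
        # Project D - S onto the set of rank-r matrices via truncated SVD.
        U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
        L = (U[:, :r] * s[:r]) @ Vt[:r]
        # Hard-threshold the residual; using the (r+1)-th singular value
        # as the level is a simple illustrative choice, not the paper's rule.
        R = D - L
        S = np.where(np.abs(R) > s[r], R, 0.0)
        err = np.linalg.norm(D - L - S, 'fro') / normD
        if err < tol:
            break
    return L, S, err

# Small demo problem (sizes and corruption level chosen for illustration).
rng = np.random.default_rng(1)
L0 = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 100))
S0 = np.where(rng.random((100, 100)) < 0.05,
              rng.uniform(-1, 1, (100, 100)), 0.0)
L_hat, S_hat, err = alt_proj_sketch(L0 + S0, 5)
```

The 100-iteration cap and the relative-error test match the quoted setup; everything else in the loop body is a stand-in for the accelerated algorithm described in the paper's Algorithms 1-3.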