Group SLOPE Penalized Low-Rank Tensor Regression

Authors: Yang Chen, Ziyan Luo

JMLR 2023

Reproducibility Variable Result LLM Response
Research Type Experimental Numerical studies conducted on synthetic data and a real human brain connection data set illustrate the efficacy of the proposed TgSLOPE estimation procedure. Keywords: difference-of-convex, false discovery rate, group sparsity, low-rankness, tensor regression
Researcher Affiliation Academia Yang Chen EMAIL Ziyan Luo EMAIL School of Mathematics and Statistics Beijing Jiaotong University Beijing, P. R. China
Pseudocode Yes Algorithm 1: pDCAe for solving TgSLOPE (4)
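The pDCAe template named in the algorithm row (proximal difference-of-convex algorithm with extrapolation) can be illustrated on a toy DC problem. The sketch below is not the paper's TgSLOPE solver, which involves a tensor coefficient and a sorted-ℓ1 group penalty; it only shows the generic pDCAe iteration on the simpler objective 0.5‖Ax − b‖² + λ(‖x‖₁ − ‖x‖₂), and all function names are hypothetical:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pdca_e(A, b, lam=0.1, n_iter=300):
    """Sketch of a pDCAe-type iteration on the toy DC program
    min_x 0.5*||A x - b||^2 + lam*(||x||_1 - ||x||_2).
    Smooth part: 0.5*||A x - b||^2; convex nonsmooth part: lam*||x||_1;
    concave part: -lam*||x||_2 (handled via a subgradient at the current iterate).
    """
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth gradient
    x = x_prev = np.zeros(n)
    for k in range(1, n_iter + 1):
        beta = (k - 1) / (k + 2)             # extrapolation weight
        y = x + beta * (x - x_prev)
        # subgradient of lam*||x||_2 at the current iterate (0 at the origin)
        nrm = np.linalg.norm(x)
        xi = lam * x / nrm if nrm > 0 else np.zeros(n)
        grad = A.T @ (A @ y - b) - xi
        x_prev, x = x, soft_threshold(y - grad / L, lam / L)
    return x
```

The extrapolation step mirrors accelerated proximal gradient methods; the concave part enters only through its subgradient, which is the defining feature of the DCA family.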
Open Source Code No The paper does not provide concrete access to source code for the methodology described. It mentions that FastProxSL1 can be used to solve the W-subproblem and references a prior work, but does not state that the authors' implementation for TgSLOPE is open-sourced.
Open Datasets Yes In this subsection, we test our proposed TgSLOPE comparing with the other three approaches on a real human brain connection (HBC) data set from the Human Connectome Project (HCP), which aims to build a network map between the anatomical and functional connectivity within healthy human brains (Van Essen et al., 2013). The preprocessed HBC data set is provided by Hu et al. (2022)... (The HBC data can be found at https://wiki.humanconnectome.org/display/Public Data/).
Dataset Splits Yes For each LROD rank selected from 1 to 12, the HBC data set is randomly divided into an 80 percent training set and a 20 percent testing set; this random split is repeated 20 times.
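The splitting protocol in the row above can be sketched as follows. Since the authors' code is not released, the function name, seed handling, and use of Python rather than the paper's MATLAB are assumptions for illustration only:

```python
import numpy as np

def repeated_splits(n_samples, n_repeats=20, train_frac=0.8, seed=0):
    """Generate repeated random train/test index splits (hypothetical helper).

    Mirrors the protocol described above: for each repeat, permute the
    sample indices and take the first 80% as training, the rest as testing.
    """
    rng = np.random.default_rng(seed)
    n_train = int(round(train_frac * n_samples))
    splits = []
    for _ in range(n_repeats):
        perm = rng.permutation(n_samples)
        splits.append((perm[:n_train], perm[n_train:]))
    return splits

# Example: 20 random 80/20 splits of a 100-sample data set.
splits = repeated_splits(100)
```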
Hardware Specification Yes All numerical experiments are implemented in MATLAB (R2021a), running on a laptop with Intel Core i5-8265U CPU (1.60GHz) and 16 GB RAM.
Software Dependencies Yes All numerical experiments are implemented in MATLAB (R2021a), running on a laptop with Intel Core i5-8265U CPU (1.60GHz) and 16 GB RAM.
Experiment Setup Yes Set n = 2000, p = 1000, p1 = p2 = 10, K = 20 and sparsity s = 25:25:250. We perform 100 independent replications for each sparsity and target TgFDR level (q = 0.05, 0.1). For TgLASSO, the tuning parameter is selected by 5-fold cross-validation. We choose the tuning parameters via a BIC-type criterion on the whole data set, which minimizes BIC = ||Y − B̂ ×₃ X||²_F + (||B̂||_{f,0} + (p1 + p2)K) log(n p1 p2). Set the tolerance for termination in algorithms to be ε = 10⁻⁴.
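The experiment grid described in the setup row can be laid out as below. This is a hedged sketch of the stated configuration only (variable names are hypothetical, and Python is used in place of the paper's MATLAB, whose `25:25:250` range notation the sparsity grid reproduces):

```python
import numpy as np

# Dimensions and parameters from the setup above.
n, p = 2000, 1000          # samples and predictor dimension
p1 = p2 = 10               # tensor mode sizes
K = 20                     # rank parameter
tol = 1e-4                 # termination tolerance for the algorithms

# Sparsity grid: MATLAB 25:25:250, i.e. 25, 50, ..., 250.
sparsity_levels = np.arange(25, 251, 25)

# Target TgFDR levels and replication count.
target_fdr_levels = [0.05, 0.10]
n_replications = 100

# Every (sparsity, target FDR) configuration to be run 100 times each.
configs = [(s, q) for s in sparsity_levels for q in target_fdr_levels]
```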