Consistent Second-Order Conic Integer Programming for Learning Bayesian Networks
Authors: Simge Kucukyavuz, Ali Shojaie, Hasan Manzour, Linchuan Wei, Hao-Hsiang Wu
JMLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our numerical results demonstrate the effectiveness of the proposed approaches. ... In this section, we report the results of our numerical experiments that compare different formulations and evaluate the effect of different tuning parameters and estimation strategies. |
| Researcher Affiliation | Academia | Simge Küçükyavuz EMAIL Department of Industrial Engineering and Management Sciences Northwestern University; Ali Shojaie EMAIL Department of Biostatistics University of Washington; Hasan Manzour EMAIL Department of Industrial and Systems Engineering University of Washington; Linchuan Wei EMAIL Department of Industrial Engineering and Management Sciences Northwestern University; Hao-Hsiang Wu EMAIL Department of Management Science National Yang Ming Chiao Tung University |
| Pseudocode | No | The paper describes methods mathematically and textually but does not include any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statements about code release, a link to a code repository, or mention of code in supplementary materials for the methodology described. |
| Open Datasets | Yes | We use the package pcalg in R to generate random graphs. ... We compare the performance of the methods on twelve publicly available networks from Manzour et al. (2021) and Bayesian Network Repository (bnlearn). |
| Dataset Splits | No | The paper describes how synthetic data is generated and used to learn DAG structures (e.g., 'We consider m ∈ {10, 20, 30, 40} nodes and n = 100 samples'). It does not specify explicit training, validation, or test splits for model evaluation in the typical machine learning sense; instead, the full generated or public datasets are used for learning, and the resulting DAG structure is compared against the ground truth. |
| Hardware Specification | Yes | Our experiments are performed on a cluster operating on UNIX with Intel Xeon E5-2640v4 2.4GHz. ... These experiments are executed on a laptop with a Windows 10 operating system, an Intel Core i7-8750H 2.2-GHz CPU, 8-GB DRAM using Python 3.8 with Gurobi 9.1.1 Optimizer. |
| Software Dependencies | Yes | All formulations are implemented in the Python programming language. Gurobi 8.1 is used as the solver. ... using Python 3.8 with Gurobi 9.1.1 Optimizer. ... We use the package pcalg in R to generate random graphs. |
| Experiment Setup | Yes | Unless otherwise stated, a time limit of 50m (in seconds), where m denotes the number of nodes, and an MIQP relative optimality gap of 0.01 are imposed across all experiments... Unless otherwise stated, we assume λ_n = log(n), which corresponds to the Bayesian information criterion (BIC) score. ... we experiment with M = γ max_{(j,k)∈E} \|β^R_{jk}\| for γ ∈ {2, 5, 10} in Table 3... We generate 10 random Erdős–Rényi graphs for each setting (m, n, d). |
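The experiment-setup row mentions generating random Erdős–Rényi graphs (done in the paper with the R package pcalg) and the BIC penalty λ_n = log(n). A minimal sketch of the same two ideas in pure Python — the function names `random_er_dag` and `bic_penalty` are hypothetical, not from the paper, and the edge probability p = d/(m−1) is an assumption for obtaining expected degree roughly d:

```python
import math
import random

def random_er_dag(m, d, seed=0):
    """Generate a random Erdős–Rényi DAG on m nodes (hypothetical sketch).

    Each edge (j, k) with j < k is included independently with
    probability p = d / (m - 1); orienting every edge from the
    lower-indexed to the higher-indexed node guarantees acyclicity.
    """
    rng = random.Random(seed)
    p = d / (m - 1)
    return [(j, k) for j in range(m) for k in range(j + 1, m)
            if rng.random() < p]

def bic_penalty(n):
    """lambda_n = log(n): the penalty weight recovering the BIC score."""
    return math.log(n)

# One of the paper's settings: m = 10 nodes, n = 100 samples.
edges = random_er_dag(m=10, d=2, seed=42)
print(len(edges), bic_penalty(100))
```

Because edges always point from lower to higher node index, the graph is acyclic by construction, so no explicit cycle check is needed.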