Learning from Logical Constraints with Lower- and Upper-Bound Arithmetic Circuits
Authors: Lucile Dierckx, Alexandre Dubray, Siegfried Nijssen
IJCAI 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments show that adding the upper-bound AC also helps the learning process in practice, allowing for similar or better generalisation than working solely with fully compiled ACs, even with less than 150 seconds of partial compilation. Section 5 (Experiments) describes the solver and datasets used, then answers: (I) How well does LUBAC learning generalise? (II) How are the initial gradients impacted by which AC is used? (III) What is the time overhead to compile both lower- and upper-bound ACs? |
| Researcher Affiliation | Academia | 1 ICTEAM, UCLouvain, Belgium; 2 TRAIL Institute, Belgium; 3 DTAI, KU Leuven, Leuven, Belgium |
| Pseudocode | Yes | Algorithm 1 Compilation algorithm from a search trace |
| Open Source Code | Yes | https://github.com/aia-uclouvain/schlandals |
| Open Datasets | Yes | The probabilistic queries in our experiments originate from Bayesian Networks for which we wish to learn parameters that lead to desired marginal probabilities. This is a well-known problem, and we use networks from the bnlearn R package [Scutari, 2009]. |
| Dataset Splits | Yes | Appendix C (available here) provides the full list of training parameters and train-test split. |
| Hardware Specification | No | No specific hardware details (like GPU/CPU models) are provided for running experiments. The paper only vaguely mentions 'hardware utilisation' without specifics. |
| Software Dependencies | No | The paper mentions implementing parts into the 'Schlandals solver [Dubray et al., 2023]' and using 'networks from the bnlearn R package [Scutari, 2009]', but does not provide specific version numbers for these software components or any other libraries. |
| Experiment Setup | No | Appendix C (available here) provides the full list of training parameters and train-test split. The training parameters are thus deferred to the appendix rather than specified in the main text. |
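To make the learning setup assessed above concrete, the following is a minimal sketch, not the paper's implementation, of the idea behind LUBAC-style learning: a query probability is only available through a lower-bound and an upper-bound arithmetic circuit, and a parameter is tuned by gradient descent so that the bounded probability matches a target marginal. The functions `lower_ac` and `upper_ac` are hypothetical stand-ins for partially compiled circuits; the midpoint loss and finite-difference gradient are illustrative simplifications.

```python
# Hypothetical stand-ins for a partially compiled arithmetic circuit (AC):
# the true query probability lies between lower_ac(theta) and upper_ac(theta).
def lower_ac(theta):
    return 0.9 * theta          # underestimates the true probability

def upper_ac(theta):
    return 0.9 * theta + 0.1    # overestimates the true probability

def loss_and_grad(theta, target, eps=1e-6):
    """Squared error on the midpoint of [lower, upper], with a
    finite-difference gradient (illustrative, not the paper's method)."""
    mid = 0.5 * (lower_ac(theta) + upper_ac(theta))
    mid_eps = 0.5 * (lower_ac(theta + eps) + upper_ac(theta + eps))
    loss = (mid - target) ** 2
    grad = 2.0 * (mid - target) * (mid_eps - mid) / eps
    return loss, grad

def train(target, theta=0.5, lr=0.5, steps=200):
    """Gradient descent on theta until the bounded probability matches target."""
    for _ in range(steps):
        _, g = loss_and_grad(theta, target)
        theta -= lr * g
    return theta
```

For example, `train(0.55)` drives the interval midpoint toward the target marginal 0.55; with fully compiled ACs the two bounds would coincide, and the same loss reduces to the exact-probability case.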