Bimodal Depth-First Search for Scalable GAC for AllDifferent

Authors: Sulian Le Bozec-Chiffoleau, Nicolas Beldiceanu, Charles Prud'homme, Gilles Simonin, Xavier Lorca

IJCAI 2025

Each entry below lists a reproducibility variable, the assessed result, and the LLM response quoting the paper as evidence.
Research Type: Experimental. "We compared the resolution times of N-Queens and Latin-Squares, as we can solve most instances within our size range. Figure 5 reports the time in seconds to find the first solution. For each algorithm, a dot represents a solution found. Both problems show that COMP and TUNED are faster than CLASSIC and PARTIAL, up to 14 times faster."
Researcher Affiliation: Academia. "IMT Atlantique, LS2N, UMR CNRS 6004, F-44307 Nantes, France; Centre Génie Industriel, IMT Mines Albi, Université de Toulouse, Albi. EMAIL, EMAIL"
Pseudocode: Yes. "Algorithm 1 Bimodal BFS (Bi-BFS) ... Algorithm 2 Bimodal DFS (Bi-DFS)"
Open Source Code: Yes. "We integrated our algorithms into the open source Choco-solver [Prud'homme and Fages, 2022]. ... https://github.com/SulianLBC/BimodalAllDiff-IJCAI-2025"
Open Datasets: Yes. "We study scalability by analysing four well-known problems that mainly use the ALLDIFFERENT constraint: N-Queens, Latin-Squares, Langford (where k is set to 2), and Golomb Ruler... queens4 model from https://www.hakank.org/minizinc and latin-squares-fd2, langford and golomb models from https://github.com/MiniZinc/minizinc-benchmarks"
Dataset Splits: No. "We study scalability by analysing four well-known problems that mainly use the ALLDIFFERENT constraint: N-Queens, Latin-Squares, Langford (where k is set to 2), and Golomb Ruler, where we gradually increase the size of a parameter." (Explanation: The paper deals with constraint programming problems, not typical machine learning datasets that are split into train/test/validation sets. The "datasets" here are problem instances, whose sizes are varied, not partitioned.)
Hardware Specification: Yes. "The experiments were carried out on an Intel Xeon 6230 with 6144M RAM per job."
Software Dependencies: No. "We integrated our algorithms into the open source Choco-solver [Prud'homme and Fages, 2022]." (Explanation: The paper mentions Choco-solver but does not provide a specific version number for it, only a citation to its paper.)
Experiment Setup: Yes. "We considered the following strategies for our bimodal approach: CLASSIC: f(x) = TRUE, ∀x ∈ X. ... TUNED: f(x) = (|D(x)| < p · |L|), ∀x ∈ X. ... We set a 20 minutes (1200 seconds) time limit for the experiments. ... Each filtering algorithm is paired with a higher priority instantiation propagator that removes instantiated values from the domains of other variables..."
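As a rough illustration (not the authors' implementation), the two strategy predicates quoted above can be sketched as follows. The function names, the default threshold p = 0.5, and the plain-integer interface are assumptions for the sake of the example; in the paper, |D(x)| is the domain size of variable x and |L| is the number of values.

```python
# Hypothetical sketch of the mode-selection predicates quoted above.
# CLASSIC always answers True; TUNED compares a variable's domain size
# |D(x)| against a fraction p of |L|. Names and the default value of p
# are illustrative assumptions, not taken from the paper.

def classic(domain_size: int, num_values: int, p: float = 0.5) -> bool:
    """CLASSIC: f(x) = TRUE for every variable x."""
    return True

def tuned(domain_size: int, num_values: int, p: float = 0.5) -> bool:
    """TUNED: f(x) = (|D(x)| < p * |L|)."""
    return domain_size < p * num_values

# Example: with p = 0.5 and |L| = 10, a variable with |D(x)| = 3
# satisfies the TUNED predicate, while one with |D(x)| = 8 does not.
```

Under this reading, TUNED switches mode only for variables whose domains are small relative to the value set, while CLASSIC applies the same mode to every variable.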