Verification Learning: Make Unsupervised Neuro-Symbolic System Feasible

Authors: Lin-Han Jia, Wen-Chao Hu, Jie-Jing Shao, Lan-Zhe Guo, Yu-Feng Li

ICML 2025

Each entry below gives the reproducibility variable, the assessed result, and the supporting LLM response excerpt.
Research Type: Experimental
"To validate the effectiveness of the framework we proposed, we conducted experiments on 4 unsupervised tasks. These experiments were primarily extensions of previously supervised NeSy tasks. For all tasks, we used LeNet as the basic network architecture (denoted as f) for symbol recognition from X to S, with a learning rate of 0.001 and the Adam optimizer for optimization."
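The described recognition backbone can be sketched in PyTorch. This is a minimal sketch assuming 28x28 grayscale symbol images; the layer sizes follow the classic LeNet layout and are assumptions, not details taken from the paper:

```python
import torch
import torch.nn as nn

class LeNet(nn.Module):
    """LeNet-style CNN mapping a symbol image X to class logits over symbols S."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5, padding=2), nn.ReLU(),  # 28x28 -> 28x28
            nn.MaxPool2d(2),                                       # -> 14x14
            nn.Conv2d(6, 16, kernel_size=5), nn.ReLU(),            # -> 10x10
            nn.MaxPool2d(2),                                       # -> 5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120), nn.ReLU(),
            nn.Linear(120, 84), nn.ReLU(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Optimizer setup as stated in the excerpt: Adam with lr=0.001.
f = LeNet(num_classes=10)
optimizer = torch.optim.Adam(f.parameters(), lr=0.001)
logits = f(torch.randn(4, 1, 28, 28))  # forward pass on a dummy batch of 4 images
```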
Researcher Affiliation: Academia
"1 National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China; 2 School of Artificial Intelligence, Nanjing University, Nanjing, China; 3 School of Intelligence Science and Technology, Nanjing University, Nanjing, China. Correspondence to: Yu-Feng Li <EMAIL>, Lan-Zhe Guo <EMAIL>."
Pseudocode: Yes
"B. The Programs of All of the Verification Function"

```python
# Addition
def digits_to_number(digits, num_classes=2):
    number = 0
    for d in digits:
        number *= num_classes
        number += d
    return number
```
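As an illustration of how such a verification function can be used on the addition task, the sketch below decodes predicted digit sequences and checks whether they satisfy the addition rule. `verify_addition` and the base-10 setting are illustrative assumptions, not the paper's exact code:

```python
def digits_to_number(digits, num_classes=10):
    # Interpret a digit sequence as a base-`num_classes` integer.
    number = 0
    for d in digits:
        number = number * num_classes + d
    return number

def verify_addition(a_digits, b_digits, sum_digits, num_classes=10):
    """Return True iff the predicted digit sequences satisfy a + b = c."""
    a = digits_to_number(a_digits, num_classes)
    b = digits_to_number(b_digits, num_classes)
    c = digits_to_number(sum_digits, num_classes)
    return a + b == c

# e.g. a prediction decoding to 17 + 25 = 42 passes verification
ok = verify_addition([1, 7], [2, 5], [4, 2])  # → True
```

Verification of this kind supplies a training signal without symbol-level labels: only predictions whose decoded symbols satisfy the rule are consistent with the task.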
Open Source Code: Yes
"All of the code is open-sourced on GitHub: https://github.com/VerificationLearning/VerificationLearning."
Open Datasets: Yes
"We conducted experiments on four rule-based tasks without labels, and made groundbreaking progress. We were able to: identify the numbers in addition expressions based solely on the addition rule (Manhaeve et al., 2018); recognize numbers in ordered sequences based solely on the sort rule (Winters et al., 2022); identify characters in strings based solely on the string match rule (Dai et al., 2019); and identify chess pieces on a chessboard based solely on the chess rule."
Dataset Splits: No
The paper uses "unlabeled data" and mentions an input dataset X_train = [(X_1, Y_1), ..., (X_n, Y_n)], but does not specify any training, validation, or test splits, nor their sizes or percentages.
Hardware Specification: Yes
"All experiments were completed on 4 A800 GPUs."
Software Dependencies: No
"For all tasks, we used LeNet as the basic network architecture (denoted as f) for symbol recognition from X to S, with a learning rate of 0.001 and the Adam optimizer for optimization."
Experiment Setup: Yes
"For all tasks, we used LeNet as the basic network architecture (denoted as f) for symbol recognition from X to S, with a learning rate of 0.001 and the Adam optimizer for optimization. Because many of the compared algorithms train extremely slowly, to allow as comprehensive a performance comparison as possible we set a unified number of epochs to 10."
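The unified schedule described in the excerpt (Adam, lr=0.001, 10 epochs) might be wired up as below. The model, batch, and pseudo-targets are illustrative stand-ins, not the paper's actual pipeline:

```python
import torch
import torch.nn as nn

# Stand-in model and data; in the paper f is a LeNet over symbol images.
model = nn.Linear(784, 10)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)  # lr as stated
loss_fn = nn.CrossEntropyLoss()

X = torch.randn(64, 784)                       # stand-in batch of flattened 28x28 inputs
pseudo_targets = torch.randint(0, 10, (64,))   # stand-in targets (e.g. from verification)

EPOCHS = 10  # unified epoch count across all compared methods
for epoch in range(EPOCHS):
    optimizer.zero_grad()
    loss = loss_fn(model(X), pseudo_targets)
    loss.backward()
    optimizer.step()
```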