AnalogGenie: A Generative Engine for Automatic Discovery of Analog Circuit Topologies
Authors: Jian Gao, Weidong Cao, Junyi Yang, Xuan Zhang
ICLR 2025
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results show the remarkable generation performance of AnalogGenie in broadening the variety of analog ICs, increasing the number of devices within a single design, and discovering unseen circuit topologies far beyond any prior arts. |
| Researcher Affiliation | Academia | 1 Northeastern University, 2 The George Washington University. Email: weidong.cao@gwu.edu |
| Pseudocode | No | The paper describes methods in prose and includes theoretical proofs (Theorem 3.2.1) but does not provide structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our source code is available at https://github.com/xz-group/AnalogGenie. |
| Open Datasets | No | For our open-sourced circuit dataset, we provide its statistics in Appendix A.1. We will make our code and dataset public on Github in the future. |
| Dataset Splits | Yes | During training, we first split the topology data set into train and validation sets with a 9 to 1 ratio. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for running experiments, such as GPU or CPU models. |
| Software Dependencies | No | The paper mentions using a 'GPT model' and 'Ngspice simulation infrastructure' but does not specify any version numbers for software dependencies. |
| Experiment Setup | Yes | Our AnalogGenie model is a decoder-only transformer consisting of 6 hidden layers and 6 attention heads with 11.825 million parameters in total. The vocab size is 1029. The maximum sequence length is 1024. |
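The paper reports a 9:1 train/validation split of the topology dataset. A minimal sketch of such a split (the function name, seed, and shuffling step are illustrative assumptions, not details from the paper):

```python
import random

def train_val_split(items, val_ratio=0.1, seed=0):
    """Shuffle a dataset and split it into train/validation sets (9:1 by default)."""
    rng = random.Random(seed)          # fixed seed for a reproducible split
    shuffled = items[:]                # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    n_val = int(len(shuffled) * val_ratio)
    return shuffled[n_val:], shuffled[:n_val]

# Example: 100 topologies -> 90 train, 10 validation
train, val = train_val_split(list(range(100)))
```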
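The reported setup (6 layers, 6 heads, vocab size 1029, max sequence length 1024, ~11.8M parameters) can be captured in a small config sketch. The embedding width below is a hypothetical value chosen so a standard GPT-style parameter count lands near the reported total; the paper's table rows here do not state it:

```python
from dataclasses import dataclass

@dataclass
class GPTConfig:
    n_layer: int = 6       # hidden layers, as reported
    n_head: int = 6        # attention heads, as reported
    vocab_size: int = 1029 # as reported
    block_size: int = 1024 # max sequence length, as reported
    n_embd: int = 384      # HYPOTHETICAL width (6 heads x 64) consistent with ~11.8M params

def approx_params(cfg: GPTConfig) -> int:
    """Rough GPT-style parameter count: embeddings plus per-layer weight matrices."""
    tok_emb = cfg.vocab_size * cfg.n_embd
    pos_emb = cfg.block_size * cfg.n_embd
    per_layer = 12 * cfg.n_embd ** 2   # attention (4*d^2) + MLP (8*d^2), biases/norms ignored
    return tok_emb + pos_emb + cfg.n_layer * per_layer
```

With these assumed values the estimate is roughly 11.4M, in the same ballpark as the reported 11.825M; the gap is consistent with biases, layer norms, and the output head, which the rough formula ignores.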