LaMAGIC: Language-Model-based Topology Generation for Analog Integrated Circuits

Author: Chang, Chen-Chia, Shen, Yikang, Fan, Shaoze, Li, Jing, Zhang, Shun, Cao, Ningyuan, Chen, Yiran, Zhang, Xin
Year of publication: 2024
Source: Proceedings of the 41st International Conference on Machine Learning, PMLR 235:6253-6262, 2024
Document type: Working Paper
Description: In the realm of electronic and electrical engineering, automation of analog circuit design is increasingly vital given the complexity and customized requirements of modern applications. However, existing methods only develop search-based algorithms that require many simulation iterations to design a custom circuit topology, which is usually a time-consuming process. To this end, we introduce LaMAGIC, a pioneering language-model-based topology generation model that leverages supervised fine-tuning for automated analog circuit design. LaMAGIC can efficiently generate an optimized circuit design from a custom specification in a single pass. Our approach involves a meticulous development and analysis of various input and output formulations for circuits. These formulations ensure canonical representations of circuits and align with the autoregressive nature of LMs, effectively addressing the challenges of representing analog circuits as graphs. The experimental results show that LaMAGIC achieves a success rate of up to 96% under a strict tolerance of 0.01. We also examine the scalability and adaptability of LaMAGIC, specifically testing its performance on more complex circuits. Our findings reveal the enhanced effectiveness of our adjacency-matrix-based circuit formulation with floating-point input, suggesting its suitability for handling intricate circuit designs. This research not only demonstrates the potential of language models in graph generation, but also builds a foundational framework for future explorations in automated analog circuit design.
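To make the idea of an adjacency-matrix circuit formulation concrete, here is a minimal, hypothetical sketch of serializing a small circuit graph plus a floating-point specification into a single canonical string of the kind an autoregressive LM could consume. The function name, node labels, and separators are illustrative assumptions, not the actual LaMAGIC format.

```python
# Hypothetical sketch (not LaMAGIC's exact format): flatten a circuit
# topology into an adjacency matrix plus floating-point specification.

def serialize_circuit(nodes, edges, spec):
    """Flatten a circuit graph into one deterministic string.

    nodes: ordered list of component/terminal names
           (the fixed order is what makes the representation canonical)
    edges: set of (i, j) index pairs marking connected components
    spec:  dict of target specifications, e.g. {"vout_ratio": 0.45}
    """
    n = len(nodes)
    # Symmetric 0/1 adjacency matrix for the undirected connection graph.
    matrix = [[0] * n for _ in range(n)]
    for i, j in edges:
        matrix[i][j] = matrix[j][i] = 1
    rows = [" ".join(str(v) for v in row) for row in matrix]
    # Sorting spec keys keeps the serialization order-independent.
    spec_str = " ".join(f"{k}={v:.2f}" for k, v in sorted(spec.items()))
    return " ; ".join([" ".join(nodes), *rows, spec_str])

# Toy power-converter-like graph with five nodes.
example = serialize_circuit(
    nodes=["VIN", "SW", "L", "C", "VOUT"],
    edges={(0, 1), (1, 2), (2, 3), (2, 4)},
    spec={"vout_ratio": 0.45, "eff": 0.90},
)
print(example)
```

Because the node ordering and key sorting are fixed, any two runs over the same circuit yield byte-identical strings, which is the property a canonical representation needs for supervised fine-tuning.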
Comment: Proceedings of the 41st International Conference on Machine Learning, PMLR 235:6253-6262 https://proceedings.mlr.press/v235/chang24c.html
Database: arXiv