Showing 1 - 4 of 4 for the search: '"Yue, Zichao"'
While graph neural networks (GNNs) have gained popularity for learning circuit representations in various electronic design automation (EDA) tasks, they face challenges in scalability when applied to large graphs and exhibit limited generalizability …
External link:
http://arxiv.org/abs/2403.01317
Graph transformers (GTs) have emerged as a promising architecture that is theoretically more expressive than message-passing graph neural networks (GNNs). However, typical GT models have at least quadratic complexity and thus cannot scale to large graphs …
External link:
http://arxiv.org/abs/2403.01232
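The quadratic cost mentioned in this abstract comes from scoring every pair of nodes in full self-attention. A minimal NumPy sketch (illustrative only, not the linked paper's method) contrasting dense attention, whose score matrix is N x N, with one message-passing aggregation step, whose work grows with the number of edges:

```python
# Sketch: why vanilla graph-transformer attention is at least O(N^2),
# while a message-passing GNN layer scales with |E|. Not from the paper above.
import numpy as np

def dense_attention(x):
    """Full self-attention over all N nodes: builds an N x N score matrix."""
    n, d = x.shape
    q, k, v = x, x, x                       # identity projections for brevity
    scores = q @ k.T / np.sqrt(d)           # O(N^2 * d) time, O(N^2) memory
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ v

def message_passing(x, edges):
    """One mean-aggregation GNN layer: cost grows with |E|, not N^2."""
    out = np.zeros_like(x)
    deg = np.zeros(len(x))
    for src, dst in edges:                  # O(|E|) work
        out[dst] += x[src]
        deg[dst] += 1
    return out / np.maximum(deg, 1)[:, None]

x = np.random.randn(1000, 16)               # 1000 nodes -> a 1000x1000 score matrix
edges = [(i, (i + 1) % 1000) for i in range(1000)]  # sparse ring graph, |E| = 1000
_ = dense_attention(x)
_ = message_passing(x, edges)
```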
Authors:
Chen, Hongzheng, Zhang, Jiahao, Du, Yixiao, Xiang, Shaojie, Yue, Zichao, Zhang, Niansong, Cai, Yaohui, Zhang, Zhiru
Recent advancements in large language models (LLMs) boasting billions of parameters have generated a significant demand for efficient deployment in inference workloads. The majority of existing approaches rely on temporal architectures that reuse hardware …
External link:
http://arxiv.org/abs/2312.15159
Non-volatile memory (NVM) crossbars have been identified as a promising technology for accelerating important machine learning operations, with matrix-vector multiplication being a key example. Binary neural networks (BNNs) are especially well-suited …
External link:
http://arxiv.org/abs/2308.06227
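As a rough illustration of why binarized networks map well onto crossbars (an assumption-laden sketch, not the linked paper's implementation): with {-1, +1} weights and activations, the matrix-vector product that a crossbar evaluates in the analog domain reduces, in digital form, to an XNOR followed by a popcount per output.

```python
# Sketch: matrix-vector multiplication with {-1, +1} binarized weights and
# activations, the kernel an NVM crossbar accelerates. Illustrative only.
import numpy as np

def binarize(x):
    """Map real values to {-1, +1} via sign binarization."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def bnn_matvec(w_bin, x_bin):
    """Dot products of +/-1 vectors; on a crossbar each output corresponds
    to one analog column read instead of d multiply-accumulates."""
    return w_bin @ x_bin                    # entries lie in [-d, d]

def bnn_matvec_xnor(w_bits, x_bits, d):
    """Same result with bit encoding (+1 -> bit 1, -1 -> bit 0):
    dot = 2 * popcount(XNOR(w, x)) - d."""
    xnor = ~(w_bits ^ x_bits) & ((1 << d) - 1)
    return np.array([2 * bin(row).count("1") - d for row in xnor])

d = 8
w = binarize(np.random.randn(4, d))
x = binarize(np.random.randn(d))
w_bits = np.array([int("".join("1" if v == 1 else "0" for v in row), 2) for row in w])
x_bits = int("".join("1" if v == 1 else "0" for v in x), 2)
assert np.array_equal(bnn_matvec(w, x), bnn_matvec_xnor(w_bits, x_bits, d))
```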