StreamAdapter: Efficient Test Time Adaptation from Contextual Streams
Author: Dilxat Muhtar, Yelong Shen, Yaming Yang, Xiaodong Liu, Yadong Lu, Jianfeng Liu, Yuefeng Zhan, Hao Sun, Weiwei Deng, Feng Sun, Xueliang Zhang, Jianfeng Gao, Weizhu Chen, Qi Zhang
Year of publication: 2024
Document type: Working Paper
Description: In-context learning (ICL) allows large language models (LLMs) to adapt to new tasks directly from given demonstrations without requiring gradient updates. While recent advances have expanded context windows to accommodate more demonstrations, this approach increases inference costs without necessarily improving performance. To mitigate these issues, we propose StreamAdapter, a novel approach that directly updates model parameters from context at test time, eliminating the need for explicit in-context demonstrations. StreamAdapter employs context mapping and weight absorption mechanisms to dynamically transform ICL demonstrations into parameter updates with minimal additional parameters. By reducing reliance on numerous in-context examples, StreamAdapter significantly reduces inference costs and allows for efficient inference with constant time complexity, regardless of demonstration count. Extensive experiments across diverse tasks and model architectures demonstrate that StreamAdapter achieves comparable or superior adaptation capability to ICL while requiring significantly fewer demonstrations. The superior task adaptation and context encoding capabilities of StreamAdapter on both language understanding and generation tasks provide a new perspective for adapting LLMs at test time using context, allowing for more efficient adaptation across scenarios and more cost-effective inference.
Comment: 22 pages, 9 figures
Database: arXiv
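The abstract's core idea — mapping a context of demonstrations into a low-rank parameter update that is absorbed into the base weights, so inference cost is constant in the number of demonstrations — can be illustrated with a toy sketch. This is a hypothetical simplification for intuition only, not the paper's actual architecture: the pooling step, the projection matrices `P_a`/`P_b`, and all sizes are assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, n_demos = 8, 2, 5  # hidden size, adapter rank, demo count (toy values)

W = rng.standard_normal((d, d))          # frozen base weight of one layer
ctx = rng.standard_normal((n_demos, d))  # hidden states of in-context demos

# "Context mapping" (hypothetical stand-in): pool the demonstrations into a
# single summary vector, then project it into a rank-r weight update.
# Pooling makes the adapted model independent of the demo count at inference.
P_a = rng.standard_normal((d, d * r)) * 0.01
P_b = rng.standard_normal((d, r * d)) * 0.01
summary = ctx.mean(axis=0)               # (d,)
A = (summary @ P_a).reshape(d, r)        # (d, r)
B = (summary @ P_b).reshape(r, d)        # (r, d)

# "Weight absorption": fold the low-rank update into the base weight once,
# so each later forward pass is a single plain matmul with no per-demo cost.
W_adapted = W + A @ B

x = rng.standard_normal(d)
y = x @ W_adapted                        # constant-time inference
assert np.allclose(y, x @ W + (x @ A) @ B)
```

The key property the sketch captures is that once `A @ B` is absorbed, the demonstrations can be discarded: prompt length and inference latency no longer grow with the number of examples.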