Low-Rank Adaptation for Foundation Models: A Comprehensive Review

Author: Yang, Menglin; Chen, Jialin; Zhang, Yifei; Liu, Jiahong; Zhang, Jiasheng; Ma, Qiyao; Verma, Harshit; Zhang, Qianru; Zhou, Min; King, Irwin; Ying, Rex
Year of publication: 2024
Subject:
Document type: Working Paper
Description: The rapid advancement of foundation models (large-scale neural networks trained on diverse, extensive datasets) has revolutionized artificial intelligence, enabling breakthroughs across domains such as natural language processing, computer vision, and scientific discovery. However, the substantial parameter count of these models, often reaching billions or trillions, poses significant challenges in adapting them to specific downstream tasks. Low-Rank Adaptation (LoRA) has emerged as a highly promising approach for mitigating these challenges, offering a parameter-efficient mechanism to fine-tune foundation models with minimal computational overhead. This survey provides the first comprehensive review of LoRA techniques beyond large language models to general foundation models, covering the foundations of recent techniques, emerging frontiers, and applications of low-rank adaptation across multiple domains. Finally, it discusses key challenges and future research directions in theoretical understanding, scalability, and robustness. This survey serves as a valuable resource for researchers and practitioners working with efficient foundation model adaptation.
Database: arXiv
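
To make the parameter-efficiency argument in the abstract concrete, here is a minimal sketch of the standard LoRA formulation (a frozen weight W augmented by a trainable low-rank update BA, scaled by alpha/r). This is an illustrative PyTorch-style implementation under common conventions, not code from the survey; the class name LoRALinear and the chosen hyperparameters are assumptions for illustration.

```python
# Minimal LoRA sketch (illustrative, not from the survey): the pretrained
# weight is frozen and only the low-rank factors A and B are trained,
# reducing trainable parameters from d_in*d_out to r*(d_in + d_out).
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, d_in: int, d_out: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(d_in, d_out)      # stands in for the pretrained layer
        self.base.weight.requires_grad_(False)  # freeze pretrained parameters
        self.base.bias.requires_grad_(False)
        # Low-rank factors: A is small-random-initialized, B is zero-initialized,
        # so the adapted model starts identical to the pretrained one.
        self.A = nn.Parameter(torch.randn(r, d_in) * 0.01)
        self.B = nn.Parameter(torch.zeros(d_out, r))
        self.scale = alpha / r                  # scaling used in the LoRA formulation

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = W x + b + (alpha/r) * B A x
        return self.base(x) + self.scale * (x @ self.A.t() @ self.B.t())

# Usage: adapt a 768-dimensional projection with rank 8; only A and B
# (2 * 8 * 768 parameters) receive gradients.
layer = LoRALinear(768, 768, r=8)
y = layer(torch.randn(4, 768))
```

With rank r much smaller than the layer dimensions, the trainable parameter count drops by orders of magnitude (here, 12,288 trainable parameters versus 589,824 frozen ones), which is the computational-overhead reduction the abstract refers to.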