Showing 1 - 1 of 1 for search: '"Mishra, Paramita"'
Low-Rank Adaptation (LoRA) and other parameter-efficient fine-tuning (PEFT) methods provide low-memory, storage-efficient solutions for personalizing text-to-image models. However, these methods offer little to no improvement in wall-clock training time…
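For context, the core idea behind LoRA that the abstract refers to can be sketched as follows. This is a minimal illustrative example, not the paper's code: a frozen weight matrix is augmented with a trainable low-rank correction, which is why the method is memory- and storage-efficient even though the forward pass (and thus wall-clock time) is largely unchanged.

```python
import numpy as np

# Hypothetical minimal LoRA sketch: instead of updating a full weight
# matrix W (d_out x d_in), train a low-rank correction B @ A with rank
# r << min(d_out, d_in), so only r * (d_in + d_out) parameters are
# trained and stored per adapted layer.

rng = np.random.default_rng(0)
d_in, d_out, r = 64, 32, 4

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, init 0

def lora_forward(x):
    # y = (W + B @ A) x; since B starts at zero, the adapted model
    # initially reproduces the pretrained model's output exactly.
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)

# Trainable parameter count vs. full fine-tuning of W:
trainable = A.size + B.size   # r * (d_in + d_out) = 384
full = W.size                 # d_out * d_in = 2048
print(trainable, full)
```

Note that the low-rank factors shrink storage and optimizer state, but the matrix products in the forward/backward pass still run over the full model, which is consistent with the abstract's point about wall-clock training time.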
External link:
http://arxiv.org/abs/2412.02352