Showing 1 - 2 of 2
for search: '"Ding, Muhe"'
Multimodal Large Language Models (MLLMs) have recently received substantial interest, showing their emerging potential as general-purpose models for various vision-language tasks. MLLMs involve significant external knowledge within their parameters…
External link:
http://arxiv.org/abs/2410.14154
Knowledge distillation is a mainstream model-compression technique that transfers knowledge from a larger model (teacher) to a smaller model (student) to improve the student's performance. Despite many efforts, existing methods mainly invest…
External link:
http://arxiv.org/abs/2410.14143
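The teacher-to-student transfer mentioned in the second abstract is commonly implemented as a KL-divergence loss between temperature-softened output distributions (Hinton et al.'s classic formulation). A minimal sketch, not the specific method of the paper above; the function names and the temperature value are illustrative:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature > 1 flattens the distribution, exposing the
    # teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2

# When the student matches the teacher exactly, the loss is zero.
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(distillation_loss([1.0, 2.0, 3.0], [3.0, 2.0, 1.0]))
```

In practice this soft-label loss is mixed with the ordinary cross-entropy on ground-truth labels, weighted by a hyperparameter.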