Showing 1 - 1 of 1 for search: '"Khani, Nikhil"'
Author:
Khani, Nikhil; Yang, Shuo; Nath, Aniruddh; Liu, Yang; Abbo, Pendo; Wei, Li; Andrews, Shawn; Kula, Maciej; Kahn, Jarrod; Zhao, Zhe; Hong, Lichan; Chi, Ed
Knowledge Distillation (KD) is a powerful approach for compressing a large model into a smaller, more efficient model, particularly beneficial for latency-sensitive applications like recommender systems. However, current KD research predominantly focuses […]
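
As a point of reference for the abstract's description of KD, the following is a minimal, generic sketch of a distillation loss (not the method of the listed paper): the student is trained on a blend of soft, temperature-scaled teacher outputs and hard ground-truth labels. The function name distillation_loss and the hyperparameters T and alpha are illustrative assumptions.

    # Generic knowledge-distillation loss sketch (illustrative, not from the paper).
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soft targets: KL divergence between temperature-scaled distributions,
        # rescaled by T^2 so gradient magnitudes stay comparable across temperatures.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        # Hard targets: ordinary cross-entropy with the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        # Blend the two objectives; alpha controls how much the student imitates the teacher.
        return alpha * soft + (1 - alpha) * hard
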
External link:
http://arxiv.org/abs/2408.14678