Compressing Sentence Representation with Maximum Coding Rate Reduction
Author: | Ševerdija, Domagoj; Prusina, Tomislav; Jovanović, Antonio; Borozan, Luka; Maltar, Jurica; Matijević, Domagoj |
---|---|
Publication year: | 2023 |
Document type: | Working Paper |
Description: | In most natural language inference problems, sentence representations are needed for semantic retrieval tasks. In recent years, pre-trained large language models have been quite effective at computing such representations; these models produce high-dimensional sentence embeddings. An evident performance gap between large and small models exists in practice. Hence, due to hardware limitations on space and time, there is a need to attain comparable results when using a smaller model, which is usually a distilled version of the large language model. In this paper, we assess the model distillation of the sentence representation model Sentence-BERT by augmenting the pre-trained distilled model with a projection layer additionally learned on the Maximum Coding Rate Reduction (MCR2) objective, a novel approach developed for general-purpose manifold clustering. We demonstrate that the new language model with reduced complexity and sentence embedding size can achieve comparable results on semantic retrieval benchmarks. (An illustrative sketch of the MCR2 objective follows the record below.) Comment: 14 pages, 3 figures, accepted at the ICT and Electronics Convention (MIPRO), Croatia |
Database: | arXiv |
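The abstract describes learning a projection layer on top of distilled Sentence-BERT embeddings using the MCR2 objective. As a rough illustration only, the PyTorch sketch below implements the generic MCR2 loss (in the form popularized by Yu et al., 2020: maximize the coding rate of all features while compressing the rate within each cluster) and applies it to a linear projection. The dimensions (768 to 128), batch size, epsilon value, and the random stand-in embeddings and labels are assumptions for the demo, not values taken from the paper.

```python
import torch


def coding_rate(Z: torch.Tensor, eps: float = 0.5) -> torch.Tensor:
    """R(Z, eps) = 1/2 * logdet(I + d/(n*eps^2) * Z Z^T) for Z of shape (d, n):
    the rate needed to encode the n columns of Z up to distortion eps."""
    d, n = Z.shape
    identity = torch.eye(d, device=Z.device, dtype=Z.dtype)
    return 0.5 * torch.logdet(identity + (d / (n * eps ** 2)) * (Z @ Z.T))


def mcr2_loss(Z: torch.Tensor, labels: torch.Tensor, eps: float = 0.5) -> torch.Tensor:
    """Negative coding rate reduction, -dR = R^c(Z | labels) - R(Z).
    Minimizing it expands the span of all features while compressing each cluster."""
    _, n = Z.shape
    loss = -coding_rate(Z, eps)
    for c in labels.unique():
        Zc = Z[:, labels == c]  # columns belonging to cluster c
        nc = Zc.shape[1]
        loss = loss + (nc / n) * coding_rate(Zc, eps)
    return loss


# Hypothetical setup: project 768-d sentence embeddings down to 128-d.
proj = torch.nn.Linear(768, 128, bias=False)
opt = torch.optim.Adam(proj.parameters(), lr=1e-3)

embeddings = torch.randn(64, 768)    # stand-in for a batch of Sentence-BERT embeddings
labels = torch.randint(0, 8, (64,))  # stand-in cluster assignments

# MCR2 assumes (approximately) unit-norm features; eps is a free hyperparameter.
z = torch.nn.functional.normalize(proj(embeddings), dim=1)
loss = mcr2_loss(z.T, labels)        # coding_rate expects shape (dim, batch)
opt.zero_grad()
loss.backward()
opt.step()
```

The loop over `labels.unique()` computes the per-cluster rate term; in an unsupervised setting the cluster assignments would themselves be learned, which this toy example does not attempt.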