Overcomplete-to-sparse representation learning for few-shot class-incremental learning.

Authors: Mengying Fu, Binghao Liu, Tianren Ma, Qixiang Ye
Source: Multimedia Systems; Apr 2024, Vol. 30, Issue 2, p1-11, 11p
Abstract: Few-shot class-incremental learning (FSCIL) aims to continually learn new semantics given only a few training samples of new classes. As the training examples are too few to construct a good representation upon, FSCIL must generalize learned semantics from old to new classes while reducing the representation aliasing between them (the ‘forgetting’ of old classes). This motivates us to develop overcomplete-to-sparse representation learning (O2SRL). It solves the ‘new class generalization’ and ‘old class forgetting’ problems systematically by regularizing both feature completeness and sparsity. Specifically, O2SRL consists of a spatial excitation module (SEM) and a channel purification module (CPM). SEM drives the model to learn overcomplete and generic features, which not only represent all classes well but also benefit generalization to new classes. CPM regularizes the sparsity and uniqueness of features, reducing semantic aliasing between classes and alleviating the forgetting of old classes. The two modules complement each other to construct unique and robust representations for both old and new classes. Experiments show that O2SRL improves the state of the art of FSCIL by significant margins on public datasets including CUB200, CIFAR100, and mini-ImageNet. O2SRL’s effectiveness is also validated under the general few-shot learning setting. [ABSTRACT FROM AUTHOR]
Database: Complementary Index
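
The abstract describes SEM and CPM only at a high level. Below is a minimal PyTorch-style sketch of how such modules might be wired, assuming SEM acts as a spatial attention gate (exciting all spatial locations toward an overcomplete representation) and CPM as a channel gate with an L1 sparsity penalty. The class names, layer choices, and the penalty form are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class SpatialExcitation(nn.Module):
    """Hypothetical SEM sketch: a per-location gate that re-weights
    feature maps spatially, keeping the representation overcomplete
    and class-generic."""
    def __init__(self, channels: int):
        super().__init__()
        # 1x1 conv collapses channels into one excitation map per location
        self.excite = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W); gate in (0, 1) broadcasts over channels
        gate = torch.sigmoid(self.excite(x))
        return x * gate


class ChannelPurification(nn.Module):
    """Hypothetical CPM sketch: a channel gate whose L1 penalty pushes
    the representation toward sparsity, so different classes tend to
    occupy distinct channel subsets (less semantic aliasing)."""
    def __init__(self, channels: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),           # squeeze spatial dims
            nn.Conv2d(channels, channels, 1),  # per-channel logits
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor):
        g = self.gate(x)                 # (B, C, 1, 1)
        sparsity_loss = g.abs().mean()   # L1 term: few active channels
        return x * g, sparsity_loss
```

In training, the sparsity term would be added to the usual classification loss with a small weight, e.g. `loss = ce_loss + lam * sparsity_loss`, so that feature completeness (SEM) and feature sparsity (CPM) are regularized jointly, as the abstract describes.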