Pseudo-Labeling for Class Incremental Learning

Authors: Lechat, Alexis; Herbin, Stéphane; Jurie, Frédéric
Contributors: DTIS, ONERA, Université Paris Saclay [Palaiseau], ONERA-Université Paris-Saclay, Université de Caen Normandie (UNICAEN), Normandie Université (NU), Safran Tech
Language: English
Year of publication: 2021
Subject:
Source: BMVC 2021: The British Machine Vision Conference, Nov 2021, virtual, United Kingdom
Description: International audience; Class Incremental Learning (CIL) consists of training a model iteratively with a limited amount of data from a few classes that will never be seen again, resulting in catastrophic forgetting and a lack of diversity. In this paper, we address these phenomena by assuming that additional unlabeled data are continually available during incremental learning, and we propose a Pseudo-Labeling approach for Class incremental Learning (PLCiL) that makes use of a new adapted loss. We demonstrate that our method achieves better performance than supervised or other semi-supervised methods on standard class incremental benchmarks (CIFAR-100 and ImageNet-100), even when a self-supervised pre-training step on a large set of data is used as initialization. We also illustrate the advantages of our method in a more complex setting with fewer labels.
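The core pseudo-labeling idea the abstract relies on can be sketched generically: the model's own predictions on unlabeled data are converted into training targets when they are confident enough. The sketch below is an illustrative confidence-threshold scheme, not the paper's specific PLCiL loss; the threshold value and function name are assumptions for the example.

```python
import numpy as np

def pseudo_label(probs, threshold=0.95):
    """Assign pseudo-labels to unlabeled samples whose maximum predicted
    class probability exceeds the confidence threshold.

    probs: (n_samples, n_classes) array of predicted class probabilities.
    Returns (indices, labels) of the samples that are retained; the rest
    are discarded as too uncertain to pseudo-label.
    """
    probs = np.asarray(probs)
    confidence = probs.max(axis=1)   # highest class probability per sample
    labels = probs.argmax(axis=1)    # predicted class per sample
    keep = confidence >= threshold   # only keep confident predictions
    return np.nonzero(keep)[0], labels[keep]

# Toy predictions for 3 unlabeled samples over 4 classes.
probs = [
    [0.97, 0.01, 0.01, 0.01],  # confident -> pseudo-labeled as class 0
    [0.40, 0.30, 0.20, 0.10],  # uncertain -> discarded
    [0.02, 0.02, 0.96, 0.00],  # confident -> pseudo-labeled as class 2
]
idx, labels = pseudo_label(probs, threshold=0.95)
print(idx.tolist(), labels.tolist())  # -> [0, 2] [0, 2]
```

In a semi-supervised incremental step, the retained (sample, pseudo-label) pairs would be mixed with the small labeled batch so the loss sees a more diverse class distribution, which is the mechanism the abstract credits for mitigating forgetting.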
Database: OpenAIRE