One-Shot Domain Incremental Learning

Authors: Esaki, Yasushi; Koide, Satoshi; Kutsuna, Takuro
Year: 2024
Subject:
Document type: Working Paper
DOI: 10.1109/IJCNN60899.2024.10650928
Description: Domain incremental learning (DIL) has been studied in previous work on deep neural network models for classification. In DIL, samples from new domains are assumed to be observed over time, and the models must classify inputs from all domains. In practice, however, we may encounter situations where DIL must be performed under the constraint that samples from the new domain are observed only infrequently. In this study, we therefore consider the extreme case in which only one sample from the new domain is available, which we call one-shot DIL. We first show empirically that existing DIL methods do not work well in one-shot DIL. We then analyze the reason for this failure and find that the difficulty of one-shot DIL is caused by the statistics in the batch normalization layers. Based on this analysis, we propose a technique for handling these statistics and demonstrate its effectiveness through experiments on open datasets.
Comment: accepted at IEEE International Joint Conference on Neural Networks (IJCNN) 2024
Database: arXiv
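
The description attributes the difficulty of one-shot DIL to the statistics stored in the batch normalization layers. Below is a minimal, hypothetical PyTorch sketch of one way such statistics could be adjusted using a single new-domain sample: it blends each BN layer's stored running mean and variance with statistics computed from the one-shot input. The helper name `adapt_bn_statistics`, the mixing coefficient `alpha`, and the blending rule itself are illustrative assumptions, not the paper's actual technique.

```python
# Hypothetical sketch (not the authors' method): blend BN running statistics
# toward a single new-domain sample while leaving all learned weights untouched.
import copy

import torch
import torch.nn as nn


@torch.no_grad()
def adapt_bn_statistics(model: nn.Module, one_shot_input: torch.Tensor,
                        alpha: float = 0.1) -> nn.Module:
    """Return a copy of `model` whose BN running statistics are interpolated
    between the stored (old-domain) statistics and statistics computed from
    the single new-domain sample."""
    adapted = copy.deepcopy(model)

    # Record the input activation of every BatchNorm2d layer via forward hooks.
    bn_inputs = {}
    hooks = []
    for name, module in adapted.named_modules():
        if isinstance(module, nn.BatchNorm2d):
            hooks.append(module.register_forward_hook(
                lambda m, inp, out, key=name: bn_inputs.__setitem__(key, inp[0])))

    adapted.eval()               # probe pass uses the stored (old-domain) stats
    adapted(one_shot_input)      # one forward pass with the single sample
    for h in hooks:
        h.remove()

    # Interpolate old running statistics with per-channel statistics of the
    # new-domain activations (assumed mixing rule, controlled by `alpha`).
    for name, module in adapted.named_modules():
        if isinstance(module, nn.BatchNorm2d) and name in bn_inputs:
            x = bn_inputs[name]                          # shape (N, C, H, W)
            new_mean = x.mean(dim=(0, 2, 3))
            new_var = x.var(dim=(0, 2, 3), unbiased=False)
            module.running_mean.mul_(1 - alpha).add_(alpha * new_mean)
            module.running_var.mul_(1 - alpha).add_(alpha * new_var)
    return adapted


if __name__ == "__main__":
    # Usage example: adapt a small CNN with one 32x32 RGB sample from the new domain.
    model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.BatchNorm2d(8), nn.ReLU())
    one_shot = torch.randn(1, 3, 32, 32)
    adapted = adapt_bn_statistics(model, one_shot, alpha=0.1)
```

Confining the adaptation to BN buffers leaves all learned weights unchanged, which is one common way to adjust a classifier to a shifted domain without retraining; whether and how the paper's technique does this is determined by the paper itself.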