AOCIL: Exemplar-free Analytic Online Class Incremental Learning with Low Time and Resource Consumption

Authors: Zhuang, Huiping; Liu, Yuchen; He, Run; Tong, Kai; Zeng, Ziqian; Chen, Cen; Wang, Yi; Chau, Lap-Pui
Publication year: 2024
Document type: Working Paper
Description: Online Class Incremental Learning (OCIL) aims to train a model task by task, where data arrive one mini-batch at a time and previous data are no longer accessible. A significant challenge is Catastrophic Forgetting, i.e., the loss of previously acquired knowledge on old data. To address this, replay-based methods show competitive results but compromise data privacy, while exemplar-free methods protect data privacy but struggle to maintain accuracy. In this paper, we propose an exemplar-free approach -- Analytic Online Class Incremental Learning (AOCIL). Instead of back-propagation, we design an Analytic Classifier (AC), updated by recursive least squares, that cooperates with a frozen backbone. AOCIL simultaneously achieves high accuracy, low resource consumption, and data privacy protection. We conduct extensive experiments on four existing benchmark datasets, and the results demonstrate AOCIL's strong capability in handling OCIL scenarios. Code will be made available.
Database: arXiv
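The abstract describes an analytic classifier on frozen-backbone features updated by recursive least squares rather than back-propagation. The paper's exact update is not reproduced here; below is a minimal sketch of the general technique it builds on: a ridge-regression classifier whose weights are updated recursively per mini-batch via the Woodbury identity, so no past data need be stored. All dimensions, the toy data, and the function name `rls_update` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rls_update(W, R, X, Y):
    """One recursive least-squares (RLS) step on a mini-batch (illustrative sketch).

    W: (d, c) classifier weights, R: (d, d) inverse feature autocorrelation,
    X: (n, d) frozen-backbone features, Y: (n, c) one-hot labels.
    """
    # Woodbury-style gain: only an (n, n) inverse per mini-batch, no stored exemplars.
    K = R @ X.T @ np.linalg.inv(np.eye(len(X)) + X @ R @ X.T)
    W = W + K @ (Y - X @ W)   # correct weights toward the new batch's labels
    R = R - K @ X @ R         # shrink the inverse autocorrelation accordingly
    return W, R

# Stream two mini-batches of a toy two-class problem (made-up data).
gamma = 1.0                   # ridge regularization strength
W = np.zeros((2, 2))
R = np.eye(2) / gamma
batches = [
    (np.array([[3.0, 0.2], [-3.0, -0.1]]), np.eye(2)),
    (np.array([[2.5, -0.3], [-2.8, 0.4]]), np.eye(2)),
]
for X, Y in batches:
    W, R = rls_update(W, R, X, Y)

# The streamed solution matches batch ridge regression fit on all data at once.
X_all = np.vstack([b[0] for b in batches])
Y_all = np.vstack([b[1] for b in batches])
W_batch = np.linalg.solve(X_all.T @ X_all + gamma * np.eye(2), X_all.T @ Y_all)
print(np.allclose(W, W_batch))  # True
```

The equivalence checked at the end is the key property: the recursive updates recover exactly the closed-form ridge solution over all data seen so far, which is what makes such a classifier "analytic" and lets it avoid both replayed exemplars and gradient descent.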