Showing 1 - 10 of 4,568 results for search: '"Twardowski, A."'
Exemplar-Free Class Incremental Learning (EFCIL) tackles the problem of training a model on a sequence of tasks without access to past data. Existing state-of-the-art methods represent classes as Gaussian distributions in the feature extractor's late…
External link:
http://arxiv.org/abs/2409.18265
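The abstract above mentions representing classes as Gaussian distributions in feature space. A minimal sketch of that general idea (not the paper's actual method): summarize each class by a mean and a shared covariance, then classify new samples by Mahalanobis distance to the nearest class Gaussian. All data and names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fake "features" for two old classes; in EFCIL these would come from
# a (frozen or drift-compensated) feature extractor, not raw data.
feats_a = rng.normal(loc=0.0, scale=1.0, size=(50, 8))
feats_b = rng.normal(loc=3.0, scale=1.0, size=(50, 8))

# Per-class means and a shared covariance estimated from centered features.
means = {0: feats_a.mean(axis=0), 1: feats_b.mean(axis=0)}
centered = np.vstack([feats_a - means[0], feats_b - means[1]])
cov_inv = np.linalg.inv(np.cov(centered.T) + 1e-6 * np.eye(8))  # regularized

def classify(x):
    # Nearest class Gaussian by (squared) Mahalanobis distance.
    dists = {c: float((x - m) @ cov_inv @ (x - m)) for c, m in means.items()}
    return min(dists, key=dists.get)

print(classify(np.zeros(8)), classify(np.full(8, 3.0)))
```

The appeal for exemplar-free settings is that only the per-class statistics (means, covariance) need to be stored, not any past samples.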
Test-Time Adaptation (TTA) has recently emerged as a promising strategy for tackling the problem of machine learning model robustness under distribution shifts by adapting the model during inference without access to any labels. Because of task diffi…
External link:
http://arxiv.org/abs/2407.14231
Author:
Gomez-Villa, Alex, Goswami, Dipam, Wang, Kai, Bagdanov, Andrew D., Twardowski, Bartlomiej, van de Weijer, Joost
Exemplar-free class-incremental learning using a backbone trained from scratch and starting from a small first task presents a significant challenge for continual representation learning. Prototype-based approaches, when continually updated, face the…
External link:
http://arxiv.org/abs/2407.08536
This paper introduces a continual learning approach named MagMax, which utilizes model merging to enable large pre-trained models to continuously learn from new data without forgetting previously acquired knowledge. Distinct from traditional continua…
External link:
http://arxiv.org/abs/2407.06322
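The MagMax abstract above mentions model merging. One common merging scheme in this line of work, shown here as a toy sketch under my own assumptions rather than as the paper's implementation, combines per-task parameter deltas by keeping, for each coordinate, the delta with the largest magnitude:

```python
import numpy as np

# Shared pre-trained parameters and two task-specific deltas
# (fine-tuned weights minus pre-trained weights). Values are made up.
pretrained = np.array([0.5, -1.0, 2.0, 0.0])
delta_task1 = np.array([0.3, -0.1, 0.0, 0.4])
delta_task2 = np.array([-0.2, 0.5, 0.1, -0.6])

deltas = np.stack([delta_task1, delta_task2])

# For each parameter, pick the task whose delta has the largest |value|.
winner = np.abs(deltas).argmax(axis=0)
merged_delta = deltas[winner, np.arange(deltas.shape[1])]

merged = pretrained + merged_delta
print(merged)
```

The intuition is that large-magnitude updates are the ones that encode task-specific knowledge, so keeping the strongest update per parameter preserves each task's most important changes in a single merged model.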
Author:
Goswami, Dipam, Soutif-Cormerais, Albin, Liu, Yuyang, Kamath, Sandesh, Twardowski, Bartłomiej, van de Weijer, Joost
Continual learning methods are known to suffer from catastrophic forgetting, a phenomenon that is particularly hard to counter for methods that do not store exemplars of previous tasks. Therefore, to reduce potential drift in the feature extractor, e…
External link:
http://arxiv.org/abs/2405.19074
Few-shot class-incremental learning (FSCIL) aims to adapt the model to new classes from very few data (5 samples) without forgetting the previously learned classes. Recent works in many-shot CIL (MSCIL) (using all available training data) exploited p…
External link:
http://arxiv.org/abs/2404.06622
Author:
Szatkowski, Filip, Yang, Fei, Twardowski, Bartłomiej, Trzciński, Tomasz, van de Weijer, Joost
Continual learning is crucial for applications in dynamic environments, where machine learning models must adapt to changing data distributions while retaining knowledge of previous tasks. Despite significant advancements, catastrophic forgetting - w…
External link:
http://arxiv.org/abs/2403.07404
We introduce GUIDE, a novel continual learning approach that directs diffusion models to rehearse samples at risk of being forgotten. Existing generative strategies combat catastrophic forgetting by randomly sampling rehearsal examples from a generat…
External link:
http://arxiv.org/abs/2403.03938
Author:
Rypeść, Grzegorz, Cygert, Sebastian, Khan, Valeriya, Trzciński, Tomasz, Zieliński, Bartosz, Twardowski, Bartłomiej
Class-incremental learning is becoming more popular as it helps models widen their applicability while not forgetting what they already know. A trend in this area is to use a mixture-of-expert technique, where different models work together to solve…
External link:
http://arxiv.org/abs/2401.10191
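The abstract above mentions a mixture-of-experts technique. As a hedged, generic illustration of that pattern (not the paper's architecture): several expert functions produce predictions, and a gate assigns each input a weight over the experts. All functions and values below are hypothetical.

```python
import numpy as np

def expert_a(x):
    # Toy expert that (by construction) suits small inputs.
    return 2.0 * x

def expert_b(x):
    # Toy expert that suits large inputs.
    return x + 10.0

def gate(x):
    # Softmax over simple hand-crafted scores: favors expert_a
    # for x < 5 and expert_b for x > 5.
    scores = np.array([5.0 - x, x - 5.0])
    e = np.exp(scores - scores.max())
    return e / e.sum()

def moe(x):
    # Mixture output: gate-weighted combination of expert outputs.
    w = gate(x)
    return w[0] * expert_a(x) + w[1] * expert_b(x)

print(moe(0.0), moe(10.0))
```

In class-incremental settings, the appeal of such a design is that different experts can specialize in different tasks or class groups, with the gate routing each input to the expert most likely to handle it well.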
In the field of continual learning, models are designed to learn tasks one after the other. While most research has centered on supervised continual learning, there is a growing interest in unsupervised continual learning, which makes use of the vast…
External link:
http://arxiv.org/abs/2311.13321