Showing 1 - 10 of 46 for the search: '"van de Ven, Gido M."'
Author:
Hemati, Hamed, Pellegrini, Lorenzo, Duan, Xiaotian, Zhao, Zixuan, Xia, Fangfang, Masana, Marc, Tscheschner, Benedikt, Veas, Eduardo, Zheng, Yuxiang, Zhao, Shiji, Li, Shao-Yuan, Huang, Sheng-Jun, Lomonaco, Vincenzo, van de Ven, Gido M.
Continual learning (CL) provides a framework for training models in ever-evolving environments. Although re-occurrence of previously seen objects or tasks is common in real-world problems, the concept of repetition in the data stream is not often considered …
External link:
http://arxiv.org/abs/2405.04101
This book chapter delves into the dynamics of continual learning, which is the process of incrementally learning from a non-stationary stream of data. Although continual learning is a natural skill for the human brain, it is very challenging for artificial neural networks …
External link:
http://arxiv.org/abs/2403.05175
Author:
Dziadzio, Sebastian, Yıldız, Çağatay, van de Ven, Gido M., Trzciński, Tomasz, Tuytelaars, Tinne, Bethge, Matthias
The ability of machine learning systems to learn continually is hindered by catastrophic forgetting, the tendency of neural networks to overwrite previously acquired knowledge when learning a new task. Existing methods mitigate this problem through r…
External link:
http://arxiv.org/abs/2312.16731
Diffusion models are powerful generative models that achieve state-of-the-art performance in image synthesis. However, training them demands substantial amounts of data and computational resources. Continual learning would allow for incrementally learning …
External link:
http://arxiv.org/abs/2311.14028
Author:
Verwimp, Eli, Aljundi, Rahaf, Ben-David, Shai, Bethge, Matthias, Cossu, Andrea, Gepperth, Alexander, Hayes, Tyler L., Hüllermeier, Eyke, Kanan, Christopher, Kudithipudi, Dhireesha, Lampert, Christoph H., Mundt, Martin, Pascanu, Razvan, Popescu, Adrian, Tolias, Andreas S., van de Weijer, Joost, Liu, Bing, Lomonaco, Vincenzo, Tuytelaars, Tinne, van de Ven, Gido M.
Published in:
Transactions on Machine Learning Research (TMLR), 2024
Continual learning is a subfield of machine learning that aims to allow machine learning models to continuously learn on new data, accumulating knowledge without forgetting what was learned in the past. In this work, we take a step back, and ask …
External link:
http://arxiv.org/abs/2311.11908
Recent years have seen considerable progress in the continual training of deep neural networks, predominantly thanks to approaches that add replay or regularization terms to the loss function to approximate the joint loss over all tasks so far. However …
External link:
http://arxiv.org/abs/2311.04898
Class-incremental learning (CIL) is a particularly challenging variant of continual learning, where the goal is to learn to discriminate between all classes presented in an incremental fashion. Existing approaches often suffer from excessive forgetting …
External link:
http://arxiv.org/abs/2305.18806
Published in:
Transactions on Machine Learning Research (TMLR), 2024
Continual learning research has shown that neural networks suffer from catastrophic forgetting "at the output level", but it is debated whether this is also the case at the level of learned representations. Multiple recent studies ascribe representat…
External link:
http://arxiv.org/abs/2304.00933
Author:
Kao, Ta-Chu, Jensen, Kristopher T., van de Ven, Gido M., Bernacchia, Alberto, Hennequin, Guillaume
Biological agents are known to learn many different tasks over the course of their lives, and to be able to revisit previous tasks and behaviors with little to no loss in performance. In contrast, artificial agents are prone to 'catastrophic forgetting' …
External link:
http://arxiv.org/abs/2106.08085
Published in:
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2021, pp. 3611-3620
Incrementally training deep neural networks to recognize new classes is a challenging problem. Most existing class-incremental learning methods store data or use generative replay, both of which have drawbacks, while 'rehearsal-free' alternatives such …
External link:
http://arxiv.org/abs/2104.10093