Showing 1 - 8 of 8 for search: '"Zhang, Yunbei"'
Continual Test-Time Adaptation (CTTA) seeks to adapt a source pre-trained model to continually changing, unlabeled target domains. Existing TTA methods are typically designed for environments where domain changes occur sequentially and can struggle …
External link:
http://arxiv.org/abs/2406.10737
Vision Transformers (ViTs) have demonstrated remarkable capabilities in learning representations, but their performance is compromised when applied to unseen domains. Previous methods either engage in prompt learning during the training phase or modify …
External link:
http://arxiv.org/abs/2407.09498
Gauging the performance of ML models on data from unseen domains at test time is essential yet challenging due to the lack of labels in this setting. Moreover, the performance of these models on in-distribution data is a poor indicator of …
External link:
http://arxiv.org/abs/2405.01451
Achieving high accuracy on data from domains unseen during training is a fundamental challenge in domain generalization (DG). While state-of-the-art DG classifiers have demonstrated impressive performance across various tasks, they have shown a bias …
External link:
http://arxiv.org/abs/2307.08551
The development of reliable and fair diagnostic systems is often constrained by the scarcity of labeled data. To address this challenge, our work explores the feasibility of unsupervised domain adaptation (UDA) to integrate large external datasets for …
External link:
http://arxiv.org/abs/2307.03157
The growing popularity of transfer learning, due to the availability of models pre-trained on vast amounts of data, makes it imperative to understand when the knowledge of these pre-trained models can be transferred to obtain high-performing models …
External link:
http://arxiv.org/abs/2307.00823
Deep learning-based diagnostic systems have demonstrated potential in classifying skin cancer conditions when labeled training examples are abundant. However, skin lesion analysis often suffers from a scarcity of labeled data, hindering the development …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::c6f90e70520261347bcb4aea992d6810
http://arxiv.org/abs/2307.03157
Transfer learning transfers the knowledge acquired by a model from a source task to multiple downstream target tasks with minimal fine-tuning. The success of transfer learning at improving performance, especially with the use of large pre-trained models …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::12b6b3724b862329a8fbb78303250bbf