Showing 1 - 10 of 1,161 results for search: '"Stathaki A"'
Knowledge distillation (KD) has been widely used to transfer knowledge from large, accurate models (teachers) to smaller, efficient ones (students). Recent methods have explored enforcing consistency by incorporating causal interpretations to distill …
External link: http://arxiv.org/abs/2410.09474
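As context for the several KD entries in this list: below is a minimal sketch of the standard soft-target distillation loss (Hinton-style), not the causal-consistency method of the paper above; the temperature and mixing weight `alpha` are illustrative assumptions.

```python
# Minimal sketch of vanilla knowledge distillation: blend hard-label
# cross-entropy with a temperature-softened KL term toward the teacher.
# Hyperparameters and shapes are illustrative assumptions.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels,
            temperature=4.0, alpha=0.5):
    # Soften both distributions with the same temperature.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # T^2 rescaling keeps gradient magnitudes comparable across temperatures.
    distill = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2
    hard = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1 - alpha) * hard

# Toy usage: batch of 8 samples, 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
kd_loss(student_logits, teacher_logits, labels).backward()
```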
Contrastive learning has become a dominant approach in self-supervised visual representation learning. Hard negatives - samples closely resembling the anchor - are key to enhancing learned representations' discriminative power. However, efficiently …
External link: http://arxiv.org/abs/2410.02401
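For orientation, a minimal InfoNCE sketch with explicit hard-negative selection (the negatives most similar to the anchor); this shows the generic idea only, not the paper's mining strategy, and `top_k` and the temperature are illustrative assumptions.

```python
# InfoNCE over the positive plus the top_k hardest negatives,
# where "hard" = highest cosine similarity to the anchor.
import torch
import torch.nn.functional as F

def info_nce_hard_negatives(anchor, positive, negatives,
                            top_k=16, temperature=0.1):
    """anchor: (d,), positive: (d,), negatives: (n, d)."""
    anchor = F.normalize(anchor, dim=0)
    positive = F.normalize(positive, dim=0)
    negatives = F.normalize(negatives, dim=1)
    neg_sims = negatives @ anchor                         # (n,)
    hard_sims = neg_sims.topk(min(top_k, len(neg_sims))).values
    pos_sim = torch.dot(anchor, positive)
    logits = torch.cat([pos_sim.unsqueeze(0), hard_sims]) / temperature
    # The positive sits at index 0 of the logits.
    return F.cross_entropy(logits.unsqueeze(0),
                           torch.zeros(1, dtype=torch.long))

loss = info_nce_hard_negatives(torch.randn(128), torch.randn(128),
                               torch.randn(1024, 128))
```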
Author: Tan, Ashton Yu Xuan, Yang, Yingkai, Zhang, Xiaofei, Li, Bowen, Gao, Xiaorong, Zheng, Sifa, Wang, Jianqiang, Gu, Xinyu, Li, Jun, Zhao, Yang, Zhang, Yuxin, Stathaki, Tania
Enhancing the safety of autonomous vehicles is crucial, especially given recent accidents involving automated systems. As passengers in these vehicles, humans' sensory perception and decision-making can be integrated with autonomous systems to improve …
External link: http://arxiv.org/abs/2408.16315
The design of pedestrian detectors seldom considers the unique characteristics of this task and usually follows the common strategies for general object detection. To explore the potential of these characteristics, we take the perspective effect in …
External link: http://arxiv.org/abs/2408.13646
Author: Yuan, Jing, Stathaki, Tania
Scribble supervision, a common form of weakly supervised learning, involves annotating pixels using hand-drawn curve lines, which helps reduce the cost of manual labelling. This technique has been widely used in medical image segmentation tasks to …
External link: http://arxiv.org/abs/2408.13639
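For orientation, scribble supervision typically reduces to a partial cross-entropy that scores only the annotated pixels; the sketch below shows that generic mechanism, not the paper's method, with the ignore value 255 as an illustrative convention.

```python
# Partial cross-entropy for scribble supervision: unlabeled pixels
# carry an ignore value and contribute nothing to the loss.
import torch
import torch.nn.functional as F

IGNORE = 255  # illustrative convention for unlabeled pixels

def partial_ce(logits, scribble):
    """logits: (B, C, H, W); scribble: (B, H, W), IGNORE where unlabeled."""
    return F.cross_entropy(logits, scribble, ignore_index=IGNORE)

# Toy usage: 2 images, 4 classes, 32x32; roughly 2% of pixels annotated.
logits = torch.randn(2, 4, 32, 32, requires_grad=True)
scribble = torch.full((2, 32, 32), IGNORE, dtype=torch.long)
idx = torch.rand(2, 32, 32) < 0.02
scribble[idx] = torch.randint(0, 4, (int(idx.sum()),))
partial_ce(logits, scribble).backward()
```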
Knowledge distillation (KD) is an effective method for transferring knowledge from a large, well-trained teacher model to a smaller, more efficient student model. Despite its success, one of the main challenges in KD is ensuring the efficient transfer …
External link: http://arxiv.org/abs/2407.12073
Knowledge Distillation (KD) aims to transfer knowledge from a large teacher model to a smaller student model. While contrastive learning has shown promise in self-supervised learning by creating discriminative representations, its application in knowledge distillation …
External link: http://arxiv.org/abs/2407.11802
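For orientation, a minimal sketch of contrastive distillation across models: each student embedding is matched to the teacher embedding of the same sample via InfoNCE over the batch. This is the generic idea, not the paper's method, and the temperature is an illustrative assumption.

```python
# Cross-model InfoNCE: positives are same-sample (student, teacher)
# pairs, negatives are teacher embeddings of other samples in the batch.
import torch
import torch.nn.functional as F

def contrastive_kd(student_feats, teacher_feats, temperature=0.07):
    """student_feats, teacher_feats: (B, d), aligned by sample index."""
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)
    logits = s @ t.T / temperature      # (B, B) cross-model similarities
    targets = torch.arange(len(s))      # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

loss = contrastive_kd(torch.randn(32, 128), torch.randn(32, 128))
```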
The abundance of information present in Whole Slide Images (WSIs) renders them an essential tool for survival analysis. Several Multiple Instance Learning frameworks proposed for this task utilize a ResNet50 backbone pre-trained on natural images. By …
External link: http://arxiv.org/abs/2405.17446
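For orientation, a minimal attention-based MIL pooling sketch (in the spirit of Ilse et al., 2018) that aggregates pre-extracted patch features into one slide-level score; the 1024-dim features (e.g. a ResNet50 penultimate layer) and the single-output head are illustrative assumptions, not the paper's architecture.

```python
# Attention-based MIL: learn per-patch attention weights, pool the
# bag of patch features into one slide embedding, then predict.
import torch
import torch.nn as nn

class AttentionMIL(nn.Module):
    def __init__(self, feat_dim=1024, hidden=256, n_out=1):
        super().__init__()
        self.attn = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 1))
        self.head = nn.Linear(feat_dim, n_out)

    def forward(self, bag):                              # bag: (n_patches, feat_dim)
        weights = torch.softmax(self.attn(bag), dim=0)   # (n_patches, 1)
        slide = (weights * bag).sum(dim=0)               # (feat_dim,)
        return self.head(slide)                          # e.g. a survival risk score

# Toy usage: one slide with 500 pre-extracted patch features.
model = AttentionMIL()
risk = model(torch.randn(500, 1024))
```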
Author: Cheng, Junfeng, Stathaki, Tania
This paper proposes a novel task named "3D part grouping". Suppose there is a mixed set containing scattered parts from various shapes. This task requires algorithms to find out every possible combination among all the parts. To address this challenge …
External link: http://arxiv.org/abs/2405.06828
In the field of computer vision, self-supervised learning has emerged as a method to extract robust features from unlabeled data, where models derive labels autonomously from the data itself, without the need for manual annotation. This paper provides …
External link: http://arxiv.org/abs/2405.04969