Showing 1 - 10 of 521 for search: '"Bursuc"'
Domain adaptation has been extensively investigated in computer vision but still requires access to target data at training time, which might be difficult to obtain in some uncommon conditions. In this paper, we present a new framework for domain…
External link:
http://arxiv.org/abs/2410.21361
We consider the problem of adapting a contrastively pretrained vision-language model like CLIP (Radford et al., 2021) for few-shot classification. The existing literature addresses this problem by learning a linear classifier of the frozen visual fea… (a sketch of this linear-probe baseline follows below)
External link:
http://arxiv.org/abs/2410.05270
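The linear-probe baseline mentioned in the snippet above is easy to sketch: a frozen CLIP-style encoder supplies visual features and only a linear classifier is trained on the few labelled shots. The feature extraction is stubbed with random tensors here, and the dimensions, optimizer and training loop are illustrative assumptions rather than the paper's actual setup.

```python
# Minimal linear-probe sketch for few-shot adaptation of a frozen CLIP-style
# visual encoder. Features are random stand-ins; in practice they would come
# from the frozen image encoder (an encode_image-style call, assumed here).
import torch
import torch.nn as nn

num_classes, dim, shots = 10, 512, 16            # hypothetical few-shot setup
feats = torch.randn(num_classes * shots, dim)    # frozen visual features (stub)
labels = torch.arange(num_classes).repeat_interleave(shots)

probe = nn.Linear(dim, num_classes)              # the learned linear classifier
opt = torch.optim.AdamW(probe.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for _ in range(100):                             # a short training loop suffices
    opt.zero_grad()
    loss = loss_fn(probe(feats), labels)
    loss.backward()
    opt.step()

acc = (probe(feats).argmax(dim=1) == labels).float().mean()
print("train accuracy:", acc.item())
```

In a real pipeline the features would be extracted once under torch.no_grad() and cached, since the encoder stays frozen and only the probe's weights are updated.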
Author:
Vu, Tuan-Hung, Valle, Eduardo, Bursuc, Andrei, Kerssies, Tommie, de Geus, Daan, Dubbelman, Gijs, Qian, Long, Zhu, Bingke, Chen, Yingying, Tang, Ming, Wang, Jinqiao, Vojíř, Tomáš, Šochman, Jan, Matas, Jiří, Smith, Michael, Ferrie, Frank, Basu, Shamik, Sakaridis, Christos, Van Gool, Luc
We propose the unified BRAVO challenge to benchmark the reliability of semantic segmentation models under realistic perturbations and unknown out-of-distribution (OOD) scenarios. We define two categories of reliability: (1) semantic reliability, whic…
External link:
http://arxiv.org/abs/2409.15107
Author:
de Moreau, Simon, Almehio, Yasser, Bursuc, Andrei, El-Idrissi, Hafid, Stanciulescu, Bogdan, Moutarde, Fabien
Nighttime camera-based depth estimation is a highly challenging task, especially for autonomous driving applications, where accurate depth perception is essential for ensuring safe navigation. We aim to improve the reliability of perception systems a…
External link:
http://arxiv.org/abs/2409.08031
This paper introduces FUNGI, Features from UNsupervised GradIents, a method to enhance the features of transformer encoders by leveraging self-supervised gradients. Our method is simple: given any pretrained model, we first compute gradients from var… (a sketch of the general idea follows below)
External link:
http://arxiv.org/abs/2407.10964
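A hedged sketch of the general idea named in the FUNGI snippet: keep a pretrained encoder frozen, compute the gradient of a self-supervised objective for one input, and use a projection of that gradient as an additional feature. The tiny linear encoder, the cosine-based objective and the random projection below are assumptions for illustration, not the paper's recipe.

```python
# Toy "gradients as features" sketch: embed an input, backpropagate a
# self-supervised loss to the encoder weights, and compress the resulting
# gradient together with the embedding into a fixed-size feature vector.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
encoder = nn.Linear(128, 64)                     # stand-in pretrained encoder
proj = torch.randn(64 + 64 * 128, 256) / 16.0    # random projection (assumption)

def gradient_feature(x: torch.Tensor) -> torch.Tensor:
    x_aug = x + 0.1 * torch.randn_like(x)        # a simple augmented view
    z, z_aug = encoder(x), encoder(x_aug)
    loss = -F.cosine_similarity(z, z_aug, dim=-1).mean()  # pull the views together
    (grad_w,) = torch.autograd.grad(loss, encoder.weight)
    feat = torch.cat([z.detach().flatten(), grad_w.flatten()])
    return feat @ proj                           # fixed-size gradient feature

print(gradient_feature(torch.randn(1, 128)).shape)  # torch.Size([256])
```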
Author:
Wysoczańska, Monika, Vobecky, Antonin, Cardiel, Amaia, Trzciński, Tomasz, Marlet, Renaud, Bursuc, Andrei, Siméoni, Oriane
Recent VLMs, pre-trained on large amounts of image-text pairs to align both modalities, have opened the way to open-vocabulary semantic segmentation. Given an arbitrary set of textual queries, image regions are assigned the closest query in feature s… (a sketch of this assignment step follows below)
External link:
http://arxiv.org/abs/2407.05061
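The assignment step described in the snippet above reduces to a nearest-query lookup in a shared embedding space: normalize region and text features and give each region the label of its most similar query. The random tensors below stand in for CLIP-style image and text encoders, which are assumed and not shown.

```python
# Nearest-text-query assignment for open-vocabulary segmentation: each of the
# H*W image regions is labelled with the closest query in feature space.
import torch
import torch.nn.functional as F

H, W, D = 16, 16, 512
queries = ["road", "car", "sky", "person"]            # arbitrary open-vocabulary set

region_feats = F.normalize(torch.randn(H * W, D), dim=-1)       # image-side features (stub)
text_feats = F.normalize(torch.randn(len(queries), D), dim=-1)  # text-side features (stub)

similarity = region_feats @ text_feats.T              # cosine similarity, shape (H*W, Q)
seg_map = similarity.argmax(dim=-1).reshape(H, W)     # closest query index per region

print(seg_map.shape, [queries[int(i)] for i in seg_map[0, :4]])
```

With real encoders, region_feats would typically be patch or mask features from the image branch and text_feats the embeddings of prompted class names.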
Author:
Xu, Yihong, Zablocki, Éloi, Boulch, Alexandre, Puy, Gilles, Chen, Mickael, Bartoccioni, Florent, Samet, Nermin, Siméoni, Oriane, Gidaris, Spyros, Vu, Tuan-Hung, Bursuc, Andrei, Valle, Eduardo, Marlet, Renaud, Cord, Matthieu
Motion forecasting is crucial in autonomous driving systems to anticipate the future trajectories of surrounding agents such as pedestrians, vehicles, and traffic signals. In end-to-end forecasting, the model must jointly detect and track from sensor…
External link:
http://arxiv.org/abs/2406.08113
Author:
Iordache Cristina, Ana Maria Fătu, Surlari Zenovia, Toma Vasilica, Bursuc Ana Maria, Ancuța Codrina
Published in:
Romanian Journal of Medical and Dental Education, Vol 7, Iss 2, Pp 34-41 (2019)
The main aims of this paper were (i) to identify the pathology that occurs in the dentist’s hand in relation to different professional risk factors, particularly the trapezio-metacarpal osteoarthritis – a painful and highly disabling entity relat…
External link:
https://doaj.org/article/ceea06b714514105ae1fd73037ac27a9
Author:
Sirko-Galouchenko, Sophia, Boulch, Alexandre, Gidaris, Spyros, Bursuc, Andrei, Vobecky, Antonin, Pérez, Patrick, Marlet, Renaud
We introduce a self-supervised pretraining method, called OccFeat, for camera-only Bird's-Eye-View (BEV) segmentation networks. With OccFeat, we pretrain a BEV network via occupancy prediction and feature distillation tasks. Occupancy prediction prov… (a sketch of these two losses follows below)
External link:
http://arxiv.org/abs/2404.14027
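A hedged sketch of the two pretraining signals named in the OccFeat snippet: an occupancy-prediction loss and a feature-distillation loss applied on top of a BEV feature map. The toy BEV tensor, the 1x1-conv heads, the pseudo targets and the equal loss weighting are illustrative assumptions, not OccFeat's actual architecture.

```python
# Joint BEV pretraining objective: binary occupancy prediction plus cosine
# distillation of the BEV features towards frozen teacher features.
import torch
import torch.nn as nn
import torch.nn.functional as F

B, C, H, W, D_T = 2, 64, 32, 32, 128               # batch, BEV channels, grid, teacher dim

bev_feats = torch.randn(B, C, H, W, requires_grad=True)   # stand-in BEV network output
occ_head = nn.Conv2d(C, 1, kernel_size=1)           # per-cell occupancy logits
distill_head = nn.Conv2d(C, D_T, kernel_size=1)     # maps BEV features to teacher space

occ_target = (torch.rand(B, 1, H, W) > 0.7).float()            # pseudo occupancy labels
teacher_feats = F.normalize(torch.randn(B, D_T, H, W), dim=1)  # frozen teacher features (stub)

loss_occ = F.binary_cross_entropy_with_logits(occ_head(bev_feats), occ_target)
student = F.normalize(distill_head(bev_feats), dim=1)
loss_distill = (1.0 - (student * teacher_feats).sum(dim=1)).mean()  # cosine distance

loss = loss_occ + loss_distill                      # joint pretraining loss
loss.backward()
print(float(loss))
```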
A Proof of Secure Erasure (PoSE) is a communication protocol where a verifier seeks evidence that a prover has erased its memory within the time frame of the protocol execution. Designers of PoSE protocols have long been aware that, if a prover can o… (a toy illustration of a classic PoSE follows below)
External link:
http://arxiv.org/abs/2401.06626
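For readers unfamiliar with the primitive, below is a toy version of one classic PoSE construction; whether it matches the variant analysed in the paper is an assumption. The verifier sends fresh randomness as large as the prover's memory and expects it echoed back, so a prover that secretly keeps old data in that memory cannot answer correctly.

```python
# Toy fill-and-echo PoSE: storing the verifier's challenge overwrites the
# prover's memory, and returning it suggests (under the model's assumptions,
# e.g. no extra storage or outside help) that the old contents are gone.
import os

PROVER_MEMORY_BYTES = 1024                 # toy memory size (assumption)

class Prover:
    def __init__(self):
        self.memory = bytearray(b"\x00" * PROVER_MEMORY_BYTES)  # data to be erased

    def receive_and_respond(self, randomness: bytes) -> bytes:
        self.memory = bytearray(randomness)  # storing the challenge erases the memory
        return bytes(self.memory)            # echo the full contents back

def verify() -> bool:
    challenge = os.urandom(PROVER_MEMORY_BYTES)   # as large as the prover's memory
    return Prover().receive_and_respond(challenge) == challenge

print("erasure accepted:", verify())
```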