Showing 1 - 6 of 6
for the search: '"Wen, Youpeng"'
Recent advancements utilizing large-scale video data for learning video generation models demonstrate significant potential in understanding complex physical dynamics. It suggests the feasibility of leveraging diverse robot trajectory data to develop …
External link:
http://arxiv.org/abs/2411.09153
Authors:
Long, Yanxin, Wen, Youpeng, Han, Jianhua, Xu, Hang, Ren, Pengzhen, Zhang, Wei, Zhao, Shen, Liang, Xiaodan
Benefiting from large-scale vision-language pre-training on image-text pairs, open-world detection methods have shown superior generalization ability under the zero-shot or few-shot detection settings. However, a pre-defined category space is still required …
External link:
http://arxiv.org/abs/2303.02489
Authors:
Yao, Lewei, Han, Jianhua, Wen, Youpeng, Liang, Xiaodan, Xu, Dan, Zhang, Wei, Li, Zhenguo, Xu, Chunjing, Xu, Hang
Open-world object detection, as a more general and challenging goal, aims to recognize and localize objects described by arbitrary category names. The recent work GLIP formulates this problem as a grounding problem by concatenating all category names …
External link:
http://arxiv.org/abs/2209.09407
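The grounding formulation mentioned in the abstract above can be illustrated with a minimal sketch: category names are concatenated into a single text prompt, and the character span of each name is recorded so that token-level groundings can be mapped back to categories. This is an illustrative sketch only; the function and parameter names below are hypothetical and are not taken from the paper or any released codebase.

# Illustrative sketch (hypothetical names): concatenating category names
# into one grounding prompt and recording each category's character span.
from typing import Dict, List, Tuple


def build_grounding_prompt(categories: List[str],
                           separator: str = ". ") -> Tuple[str, Dict[str, Tuple[int, int]]]:
    """Join category names into a single prompt and record where each
    category name starts and ends inside that prompt."""
    parts: List[str] = []
    spans: Dict[str, Tuple[int, int]] = {}
    cursor = 0
    for name in categories:
        start = cursor
        end = start + len(name)
        spans[name] = (start, end)
        parts.append(name)
        cursor = end + len(separator)
    return separator.join(parts), spans


if __name__ == "__main__":
    prompt, spans = build_grounding_prompt(["person", "bicycle", "traffic light"])
    print(prompt)             # "person. bicycle. traffic light"
    print(spans["bicycle"])   # character span of "bicycle" within the prompt

The span bookkeeping is what lets a grounding-style detector score image regions against the sub-string for each category rather than against a fixed classifier head.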
To bridge the gap between supervised semantic segmentation and real-world applications that acquire one model to recognize arbitrary new concepts, recent zero-shot segmentation attracts a lot of attention by exploring the relationships between unseen …
External link:
http://arxiv.org/abs/2207.08455
Academic article
This result is only visible to logged-in users.
Published in:
Lecture Notes in Computer Science ISBN: 9783030863616
ICANN (1)
Real-world data are typically described using multiple modalities or multiple types of descriptors that are considered as multiple views. The data from different modalities locate in different subspaces, therefore the representations associated with …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::a6d078f387008431a79a96a7e2ff9eb3
https://doi.org/10.1007/978-3-030-86362-3_32
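The multi-view setting described in the abstract above, where each modality lives in its own subspace, is commonly handled by projecting every view into a shared latent space and aligning paired samples there. The sketch below only illustrates that general idea in PyTorch; the class, loss, and dimensions are hypothetical assumptions, not the paper's method.

# Illustrative sketch (hypothetical names): two view-specific projections
# into a shared latent space, with a simple alignment loss on paired samples.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoViewProjector(nn.Module):
    def __init__(self, dim_view_a: int, dim_view_b: int, dim_shared: int = 64):
        super().__init__()
        self.proj_a = nn.Linear(dim_view_a, dim_shared)  # encoder for view A
        self.proj_b = nn.Linear(dim_view_b, dim_shared)  # encoder for view B

    def forward(self, xa: torch.Tensor, xb: torch.Tensor):
        za = F.normalize(self.proj_a(xa), dim=-1)
        zb = F.normalize(self.proj_b(xb), dim=-1)
        return za, zb


def alignment_loss(za: torch.Tensor, zb: torch.Tensor) -> torch.Tensor:
    # Paired samples from the two views should agree in the shared space
    # (1 minus cosine similarity, averaged over the batch).
    return (1.0 - (za * zb).sum(dim=-1)).mean()


if __name__ == "__main__":
    model = TwoViewProjector(dim_view_a=128, dim_view_b=300)
    xa, xb = torch.randn(8, 128), torch.randn(8, 300)
    za, zb = model(xa, xb)
    print(alignment_loss(za, zb).item())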