Showing 1 - 10 of 25 for search: '"Apostolos Modas"'
Author:
Apostolos Modas, Pascal Frossard, Nuno Ferreira Duarte, Konstantinos Chatzilygeroudis, Aude Billard, Andrea Cavallaro, Ricardo Sanchez-Matilla, Alessio Xompero
Published in:
IEEE Robotics and Automation Letters. 5:1642-1649
The real-time estimation through vision of the physical properties of objects manipulated by humans is important to inform the control of robots for performing accurate and safe grasps of objects handed over by humans. However, estimating the 3D pose …
Author:
Apostolos Modas, Rahul Rade, Guillermo Ortiz-Jiménez, Seyed-Mohsen Moosavi-Dezfooli, Pascal Frossard
Published in:
Lecture Notes in Computer Science ISBN: 9783031198052
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::1377e5fa34dfba1eb09ee51bccd7f95a
https://doi.org/10.1007/978-3-031-19806-9_36
We address the problem of distribution shifts in test-time data with a principled data augmentation scheme for the task of content-level classification. In such a task, properties such as shape or transparency of test-time containers (cup or drinking glass) …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::bc2f7b6920b39ccd0a87cf69774724e5
We investigate the problem of classifying - from a single image - the level of content in a cup or a drinking glass. This problem is made challenging by several ambiguities caused by transparencies, shape variations and partial occlusions, and by the …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::3f39ecf7b2d321b1c6e134cca0ced8b4
http://arxiv.org/abs/2102.04057
Driven by massive amounts of data and important advances in computational resources, new deep learning systems have achieved outstanding results in a large spectrum of applications. Nevertheless, our current theoretical understanding on the mathematical …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::d7c29e892ff397982cf8195876902903
http://arxiv.org/abs/2010.09624
Autonomous Vehicles rely on accurate and robust sensor observations for safety critical decision-making in a variety of conditions. Fundamental building blocks of such systems are sensors and classifiers that process ultrasound, RADAR, GPS, LiDAR and …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::d056d068b288b1363022535aa3fe0fc9
http://arxiv.org/abs/2007.10115
Published in:
ICASSP
The 3D localisation of an object and the estimation of its properties, such as shape and dimensions, are challenging under varying degrees of transparency and lighting conditions. In this paper, we propose a method for jointly localising container-like …
Published in:
CVPR
Deep Neural Networks have achieved extraordinary results on image classification tasks, but have been shown to be vulnerable to attacks with carefully crafted perturbations of the input data. Although most attacks usually change values of many image's pixels …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::1944b386b35e05ad72742bec2b873f07
http://arxiv.org/abs/1811.02248
Author:
George C. Polyzos, Demosthenes Akoumianakis, Panagiotis Zervas, Chrisoula Alexandraki, V. Alexiou, Panagiotis Tsakalides, V. Lalioti, Despoina Pavlidi, Apostolos Modas, Athanasios Mouchtaris, A. Eleftheriadis, Christina Anagnostopoulou, Yiannis Mastorakis, George Xylomenos
Published in:
IISA
This paper presents the progress in the MusiNet research project, which aims to provide a comprehensive architecture and a prototype implementation of a Networked Music Performance (NMP) system. We describe the MusiNet client and server components, a …
Author:
Bakary Badjie (bbadjie@ciencias.ulisboa.pt), José Cecílio (jmcecilio@ciencias.ulisboa.pt), Antonio Casimiro (casim@ciencias.ulisboa.pt)
Published in:
ACM Computing Surveys, Vol. 57, Issue 1, Jan 2025, pp. 1-52.