Showing 1 - 10 of 533 for search: '"Kanellakis P"'
Author:
Viswanathan, Vignesh Kottayam, Sumathy, Vidya, Kanellakis, Christoforos, Nikolakopoulos, George
In this work, we present an autonomous inspection framework for remote sensing tasks in active open-pit mines. Specifically, the contributions are focused towards developing a methodology where an initial approximate operator-defined inspection plan…
External link:
http://arxiv.org/abs/2410.10256
Recent advances in robotics are pushing real-world autonomy, enabling robots to perform long-term and large-scale missions. A crucial component for successful missions is the incorporation of loop closures through place recognition, which effectively…
External link:
http://arxiv.org/abs/2410.02643
Author:
Saucedo, Mario Alberto Valdes, Stathoulopoulos, Nikolaos, Patel, Akash, Kanellakis, Christoforos, Nikolakopoulos, George
This article studies the commonsense object affordance concept for enabling close-to-human task planning and task optimization of embodied robotic agents in urban environments. The focus of the object affordance is on reasoning how to effectively ide…
External link:
http://arxiv.org/abs/2409.05392
Author:
Saucedo, Mario A. V., Stathoulopoulos, Nikolaos, Sumathy, Vidya, Kanellakis, Christoforos, Nikolakopoulos, George
Object detection and global localization play a crucial role in robotics, spanning across a great spectrum of applications from autonomous cars to multi-layered 3D Scene Graphs for semantic scene understanding. This article proposes BOX3D, a novel mu…
External link:
http://arxiv.org/abs/2408.14941
Author:
Saucedo, Mario A. V., Patel, Akash, Saradagi, Akshit, Kanellakis, Christoforos, Nikolakopoulos, George
In this article, we propose the novel concept of Belief Scene Graphs, which are utility-driven extensions of partial 3D scene graphs, that enable efficient high-level task planning with partial information. We propose a graph-based learning methodolo…
External link:
http://arxiv.org/abs/2402.03840
In this article, we propose a novel navigation framework that leverages a two layered graph representation of the environment for efficient large-scale exploration, while it integrates a novel uncertainty awareness scheme to handle dynamic scene chan…
External link:
http://arxiv.org/abs/2402.02566
Author:
Song, Meiyue, Yu, Zhihua, Wang, Jiaxin, Wang, Jiarui, Lu, Yuting, Li, Baicun, Wang, Xiaoxu, Huang, Qinghua, Li, Zhijun, Kanellakis, Nikolaos I., Liu, Jiangfeng, Wang, Jing, Wang, Binglu, Yang, Juntao
The conventional pretraining-and-finetuning paradigm, while effective for common diseases with ample data, faces challenges in diagnosing data-scarce occupational diseases like pneumoconiosis. Recently, large language models (LLMs) have exhibited unpr…
External link:
http://arxiv.org/abs/2312.03490
Author:
Claire Gallagher, George Moschonis, Katrina Lambert, Spyridon Kanellakis, Eva Karaglani, Niki Mourouti, Costas Anastasiou, Bircan Erbas, Yannis Manios
Published in:
BMC Public Health, Vol 24, Iss 1, Pp 1-10 (2024)
Abstract Background Common mental disorders often emerge during childhood and adolescence, and their prevalence is disproportionately elevated among those affected by obesity. Early life growth patterns may provide a useful target for primordial prev…
External link:
https://doaj.org/article/4b742ec0876f47f78fa65463bc149781
Author:
Saucedo, Mario A. V., Patel, Akash, Sawlekar, Rucha, Saradagi, Akshit, Kanellakis, Christoforos, Agha-Mohammadi, Ali-Akbar, Nikolakopoulos, George
In this article, we propose a novel LiDAR and event camera fusion modality for subterranean (SubT) environments for fast and precise object and human detection in a wide variety of adverse lighting conditions, such as low or no light, high-contrast z…
External link:
http://arxiv.org/abs/2304.08908
In this paper, we study the multi-robot task assignment and path-finding problem (MRTAPF), where a number of agents are required to visit all given goal locations while avoiding collisions with each other. We propose a novel two-layer algorithm SA-re…
External link:
http://arxiv.org/abs/2304.02418