MTHI‐former: Multilevel attention for two‐handed reconstruction from RGB image
Author: | Zixun Jiao, Xihan Wang, Jingcao Li, Quanli Gao |
---|---|
Language: | English |
Year of publication: | 2023 |
Subject: | |
Source: | Electronics Letters, Vol 59, Iss 23, Pp n/a-n/a (2023) |
Document type: | article |
ISSN: | 1350-911X, 0013-5194 |
DOI: | 10.1049/ell2.13040 |
Description: | Abstract Hand reconstruction is the foundation of virtual reality and human–computer interaction, but it still faces challenges such as blurred interaction edges and inter‐hand occlusion. To address these challenges, in this letter the authors propose a framework called Multilevel Two‐Hand Interactive Former (MTHI‐Former), which treats the vertices of a hand model as a graph structure and learns the connectivity relationships between vertices. The framework introduces two novel modules. The first is the Multi‐branch Image Feature Extraction Module, which extracts accurate hand features. The second is the Multilevel Two‐Hand Interaction Module, which fuses interactive hand information to enhance the attention features of interaction edges, and fuses vertex relationships to determine the occlusion relationship between the interacting hands. The authors compare their method with recent methods on the InterHand2.6M dataset; the experimental results show that it outperforms other representative methods, achieving an error of 13.15 mm, an improvement of about 18%. |
Database: | Directory of Open Access Journals |
External link: |
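The interaction module described in the abstract fuses one hand's vertex features into the other's using attention. A minimal sketch of that idea, using scaled dot-product cross-attention over per-vertex features, might look as follows. This is an illustrative sketch only, not the authors' MTHI‐Former implementation: the vertex counts, feature dimension, and residual fusion scheme are all assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_hand_attention(left, right):
    """Fuse right-hand vertex features into the left hand.

    left:  (V_l, d) vertex features of one hand (queries)
    right: (V_r, d) vertex features of the other hand (keys/values)
    Returns (V_l, d) fused features with a residual connection.
    """
    d = left.shape[-1]
    # (V_l, V_r) attention weights between the two hands' vertices.
    attn = softmax(left @ right.T / np.sqrt(d))
    # Residual fusion: each left vertex attends over right vertices.
    return left + attn @ right

rng = np.random.default_rng(0)
L = rng.standard_normal((8, 16))   # 8 left-hand vertices, 16-dim features
R = rng.standard_normal((8, 16))   # 8 right-hand vertices
fused = cross_hand_attention(L, R)
print(fused.shape)                 # (8, 16)
```

In the paper this fusion presumably operates over the full hand-mesh vertex graph and is stacked at multiple levels; the sketch above shows only a single cross-attention step between two small vertex sets.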