Showing 1 - 10 of 15
for search: '"Dian Shao"'
Author:
Yuanming Ye, Haochao Wang, Yanqiu Tian, Kunpeng Gao, Minghao Wang, Xuanqi Wang, Zekai Liang, Xiaoli You, Shan Gao, Dian Shao, Bowen Ji
Published in:
Nanotechnology and Precision Engineering, Vol 6, Iss 4, Pp 045001-045001-23 (2023)
Epidermal electrophysiological monitoring has garnered significant attention for its potential in medical diagnosis and healthcare, particularly in continuous signal recording. However, simultaneously satisfying skin compliance, mechanical properties
External link:
https://doaj.org/article/f93e0e41d56e4db1bdd2e43f8af894ee
Published in:
Micromachines, Vol 14, Iss 5, p 976 (2023)
In this paper, we propose a classification algorithm of EEG signal based on canonical correlation analysis (CCA) and integrated with adaptive filtering. It can enhance the detection of steady-state visual evoked potentials (SSVEPs) in a brain–compu
External link:
https://doaj.org/article/23a833eabe8f49f494d72dddc0937291
Author:
Dian Shao, Weiting Xiong
Published in:
Sustainability; Volume 14; Issue 10; Pages: 5776
Numerous studies have suggested a positive correlation between spatial and population densities. However, few have systematically conducted quantitative analysis and deciphered the detailed correlation at the block scale. Here, we construct a population
Published in:
Sustainability; Volume 14; Issue 10; Pages: 6204
The change in urban construction land is the most obvious and complex spatial phenomenon in urban agglomerations, and it has attracted extensive attention from scholars in different fields. The Yangtze River Delta Urban Agglomeration is the most mature urban ag
Published in:
CVPR
On public benchmarks, current action recognition techniques have achieved great success. However, when used in real-world applications, e.g. sport analysis, which requires the capability of parsing an activity into phases and differentiating between
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::350c6c860482b2571125567261ea64e2
http://arxiv.org/abs/2004.06704
In recent years, human-object interaction (HOI) detection has achieved impressive advances. However, conventional two-stage methods are usually slow in inference. On the other hand, existing one-stage methods mainly focus on the union regions of interac
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::8dcbe1eb47bc38ba60909bcd39e7ecf6
Published in:
CVPR
Current methods for action recognition primarily rely on deep convolutional networks to derive feature embeddings of visual and motion features. While these methods have demonstrated remarkable performance on standard benchmarks, we are still in need
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::0b8c681e47ed17a2e4250c55a26ff5ef
Published in:
Computer Vision – ECCV 2018 ISBN: 9783030012397
ECCV (9)
The thriving of video sharing services brings new challenges to video retrieval, e.g. the rapid growth in video duration and content diversity. Meeting such challenges calls for new techniques that can effectively retrieve videos with natural languag
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::1f83593ac9a4abd46d8debc062a7c83c
https://doi.org/10.1007/978-3-030-01240-3_13
Published in:
Computer Vision – ACCV 2016 Workshops ISBN: 9783319544267
ACCV Workshops (2)
Located in China’s ancient capital Luoyang, Longmen Grottoes are one of the finest examples of Buddhist stone carving art. Nowadays, many caves do not have public access due to heritage preservation. In order to let people appreciate these relics,
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::3a511c452beb67cf6bb8eea4e89460e8
https://doi.org/10.1007/978-3-319-54427-4_15