An integrated framework of mobile crowd estimation for the 2019, July 1st rally in Hong Kong.

Authors: Chow TE (Department of Geography and Environmental Studies, Texas State University, San Marcos, TX 78666, USA); Yip PSF (Department of Social Work and Social Administration, University of Hong Kong, Hong Kong SAR, China); Wong KP (C&R Wise AI Limited, Cambridge, UK).
Language: English
Source: Multimedia Tools and Applications [Multimed Tools Appl], 2023 May 02, pp. 1-18. Date of Electronic Publication: 2023 May 02.
DOI: 10.1007/s11042-023-15417-7
Abstract: The traditional approach to mobile crowd estimation involves manually counting a group of individuals at a specific place in real time. It is a laborious exercise that can be physically and mentally demanding. In Hong Kong, a large rally can last more than six hours, making the manual count susceptible to human error. While crowd counting using object detection and tracking is well established in computer vision, such applications have remained relatively small in scale and confined to controlled indoor settings (e.g., counting people at fixed gateways in a mall). No study to date has applied automatic crowd counting to hundreds of thousands of people along an open stretch of a rally route within a complex urban outdoor landscape. This research proposes an integrated approach that combines the capture-recapture method from statistics with a Convolutional Neural Network (CNN) from computer vision to count a mobile crowd. The research team implemented the integrated approach and counted 276,970 people, with a 95% confidence interval of 263,663 to 290,276, at the 2019 July 1st rally in Hong Kong. This work counted the attendance of a large-scale rally as a proof of concept, filling a gap in the empirical literature. The intellectual merits and research findings offer useful insights for improving mobile population estimation and for leveraging alternative data sources to support related scientific applications.
Competing Interests: Conflicts of interest: The authors have no conflicts of interest in this work.
(© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature, 2023. Springer Nature or its licensor (e.g., a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.)
Database: MEDLINE