Automated detection of gibbon calls from passive acoustic monitoring data using convolutional neural networks in the 'torch for R' ecosystem

Authors: Clink, Dena J., Kim, Jinsung, Cross-Jaya, Hope, Ahmad, Abdul Hamid, Hong, Moeurk, Sala, Roeun, Birot, Hélène, Agger, Cain, Vu, Thinh Tien, Thi, Hoa Nguyen, Chi, Thanh Nguyen, Klinck, Holger
Year of publication: 2024
Document type: Working Paper
Description: Automated detection of acoustic signals is crucial for effective monitoring of vocal animals and their habitats across ecologically relevant spatial and temporal scales. Recent advances in deep learning have made these approaches more accessible. However, few deep learning approaches can be implemented natively in the R programming environment, even though approaches that run natively in R may be more accessible to ecologists. The "torch for R" ecosystem has made transfer learning with convolutional neural networks accessible to R users. Here, we evaluate a workflow that uses transfer learning for the automated detection of acoustic signals from passive acoustic monitoring (PAM) data. Our specific goals were to: 1) present a method for automated detection of gibbon calls from PAM data using the "torch for R" ecosystem; 2) compare the results of transfer learning across six pretrained CNN architectures; and 3) investigate how well the different architectures perform on datasets of female calls from two gibbon species: the northern grey gibbon (Hylobates funereus) and the southern yellow-cheeked crested gibbon (Nomascus gabriellae). We found that the highest-performing architecture depended on the test dataset. We successfully deployed the top-performing model for each gibbon species to investigate spatial variation in gibbon calling behavior across two grids of autonomous recording units in Danum Valley Conservation Area, Malaysia, and Keo Seima Wildlife Sanctuary, Cambodia. The fields of deep learning and automated detection are rapidly evolving, and we provide the methods and datasets as benchmarks for future work.
Database: arXiv
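
As an illustration of the kind of workflow described in the abstract (not the authors' exact code), the following R sketch shows transfer learning with a pretrained CNN in the "torch for R" ecosystem: a ResNet-18 from the torchvision package is loaded with pretrained weights, its final fully connected layer is replaced with a two-class output (gibbon call vs. background noise), and the network is fine-tuned on spectrogram images assumed to be stored in class-labelled folders. The directory names, the choice of architecture, and the hyperparameters are illustrative assumptions only.

library(torch)
library(torchvision)

# Load a ResNet-18 pretrained on ImageNet and replace the final fully
# connected layer with a two-class output (call vs. background noise).
model <- model_resnet18(pretrained = TRUE)
model$fc <- nn_linear(in_features = model$fc$in_features, out_features = 2)

# Transforms matching the ImageNet pretraining setup: spectrogram images
# are converted to tensors, resized to 224 x 224, and normalized with
# the ImageNet channel statistics.
to_input <- function(img) {
  img <- transform_to_tensor(img)
  img <- transform_resize(img, size = c(224, 224))
  transform_normalize(img,
                      mean = c(0.485, 0.456, 0.406),
                      std  = c(0.229, 0.224, 0.225))
}

# image_folder_dataset() expects one subdirectory per class, e.g.
# "spectrograms/train/gibbon" and "spectrograms/train/noise"
# (hypothetical paths used here for illustration).
train_ds <- image_folder_dataset("spectrograms/train", transform = to_input)
train_dl <- dataloader(train_ds, batch_size = 32, shuffle = TRUE)

optimizer <- optim_adam(model$parameters, lr = 1e-4)
loss_fn   <- nn_cross_entropy_loss()

# One pass over the training data; in practice this would be repeated
# for several epochs with validation-based model selection.
model$train()
coro::loop(for (batch in train_dl) {
  optimizer$zero_grad()
  output <- model(batch[[1]])
  loss   <- loss_fn(output, batch[[2]])
  loss$backward()
  optimizer$step()
})

Comparing several pretrained architectures, as the paper does, would amount to repeating this fine-tuning step with different torchvision model constructors (e.g. model_alexnet(), model_vgg16(), model_resnet50()) and evaluating each on held-out test data.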