Windy events detection in big bioacoustics datasets using a pre-trained Convolutional Neural Network.
Author: | Terranova F; Department of Life Sciences and Systems Biology, University of Turin, Turin, Italy. Electronic address: francesca.terranova@unito.it., Betti L; Department of Network and Data Science, Central European University, Vienna, Austria., Ferrario V; Department of Life Sciences and Systems Biology, University of Turin, Turin, Italy; Chester Zoo, Caughall Road, Chester, UK., Friard O; Department of Life Sciences and Systems Biology, University of Turin, Turin, Italy., Ludynia K; Southern African Foundation for the Conservation of Coastal Birds (SANCCOB), Cape Town, South Africa; Department of Biodiversity and Conservation Biology, University of the Western Cape, Robert Sobukwe Road, Bellville, South Africa., Petersen GS; Southern African Foundation for the Conservation of Coastal Birds (SANCCOB), Cape Town, South Africa., Mathevon N; ENES Bioacoustics Research Lab, CRNL, University of Saint-Etienne, CNRS, Inserm, Saint-Etienne, France; Institut universitaire de France, Ministry of Higher Education, Research and Innovation, 1 rue Descartes, CEDEX 05, Paris, France; Ecole Pratique des Hautes Etudes, CHArt lab, PSL University, Paris, France., Reby D; ENES Bioacoustics Research Lab, CRNL, University of Saint-Etienne, CNRS, Inserm, Saint-Etienne, France; Institut universitaire de France, Ministry of Higher Education, Research and Innovation, 1 rue Descartes, CEDEX 05, Paris, France., Favaro L; Department of Life Sciences and Systems Biology, University of Turin, Turin, Italy; Stazione Zoologica Anton Dohrn, Naples, Italy. |
---|---|
Language: | English |
Source: | The Science of the total environment [Sci Total Environ] 2024 Nov 01; Vol. 949, pp. 174868. Date of Electronic Publication: 2024 Jul 19. |
DOI: | 10.1016/j.scitotenv.2024.174868 |
Abstract: | Passive Acoustic Monitoring (PAM), which involves using autonomous recording units to study wildlife behaviour and distribution, often requires handling big acoustic datasets collected over extended periods. While these data offer invaluable insights into wildlife, their analysis can be challenging when geophonic sources are present. Wind-induced noise is a major obstacle to the detection of target sounds. It can lead to false positives, i.e., energy peaks from wind gusts misclassified as biological sounds, or false negatives, i.e., wind noise masking biological sounds that are present. Acoustic data dominated by wind noise make the analysis of vocal activity unreliable, compromising the detection of target sounds and, subsequently, the interpretation of the results. Our work introduces a straightforward approach to detecting recordings affected by windy events using a pre-trained convolutional neural network, which facilitates the identification of wind-compromised data. We consider this dataset pre-processing crucial for the reliable use of PAM data. We implemented it by leveraging YAMNet, a deep learning model for sound classification tasks. We evaluated the ability of YAMNet as-is to detect wind-induced noise and tested its performance in a Transfer Learning scenario using our annotated data from the Stony Point penguin colony in South Africa. While YAMNet as-is achieved a precision of 0.71 and a recall of 0.66, both metrics improved markedly after training on our annotated dataset, reaching a precision of 0.91 and a recall of 0.92, a relative increase of >28 %. Our study demonstrates the promising application of YAMNet in bioacoustics and ecoacoustics, addressing the need for wind-noise-free acoustic data. 
We release open-access code that, combined with the efficiency and peak performance of YAMNet, can be run on standard laptops by a broad user base. Competing Interests: Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. (Copyright © 2024 The Authors. Published by Elsevier B.V. All rights reserved.) |
Database: | MEDLINE |
External link: |
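The workflow the abstract describes — run a pre-trained YAMNet over each recording and flag wind-dominated segments — can be sketched as a thresholding step over the model's per-frame class scores. The sketch below is a minimal illustration, not the authors' released code: the wind-related class names follow the AudioSet ontology that YAMNet predicts over, but the 0.2 threshold and the `wind_frames` helper are assumptions for demonstration, and the scores matrix is synthetic stand-in data rather than real model output.

```python
import numpy as np

# Wind-related labels from the AudioSet ontology (YAMNet's output space).
WIND_LABELS = {"Wind", "Wind noise (microphone)", "Rustling leaves"}

def wind_frames(scores, class_names, threshold=0.2):
    """Return a boolean mask marking frames dominated by wind-related classes.

    scores      -- (n_frames, n_classes) array of per-frame class scores,
                   as produced by a YAMNet forward pass over a waveform.
    class_names -- list of n_classes AudioSet label strings.
    threshold   -- illustrative cut-off (an assumption, not from the paper).
    """
    idx = [i for i, name in enumerate(class_names) if name in WIND_LABELS]
    if not idx:
        return np.zeros(scores.shape[0], dtype=bool)
    # A frame is flagged if any wind-related class exceeds the threshold.
    return scores[:, idx].max(axis=1) >= threshold

# Tiny synthetic example: 3 frames, 3 classes.
names = ["Speech", "Wind", "Bird vocalization, bird call, bird song"]
scores = np.array([[0.9, 0.05, 0.0],
                   [0.1, 0.60, 0.1],
                   [0.2, 0.10, 0.7]])
mask = wind_frames(scores, names)
# Only the middle frame crosses the wind threshold.
```

A flagged mask like this can then be aggregated per recording (e.g., the fraction of wind-flagged frames) to decide whether a file should be excluded from downstream vocal-activity analysis.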