ANIMAL‐SPOT enables animal‐independent signal detection and classification using deep learning

Bergler C, Smeele SQ, Tyndel SA, Barnhill A, Ortiz ST, Kalan AK, Cheng RX, Brinkløv S, Osieka AN, Tougaard J, Jakobsen F, Wahlberg M, Nöth E, Maier A, Klump BC (2022)


Publication Language: English

Publication Type: Journal article

Publication year: 2022

Journal: Scientific Reports

Volume: 12

Journal Issue: 1

Page Range: 1-16

Article Number: 21966

DOI: 10.1038/s41598-022-26429-y

Abstract

Bioacoustic research spans a wide range of biological questions and applications, relying on
identification of target species or smaller acoustic units, such as distinct call types. However, manually
identifying the signal of interest is time-intensive, error-prone, and becomes unfeasible with large
data volumes. Therefore, machine-driven algorithms are increasingly applied to various bioacoustic
signal identification challenges. Nevertheless, biologists still have major difficulties trying to transfer
existing animal- and/or scenario-related machine learning approaches to their specific animal datasets
and scientific questions. This study presents an animal-independent, open-source deep learning
framework, along with a detailed user guide. Three signal identification tasks, commonly encountered
in bioacoustics research, were investigated: (1) target signal vs. background noise detection, (2)
species classification, and (3) call type categorization. ANIMAL-SPOT successfully segmented human-
annotated target signals in data volumes representing 10 distinct animal species and 1 additional
genus, resulting in a mean test accuracy of 97.9%, together with an average area under the ROC
curve (AUC) of 95.9% when predicting on unseen recordings. Moreover, an average segmentation
accuracy and F1-score of 95.4% were achieved on the publicly available BirdVox-Full-Night data corpus.
In addition, multi-class species and call type classification resulted in 96.6% and 92.7% accuracy on
unseen test data, as well as 95.2% and 88.4% on excerpts derived from previous animal-specific machine-based
detection. Furthermore, an Unweighted Average Recall (UAR) of 89.3% outperformed the
multi-species classification baseline system of the ComParE 2021 Primate Sub-Challenge. Besides
animal independence, ANIMAL-SPOT does not rely on expert knowledge or special computing
resources, thereby making deep-learning-based bioacoustic signal identification accessible to a broad
audience.
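
Note on the reported metrics: the abstract quotes accuracy, AUC, F1-score, and UAR. The short sketch below (not part of the publication) illustrates how these quantities can be computed with scikit-learn; the label arrays, the 0.5 decision threshold, and the use of scikit-learn are purely illustrative assumptions and do not reflect data or code from the paper. UAR is simply the unweighted mean of per-class recalls (macro-averaged recall), which is why it is preferred over plain accuracy for imbalanced bioacoustic classes.

```python
# Minimal sketch of the metrics named in the abstract (accuracy, AUC,
# F1-score, UAR), computed on made-up toy labels with scikit-learn.
# None of the values below come from the paper.
import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, recall_score,
                             roc_auc_score)

# Hypothetical binary detection output: 1 = target signal, 0 = background noise
y_true = np.array([1, 1, 0, 0, 1, 0, 0, 1, 0, 0])
y_score = np.array([0.9, 0.8, 0.3, 0.1, 0.6, 0.4, 0.2, 0.7, 0.5, 0.1])  # model confidences
y_pred = (y_score >= 0.5).astype(int)                                    # thresholded decisions

print("Accuracy:", accuracy_score(y_true, y_pred))
print("F1-score:", f1_score(y_true, y_pred))
print("AUC:     ", roc_auc_score(y_true, y_score))

# Hypothetical multi-class call-type predictions (three call types A/B/C).
# UAR (Unweighted Average Recall) = mean of per-class recalls, i.e. macro recall.
calls_true = ["A", "A", "B", "B", "B", "C", "C", "A", "C", "B"]
calls_pred = ["A", "B", "B", "B", "C", "C", "C", "A", "C", "B"]
print("UAR:     ", recall_score(calls_true, calls_pred, average="macro"))
```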


How to cite

APA:

Bergler, C., Smeele, S.Q., Tyndel, S.A., Barnhill, A., Ortiz, S.T., Kalan, A.K.,... Klump, B.C. (2022). ANIMAL‐SPOT enables animal‐independent signal detection and classification using deep learning. Scientific Reports, 12(1), 1-16. https://dx.doi.org/10.1038/s41598-022-26429-y

MLA:

Bergler, Christian, et al. "ANIMAL‐SPOT enables animal‐independent signal detection and classification using deep learning." Scientific Reports 12.1 (2022): 1-16.
