2023
DOI: 10.1038/s41467-023-41693-w
Soundscapes and deep learning enable tracking biodiversity recovery in tropical forests

Jörg Müller,
Oliver Mitesser,
H. Martin Schaefer
et al.

Abstract: Tropical forest recovery is fundamental to addressing the intertwined climate and biodiversity loss crises. While regenerating trees sequester carbon relatively quickly, the pace of biodiversity recovery remains contentious. Here, we use bioacoustics and metabarcoding to measure forest recovery post-agriculture in a global biodiversity hotspot in Ecuador. We show that the community composition, and not species richness, of vocalizing vertebrates identified by experts reflects the restoration gradient. Two auto…

Cited by 26 publications (25 citation statements) · References 82 publications
“…Canopy cover was positively associated with cricket acoustic activity but not with katydids. Our findings contribute to our understanding of the role of time, habitat and vegetation structure in shaping insect acoustic activity within the hyper-diverse habitats of the Amazon using a non-invasive, scalable and semi-automated approach [9]. The manual annotation process allowed us to verify the absence of other (non-target) vocally active taxa during the night within the selected frequencies, and therefore reduce their influence when analysing changes in AEI values.…”
Section: Discussion
confidence: 93%
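The AEI values discussed in this citation statement refer to the Acoustic Evenness Index, which summarizes how unevenly acoustic energy is spread across frequency bands. The following is a minimal sketch of an AEI-style calculation, assuming a precomputed dB spectrogram as input; the band width, frequency ceiling, and dB threshold here are illustrative defaults, not the settings used in the cited study.

```python
import numpy as np

def gini(values):
    """Gini coefficient of a 1-D array of non-negative values (0 = perfectly even)."""
    v = np.sort(np.asarray(values, dtype=float))
    n = v.size
    if n == 0 or v.sum() == 0:
        return 0.0
    idx = np.arange(1, n + 1)
    # Standard closed form: G = 2 * sum(i * v_i) / (n * sum(v)) - (n + 1) / n
    return float(2 * (idx * v).sum() / (n * v.sum()) - (n + 1) / n)

def acoustic_evenness(spec_db, freqs, band_hz=1000.0, max_freq=10000.0,
                      db_threshold=-50.0):
    """AEI-style index: Gini coefficient over per-band occupancy proportions.

    spec_db : 2-D array (frequency bins x time frames), spectrogram levels in dB.
    freqs   : 1-D array giving the center frequency (Hz) of each spectrogram row.
    """
    props = []
    for lo in np.arange(0.0, max_freq, band_hz):
        rows = (freqs >= lo) & (freqs < lo + band_hz)
        band = spec_db[rows]
        # Proportion of spectrogram cells in this band exceeding the threshold
        props.append(float((band > db_threshold).mean()) if band.size else 0.0)
    return gini(props)
```

A silent recording (all bands below threshold) yields 0, while a soundscape dominated by a single band, such as a narrowband insect chorus, yields a value near 1; changes in this value over a restoration gradient are what the quoted study tracks.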
“…A diverse range of insects produce sounds and therefore PAM enables a scalable and standardized methodology for monitoring soniferous insects, which dominate nocturnal soundscapes in the tropical region [8,9]. Indeed, insects can be acoustically detected far away from the recorders (some loud species up to 100 m), which is a much longer distance than insect detection range using pitfalls, nets or cameras [3,10,11].…”
Section: Introduction
confidence: 99%
“…Similarly, Müller et al. (2023) used acoustic indices and deep learning methods to monitor biodiversity recovery after agricultural abandonment in an Ecuadorian rainforest. Though acoustic indices may not well reflect biodiversity, they nevertheless reflect biodiversity dynamics in a range of systems.…”
Section: Discussion
confidence: 99%
“…Of the few studies that have used acoustic monitoring to capture storms or extreme events, most focused on marine soundscapes (Boyd et al., 2021; Locascio & Mann, 2005; Simmons et al., 2021), though Gottesman et al. (2021) recently showed that terrestrial soundscapes were less resistant than those of coral reefs to hurricane disturbance. Embedded within terrestrial soundscapes, bird vocalizations provide the opportunity to assess the impact of typhoons on critical indicator taxa (Gasc et al., 2017), while acoustic indices provide rapid information on a combination of biodiversity and other meaningful aspects of soundscape change (Bradfer-Lawrence et al., 2020; Harris et al., 2016; Müller et al., 2023; Rajan et al., 2022; Sethi et al., 2023). There are, however, few studies that simultaneously assess both individual species vocalizations and acoustic indices explicitly (Ferreira et al., 2018; Ross et al., 2018).…”
Section: Introduction
confidence: 99%