1. Lorenz C, Hao X, Tomka T, Rüttimann L, Hahnloser RH. Interactive extraction of diverse vocal units from a planar embedding without the need for prior sound segmentation. Front Bioinform 2023;2. [DOI: 10.3389/fbinf.2022.966066]
2. Michaud F, Sueur J, Le Cesne M, Haupert S. Unsupervised classification to improve the quality of a bird song recording dataset. Ecological Informatics 2022. [DOI: 10.1016/j.ecoinf.2022.101952]
3. Abayomi-Alli OO, Damaševičius R, Qazi A, Adedoyin-Olowe M, Misra S. Data Augmentation and Deep Learning Methods in Sound Classification: A Systematic Review. Electronics 2022;11:3795. [DOI: 10.3390/electronics11223795]
4. Gomez-Morales DA, Acevedo-Charry O. Satellite remote sensing of environmental variables can predict acoustic activity of an orthopteran assemblage. PeerJ 2022;10:e13969. [PMID: 36071828 DOI: 10.7717/peerj.13969]
5. Raab T, Madhav MS, Jayakumar RP, Henninger J, Cowan NJ, Benda J. Advances in non-invasive tracking of wave-type electric fish in natural and laboratory settings. Front Integr Neurosci 2022;16:965211. [DOI: 10.3389/fnint.2022.965211]
6. Stoumpou V, Vargas CDM, Schade PF, Boyd JL, Giannakopoulos T, Jarvis ED. Analysis of Mouse Vocal Communication (AMVOC): a deep, unsupervised method for rapid detection, analysis and classification of ultrasonic vocalisations. Bioacoustics. [DOI: 10.1080/09524622.2022.2099973]
7. Arnaud V, Pellegrino F, Keenan S, St-Gelais X, Mathevon N, Levréro F, Coupé C. Improving the workflow to crack Small, Unbalanced, Noisy, but Genuine (SUNG) datasets in bioacoustics: the case of bonobo calls. [DOI: 10.1101/2022.06.26.497684]
8. Raab T, Madhav MS, Jayakumar RP, Henninger J, Cowan NJ, Benda J. Advances in non-invasive tracking of wave-type electric fish in natural and laboratory settings. [DOI: 10.1101/2022.06.02.494479]
9. Vijendravarma RK, Narasimha S, Steinfath E, Clemens J, Leopold P. Drosophila females have an acoustic preference for symmetric males. Proc Natl Acad Sci U S A 2022;119:e2116136119. [PMID: 35312357 DOI: 10.1073/pnas.2116136119] [Cited by in Crossref: 1] [Cited by in F6Publishing: 1] [Article Influence: 1.0]
10. Stowell D. Computational bioacoustics with deep learning: a review and roadmap. PeerJ 2022;10:e13152. [PMID: 35341043 DOI: 10.7717/peerj.13152] [Cited by in Crossref: 29] [Cited by in F6Publishing: 26] [Article Influence: 29.0]