2023
DOI: 10.3390/jmse11061108

GIT: A Transformer-Based Deep Learning Model for Geoacoustic Inversion

Abstract: Geoacoustic inversion is a challenging task in marine research due to the complex environment and acoustic propagation mechanisms. With the rapid development of deep learning, various designs of neural networks have been proposed to solve this issue with satisfactory results. As a data-driven method, deep learning networks aim to approximate the inverse function of acoustic propagation by extracting knowledge from multiple replicas, outperforming conventional inversion methods. However, existing deep learning …
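The abstract's core idea can be sketched in miniature: simulate many replica pairs of a seabed parameter and the acoustic observation it produces, then fit the inverse mapping from observation back to parameter. The forward model, the parameter range, and the polynomial regressor below are all illustrative assumptions standing in for a real propagation model and a deep network.

```python
# Minimal sketch of data-driven inversion, assuming a hypothetical
# monotonic forward model g: seabed parameter m -> acoustic feature d.
# A cubic polynomial stands in for the neural network in the paper.
import numpy as np

rng = np.random.default_rng(0)

def forward_model(m):
    # Hypothetical smooth propagation response (not a real acoustic model).
    return 0.5 * m + 0.01 * m**2

# Simulated "replicas": parameter samples and noisy observations.
m_train = rng.uniform(1400.0, 1800.0, size=2000)
d_train = forward_model(m_train) + rng.normal(0.0, 1.0, size=m_train.size)

# Learn the inverse mapping d -> m from the replicas.
coeffs = np.polyfit(d_train, m_train, deg=3)
invert = np.poly1d(coeffs)

m_true = 1550.0
d_obs = forward_model(m_true)
m_hat = invert(d_obs)   # recovered parameter, close to m_true
```

The sketch shows the data-driven framing only; the paper's contribution is the transformer architecture used as the regressor, not this toy setup.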

Cited by 1 publication (2 citation statements)
References 25 publications
“…Table 2 illustrates the distribution of published articles focusing on the application of the transformer architecture and the attention mechanism for genome data, across a variety of scientific journals. (20), constituting 16.1% of the total studies in this domain. The 'Bioinformatics', 'BMC Bioinformatics', and 'Frontiers in Genetics' journals follow closely, each contributing 7.3% of the total publications.…”
Section: Journals of Published Papers
confidence: 99%
“…Inspired by the success of attention mechanisms, the transformer model was proposed as a complete shift from the sequential processing nature of recurrent neural networks (RNNs) and their variants [19][20][21][22]. The transformer model leverages attention mechanisms to process the input data in parallel, allowing for faster and more efficient computations.…”
Section: Introduction
confidence: 99%
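The parallelism the citation statement describes comes from scaled dot-product attention: every position's interaction with every other position is computed in a single matrix product rather than sequentially as in an RNN. A minimal NumPy sketch, with illustrative shapes and random values:

```python
# Scaled dot-product attention: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V.
# One matmul scores all position pairs at once -- the whole sequence is
# processed in parallel, unlike an RNN's step-by-step recurrence.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (seq, seq) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V, weights

rng = np.random.default_rng(1)
seq_len, d_k = 4, 8
Q = rng.normal(size=(seq_len, d_k))
K = rng.normal(size=(seq_len, d_k))
V = rng.normal(size=(seq_len, d_k))

out, weights = scaled_dot_product_attention(Q, K, V)
# out: (4, 8) attended values; each row of weights sums to 1.
```

This is the single-head core only; the full transformer adds multi-head projections, residual connections, and feed-forward layers on top.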