2019
DOI: 10.1121/1.5126115
Sound source ranging using a feed-forward neural network trained with fitting-based early stopping

Abstract: When a feed-forward neural network (FNN) is trained for source ranging in an ocean waveguide, it is difficult to evaluate the range accuracy of the FNN on unlabeled test data. A fitting-based early stopping (FEAST) method is introduced to evaluate the range error of the FNN on test data for which the source range is unknown. Based on FEAST, training is stopped when the evaluated range error of the FNN on the test data reaches its minimum, which improves the ranging accuracy of the FNN on the test data.…

Cited by 43 publications (12 citation statements)
References 14 publications
“…Features learning for classification uses a fully connected network of 300, 200, and 100 nodes for each layer, respectively, with a ReLU activation function and ends with a classification layer Softmax using a trial and error search technique. The training, tuned by a grid search mechanism [40], is done under the CPU with an early stopping option and batch size equal to 10 samples using Adam optimizer [43].…”
Section: Bearing Fault Detection and Diagnosis
confidence: 99%
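The fully connected 300-200-100 architecture with ReLU activations and a softmax output described in the excerpt above can be sketched with plain NumPy. This is a minimal illustration only: the input dimension, class count, and random weights are assumptions for the sketch, not values from the cited work, which trained its parameters with Adam.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    # Subtract the row max before exponentiating for numerical stability.
    z = np.exp(x - x.max(axis=-1, keepdims=True))
    return z / z.sum(axis=-1, keepdims=True)

# Hypothetical input dimension and class count; the excerpt does not state them.
n_in, n_classes = 64, 4
sizes = [n_in, 300, 200, 100, n_classes]

# Random weights stand in for parameters that would be learned during training.
weights = [rng.standard_normal((a, b)) * 0.05 for a, b in zip(sizes, sizes[1:])]
biases = [np.zeros(b) for b in sizes[1:]]

def forward(x):
    # Hidden layers of 300, 200, and 100 nodes, each with ReLU.
    for w, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ w + b)
    # Softmax classification layer on the output.
    return softmax(x @ weights[-1] + biases[-1])

probs = forward(rng.standard_normal((10, n_in)))
```

Each row of `probs` is a probability distribution over the assumed classes, as expected from a softmax output layer.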
“…Early stopping is a method that informs when to stop running iterative algorithms during the training process, which improves the general performance of CNN models by reducing model overfitting and removing small test errors that are not visible during the training process [63,64]. The validation loss is the average model error of the validation data from a specified loss function, which tells the CNN model when to stop training.…”
Section: Model Optimization
confidence: 99%
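The validation-loss criterion described above can be sketched as a patience rule: stop once the monitored loss has failed to improve for a fixed number of consecutive epochs. A minimal illustration (the patience value is an assumption for the sketch, not taken from the cited works):

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the epoch at which training would halt: the first epoch
    after which the validation loss has not improved for `patience`
    consecutive epochs, or the last epoch if that never happens."""
    best = float("inf")
    since_improve = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            since_improve = 0
        else:
            since_improve += 1
            if since_improve >= patience:
                return epoch
    return len(val_losses) - 1

# Loss falls, then plateaus: training halts 3 epochs after the minimum.
losses = [1.0, 0.8, 0.7, 0.72, 0.71, 0.73, 0.74]
stop = early_stop_epoch(losses, patience=3)  # → 5
```

In practice one would also restore the weights saved at the best epoch, since the model at the stopping epoch has already drifted past the minimum.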
“…This may not yield superior performance on the test data. Chi et al (2019) prevented the classifier from being overfitted to the training data by adding regularization to the test data while training on ship range estimation. As the input, they used the covariance matrix of a vectorized sound field proposed by Niu et al (2017a).…”
Section: Passive Target Localization
confidence: 99%