2022
DOI: 10.1016/j.measurement.2022.111671

A multiple-blockage identification scheme for buried pipeline via acoustic signature model and SqueezeNet

Cited by 6 publications (2 citation statements)
References 21 publications
“…Federated learning mainly transmits model parameters, and the large number of parameters in existing deep learning models may cause federated learning to fail. In 2017, a lightweight network model called SqueezeNet was proposed, with roughly 1/50th as many parameters as AlexNet while demonstrating similar accuracy [24]. Zhong et al. further reduced model complexity and dependence on samples by introducing a self-attention-integrated lightweight model combined with transfer learning [25].…”
Section: Introduction
confidence: 99%
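The "roughly 1/50th the parameters of AlexNet" claim quoted above can be checked directly against the torchvision reference implementations. The sketch below is only an illustration of that comparison; the model constructors `squeezenet1_0` and `alexnet` are torchvision's, and nothing here is taken from the cited papers themselves.

```python
# Minimal sketch (assumes PyTorch and torchvision are installed): compare the
# trainable-parameter counts of torchvision's SqueezeNet 1.0 and AlexNet to
# illustrate the ~1/50 size ratio mentioned in the cited excerpt.
import torch
from torchvision import models

def count_params(model: torch.nn.Module) -> int:
    """Total number of trainable parameters in a model."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

squeezenet = models.squeezenet1_0(weights=None)  # ~1.25M parameters
alexnet = models.alexnet(weights=None)           # ~61M parameters

n_squeeze = count_params(squeezenet)
n_alex = count_params(alexnet)
print(f"SqueezeNet 1.0: {n_squeeze:,} params")
print(f"AlexNet:        {n_alex:,} params")
print(f"Ratio (AlexNet / SqueezeNet): {n_alex / n_squeeze:.1f}x")
```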
“…Using an existing model as a starting point reduces the time and computing resources needed to train a model from scratch. SqueezeNet, a neural network architecture designed for image recognition, contains fewer parameters than AlexNet, VGG, and ResNet [32][33][34]. The SqueezeNet framework yields models with low parameter counts and high image-recognition accuracy.…”
Section: Introduction
confidence: 99%
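The excerpt above describes using an existing model as a starting point, i.e. transfer learning with SqueezeNet. A common recipe with the torchvision implementation is to load pretrained weights, freeze the feature extractor, and replace the final 1x1 convolutional classifier with one sized for the new task. The sketch below follows that generic recipe only; `NUM_CLASSES` and the random batch are placeholders, not details from the cited papers.

```python
# Minimal transfer-learning sketch with torchvision's SqueezeNet 1.1
# (assumptions: PyTorch/torchvision available; NUM_CLASSES and the random
# batch are placeholders for the target task, not from the cited papers).
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4  # hypothetical number of target classes

# Start from ImageNet-pretrained weights instead of training from scratch.
model = models.squeezenet1_1(weights=models.SqueezeNet1_1_Weights.DEFAULT)

# Freeze the convolutional feature extractor; only the new head is trained.
for param in model.features.parameters():
    param.requires_grad = False

# SqueezeNet classifies with a final 1x1 convolution rather than a linear
# layer, so swap that convolution for one with NUM_CLASSES output channels.
model.classifier[1] = nn.Conv2d(512, NUM_CLASSES, kernel_size=1)
model.num_classes = NUM_CLASSES

optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a random batch (stands in for real data).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```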