2023
DOI: 10.32604/cmes.2023.025405
STPGTN–A Multi-Branch Parameters Identification Method Considering Spatial Constraints and Transient Measurement Data

Cited by 6 publications (6 citation statements)
References 25 publications
“…Deep learning techniques have advanced rapidly due to the ongoing increase in computing power [18][19][20][21]. The powerful feature-extraction ability and end-to-end network structure of deep learning methods can be effectively applied to the field of remote-sensing image change detection.…”
Section: Change-detection Methods In Deep Learningmentioning
confidence: 99%
“…For a set of transaction sample features u ∈ ℝ^C, we first group its feature channels to obtain two sets of sub-features u_g ∈ ℝ^(N×d) and u_l ∈ ℝ^(d×N) (N represents the number of groups, and d represents the feature dimension of each group). Then, self-attention [35,36] is computed on the two sets of sub-features separately to fully capture the global and local correlations among feature channels. Self-attention is widely used in the field of Natural Language Processing (NLP) and can effectively capture the dependencies between arbitrary features by computing pairwise feature similarities.…”
Section: Proposed Modelmentioning
confidence: 99%
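The grouped self-attention described in this excerpt can be sketched as follows. This is a minimal NumPy illustration, not the cited model: it uses plain scaled dot-product attention with Q = K = V and no learned projection matrices, and the names `u_g`, `u_l`, `self_attention` are illustrative only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    """Scaled dot-product self-attention with Q = K = V = x.

    Each row of the output is a similarity-weighted mixture of all rows,
    so dependencies between arbitrary feature groups are captured.
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)          # pairwise feature similarities
    return softmax(scores, axis=-1) @ x

# Group a C-dimensional feature vector into N groups of d channels (C = N * d).
C, N = 12, 3
d = C // N
u = np.random.default_rng(0).normal(size=C)
u_g = u.reshape(N, d)   # N x d view: attention runs across the N groups
u_l = u_g.T             # d x N view: attention runs across the d dimensions

out_g = self_attention(u_g)   # "global" correlations among groups
out_l = self_attention(u_l)   # "local" correlations within groups
print(out_g.shape, out_l.shape)
```

Transposing the grouping swaps which axis attention mixes over, which is one simple way to read the paper's "global" versus "local" channel correlations.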
“…We chose a residual network (ResNet) as the encoder because its residual blocks help resolve the gradient explosion and vanishing problems that arise as networks deepen [48]. Many researchers have proposed improved versions of ResNet [49][50][51][52]; simply substituting one of these improved versions into our model framework can further improve the experimental results.…”
Section: Network Architecturementioning
confidence: 99%
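The gradient argument in this excerpt rests on the residual block's identity shortcut. A minimal NumPy sketch of one such block, assuming a two-layer ReLU transform (the function names and shapes are illustrative, not the cited architecture):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2):
    """y = F(x) + x, where F is a small two-layer transform.

    The identity shortcut (+ x) gives gradients a direct path around the
    weight layers, which is what mitigates vanishing gradients in very
    deep networks.
    """
    return relu(x @ w1) @ w2 + x

rng = np.random.default_rng(1)
x = rng.normal(size=(2, 8))
# Near-zero weights: the block then behaves close to the identity mapping,
# so stacking many such blocks cannot degrade the signal much at init.
w1 = rng.normal(size=(8, 8)) * 0.01
w2 = rng.normal(size=(8, 8)) * 0.01

y = residual_block(x, w1, w2)
print(y.shape)
```

Because the output is `F(x) + x`, the block only needs to learn the residual `F(x) = y - x`, which is the design choice [48] credits for trainability at depth.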