2022
DOI: 10.1007/978-3-031-24386-8_22
ITAR: A Method for Indoor RFID Trajectory Automatic Recovery

Cited by 2 publications (4 citation statements)
References 25 publications
“…This data may then be utilized to forecast a user's position based on sensor readings [165]-[167]. The authors of [168] propose an indoor trajectory recovery method employing a sequence-to-sequence learning architecture, trajectory generation using a graph neural network, and a multi-head attention mechanism that captures correlations among trajectory points to improve performance. In [169], the authors proposed an effective hybrid technique for local feature matching and transformer deployment using a graph neural network.…”
Section: Heterogeneous Graphs
confidence: 99%
“…This can lead to more accurate and robust predictions, as well as increased model interpretability [168], [179]-[181].…”
Section: Attention Mechanisms
confidence: 99%
“…This data may then be utilized to forecast a user's position based on sensor readings [152]-[154]. The authors of [155] propose an indoor trajectory recovery method employing a sequence-to-sequence learning architecture, trajectory generation using a graph neural network, and a multi-head attention mechanism that captures correlations among trajectory points to improve performance. In [156], the authors proposed an effective hybrid technique for local feature matching and transformer deployment using a graph neural network.…”
Section: Graph Neural Network (GNN)
confidence: 99%
“…These attention weights describe the relative relevance of each input element and are used to weight each element's contribution to the output. In general, attention allows a DL model to dynamically focus on the most important information in the input data while disregarding less relevant information. This can lead to more accurate and robust predictions, as well as increased model interpretability [155], [166]-[168].…”
Section: Attention Mechanisms
confidence: 99%