2022
DOI: 10.1049/csy2.12066
Obstacle‐transformer: A trajectory prediction network based on surrounding trajectories

Abstract: Recurrent Neural Networks, Long Short-Term Memory, and Transformers have made great progress in predicting the trajectories of moving objects. Although trajectory elements have been merged with surrounding scene features to improve performance, some problems remain. One is that time-series processing models incur longer inference times as the number of prediction sequences grows. Another is that features cannot be extracted from the scene's image and …
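The abstract refers to Transformer-style models that predict future positions from an observed trajectory. As a minimal, hedged illustration of the underlying mechanism (this is a generic scaled dot-product attention sketch with random stand-in weights, not the paper's Obstacle-transformer), one attention pass over an observed 2-D trajectory can be written as:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Standard scaled dot-product attention (Vaswani et al., 2017)."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    # Numerically stable softmax over the time axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy observed trajectory: T time steps of (x, y) positions (a random walk).
rng = np.random.default_rng(0)
obs = np.cumsum(rng.normal(size=(8, 2)), axis=0)

# Hypothetical projections; random here, stand-ins for trained weights.
d_model = 16
W_in = rng.normal(scale=0.1, size=(2, d_model))
W_q = rng.normal(scale=0.1, size=(d_model, d_model))
W_k = rng.normal(scale=0.1, size=(d_model, d_model))
W_v = rng.normal(scale=0.1, size=(d_model, d_model))
W_out = rng.normal(scale=0.1, size=(d_model, 2))

h = obs @ W_in                                                  # embed positions
ctx = scaled_dot_product_attention(h @ W_q, h @ W_k, h @ W_v)   # attend over time
pred = ctx[-1] @ W_out                                          # next (x, y) guess
print(pred.shape)  # (2,)
```

Because attention processes all observed steps in one pass, it avoids the step-by-step recurrence that, as the abstract notes, makes RNN/LSTM inference time grow with the number of prediction sequences.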

Cited by 3 publications (3 citation statements) · References 28 publications
“…Apart from methods that use the original Transformer architecture, this chapter will also introduce methods that use an attention mechanism to predict trajectories. The Transformer architecture can be used without any changes to predict trajectories based on observed positions [17, 58, 59, 74–76].…”
Section: Transformer- and Attention-based Methods
confidence: 99%
“…Zhang et al [58] applied two Transformers in parallel, one to infer obstacle positions from observed trajectories and one for predicting the x and y positions separately.…”
Section: Transformer- and Attention-based Methods
confidence: 99%
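The citation above describes a parallel design in which the x and y coordinates are predicted by separate branches. A minimal sketch of that idea (hypothetical random weights, one independent attention branch per coordinate; this is an illustration of the per-coordinate split, not Zhang et al.'s actual network):

```python
import numpy as np

def attend(h, W_q, W_k, W_v):
    """One attention pass: weights over time steps, then a mixed value."""
    scores = (h @ W_q) @ (h @ W_k).T / np.sqrt(h.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ (h @ W_v)

rng = np.random.default_rng(1)
T, d = 8, 16
h = rng.normal(size=(T, d))  # stand-in for encoded observed trajectory

# Two independent branches, one per coordinate, as in the parallel design.
coords = {}
for name in ("x", "y"):
    W_q, W_k, W_v = (rng.normal(scale=0.1, size=(d, d)) for _ in range(3))
    w_out = rng.normal(scale=0.1, size=d)
    coords[name] = attend(h, W_q, W_k, W_v)[-1] @ w_out  # scalar coordinate

pred = np.array([coords["x"], coords["y"]])
print(pred.shape)  # (2,)
```

Splitting the output this way lets each branch specialize on one coordinate's dynamics, at the cost of duplicating the attention parameters.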