2020
DOI: 10.1109/lra.2020.2969925
What the Constant Velocity Model Can Teach Us About Pedestrian Motion Prediction

Abstract: Pedestrian motion prediction is a fundamental task for autonomous robots and vehicles to operate safely. In recent years many complex approaches based on neural networks have been proposed to address this problem. In this work we show that, surprisingly, a simple Constant Velocity Model can outperform even state-of-the-art neural models. This indicates that either neural networks are not able to make use of the additional information they are provided with, or that this information is not as relevant as common…

Cited by 207 publications (147 citation statements)
References 30 publications (73 reference statements)
“…Despite the extensive work done to leverage deep recurrent models for trajectory forecasting, a simple approach that makes future trajectory predictions based on the velocity is considered powerful. Following [43], we implemented a simple baseline that used the velocity at the last two frames of input sequences to linearly extrapolate future trajectories. • Social LSTM (S-LSTM) [15] is one of the most popular baselines used in many trajectory forecasting papers.…”
Section: E. Baseline Methods (mentioning)
confidence: 99%
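As an illustration of the velocity-based baseline described in this excerpt, here is a minimal Python sketch that extrapolates linearly from the displacement between the last two observed frames. The function name and the (N, 2) array layout are assumptions for the example, not taken from [43] or the citing paper.

```python
import numpy as np

def constant_velocity_baseline(observed_xy, horizon):
    """Linearly extrapolate a trajectory from its last two observed frames.

    observed_xy: (N, 2) array of past (x, y) positions at a fixed frame rate.
    horizon:     number of future frames to predict.
    Returns a (horizon, 2) array of predicted positions.
    """
    # Per-frame displacement (velocity) estimated from the two most recent frames.
    velocity = observed_xy[-1] - observed_xy[-2]
    # Apply the same displacement 1, 2, ..., horizon times to the last position.
    steps = np.arange(1, horizon + 1)[:, None]
    return observed_xy[-1] + steps * velocity
```

Because the same per-frame displacement is simply accumulated, this is the same as predicting a constant relative displacement between consecutive points, the view taken in the introduction excerpt further below.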
“…Several recently introduced metrics follow a sampling approach to evaluate a probability distribution. The minimum average displacement error (mADE) metric (Rhinehart et al., 2019; Schöller et al., 2019; Tang and Salakhutdinov, 2019; Thiede and Brahma, 2019; Walker et al., 2016), as well as variety loss, oracle, minimum over N, best-of-N, top n%, or minimum mean squared distance (minMSD), computes the Euclidean distance between the ground-truth position of the agent s_t^* at time t and the closest (or the n% closest) of the K samples from the predicted probability distribution: min_k ‖s_t^* − s_t^k‖. Similarly, minimum final displacement error (mFDE) evaluates only the distribution at the prediction horizon T.…”
Section: Motion Prediction Evaluation (mentioning)
confidence: 99%
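A compact sketch of how such sampling-based metrics can be computed, assuming the ground truth is a (T, 2) array and the K predicted samples are stacked into a (K, T, 2) array; the names and array layout are illustrative assumptions, not an implementation from the cited papers.

```python
import numpy as np

def min_ade_fde(ground_truth, samples):
    """Minimum ADE/FDE over K sampled trajectories.

    ground_truth: (T, 2) array of true positions s_t^*.
    samples:      (K, T, 2) array of predicted samples s_t^k.
    Returns (minADE, minFDE).
    """
    # Euclidean distance ||s_t^* - s_t^k|| for every sample k and time step t.
    dists = np.linalg.norm(samples - ground_truth[None], axis=-1)  # shape (K, T)
    min_ade = dists.mean(axis=1).min()   # best sample by average error over time
    min_fde = dists[:, -1].min()         # best sample by error at the horizon T
    return min_ade, min_fde
```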
“…Several authors report a separate accuracy measurement for the more challenging (e.g., non-linear or anomalous) part of the test set (Fernando et al., 2018; Huynh and Alaghband, 2019; Kooij et al., 2019), or evaluate the model’s performance on different classes of behavior, e.g., walking or stopping (Saleh et al., 2018b). Analysis of generalization, overfitting, and input utilization by a neural network, presented by Schöller et al. (2019), makes a good case for robustness evaluation.…”
Section: Motion Prediction Evaluation (mentioning)
confidence: 99%
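To make the idea of reporting separate accuracy on the harder, non-linear part of a test set concrete, one possible heuristic split is sketched below; the least-squares criterion and the 0.1 m threshold are illustrative assumptions, not values from the cited papers.

```python
import numpy as np

def is_nonlinear(traj, tol=0.1):
    """Flag a trajectory as non-linear if it deviates from a straight-line fit.

    traj: (T, 2) array of (x, y) positions.
    tol:  mean deviation threshold in metres (illustrative value).
    """
    t = np.arange(len(traj))
    # Independent least-squares line fits for x(t) and y(t).
    line = np.stack(
        [np.polyval(np.polyfit(t, traj[:, d], 1), t) for d in range(2)], axis=1
    )
    return np.linalg.norm(traj - line, axis=1).mean() > tol

# Prediction errors can then be aggregated separately for the linear
# and non-linear subsets of the test set.
```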
“…To achieve real-time performance, their method uses parallelizable convolutional layers and incorporates no social or scene information. Another example is the method of Schöller et al. [26], where the authors revisit and use the simple constant velocity model to predict the relative displacements between consecutive location points. In our previous work [27], apart from the velocity information computed directly from the input trajectories, our joint Location Velocity Attention LSTM-based network (LVA) also requires no neighbourhood or scene information.…”
Section: Introduction (mentioning)
confidence: 99%