2014 IEEE Intelligent Vehicles Symposium Proceedings 2014
DOI: 10.1109/ivs.2014.6856498
Pedestrian path prediction using body language traits

Abstract: Driver Assistance Systems have achieved a high level of maturity in recent years. As an example, sophisticated pedestrian protection systems are already available in a number of commercial vehicles from several OEMs. However, accurate pedestrian path prediction is needed in order to go a step further in terms of safety and reliability, since it can make the difference between effective and ineffective intervention. In this paper, we consider the three-dimensional pedestrian body language in order…

Cited by 43 publications (35 citation statements). References 16 publications.
“…The Bayesian Network used multivariate Gaussian distributions for the relative position and velocity with respect to the vehicle for a predicted time. There has also been research undertaken using body features that can aid in predicting a pedestrian's future behaviour such as walking, standing, bending, jogging and running [188,209,210]. The head orientation of the pedestrian was considered in [183,203].…”
Section: CNN and SVM
confidence: 99%
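The Gaussian prediction quoted above can be illustrated with a minimal sketch: propagating a pedestrian's relative position and its uncertainty under a constant-velocity assumption. All names, time steps, and noise values here are illustrative assumptions, not the cited paper's actual model.

```python
import numpy as np

def predict_state(pos, vel, pos_cov, vel_cov, dt):
    """Propagate the mean and covariance of a pedestrian's relative
    position after dt seconds, assuming constant velocity and
    independent Gaussian uncertainty on position and velocity."""
    pred_pos = pos + vel * dt
    # Position uncertainty grows with the scaled velocity uncertainty.
    pred_cov = pos_cov + (dt ** 2) * vel_cov
    return pred_pos, pred_cov

# Illustrative state relative to the vehicle (assumed values).
pos = np.array([5.0, 2.0])        # metres
vel = np.array([-1.2, 0.3])       # metres/second
pos_cov = 0.05 * np.eye(2)
vel_cov = 0.10 * np.eye(2)

mean, cov = predict_state(pos, vel, pos_cov, vel_cov, dt=1.0)
print(mean)  # predicted relative position after 1 s
print(cov)   # predicted position covariance
```

A criticality check would then evaluate this predicted Gaussian against the vehicle's path; richer cues such as body pose or head orientation, as the surveyed works note, refine the same prediction.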
“…motion state such as position and possibly velocity (e.g., Bennewitz et al, 2005; Bera et al, 2016; Elfring et al, 2014; Ferrer and Sanfeliu, 2014; Karasev et al, 2016; Kitani et al, 2012; Kooij et al, 2019; Kucner et al, 2017; Kuderer et al, 2012; Pellegrini et al, 2009; Trautman and Krause, 2010; Ziebart et al, 2009); 1.2. articulated pose such as head orientation (e.g., Hasan et al, 2018; Kooij et al, 2019, 2014; Roth et al, 2016; Unhelkar et al, 2015) or fullbody pose (Mínguez et al, 2018; Quintero et al, 2014); 1.3. semantic attributes such as the age and gender (Ma et al, 2017), personality (Bera et al, 2017), and awareness of the robot’s presence (Kooij et al, 2019; Oli et al, 2013).…”
Section: Taxonomy
confidence: 99%
“…In particular, [3] address this problem by adding pedestrian's head orientation as an intention feature to assess the criticality of situations. Other features that may be used are the human pose [17], [18] and optical flow [16] to determine the state of a pedestrian.…”
Section: Related Work
confidence: 99%