2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC)
DOI: 10.1109/itsc.2016.7795620
Sensor scan timing compensation in environment models for automated road vehicles

Cited by 10 publications (6 citation statements); references 17 publications.
“… [35] If the time step between two scans is sufficiently low (e.g. 0.1 s) [36], the observed changes in the surroundings already encompass the drivers’ interventions [37]; in this case, it is not necessary to predict the scenario evolution using complex models of driver behaviour [10]. …”
Section: Methods
confidence: 99%
“…Unlike other approaches, which compensate for ego motion by transforming all measurements to a given point in time (e.g. [13], [20], [38], [39]), we propagate the time information through the preprocessing steps. In contrast to the references mentioned above, this allows a proper consideration of ego and target motion, as we have discussed in Rieken and Maurer [9].…”
Section: B Scan Definition and Timing Behavior
confidence: 99%
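To make the contrast in the quoted passage concrete, the following is a minimal sketch of the two strategies, assuming a planar constant-velocity, constant-yaw-rate ego-motion model; the function names (ego_pose_at, compensate_to_reference, keep_per_point_time) and all parameters are hypothetical and not taken from the cited paper.

```python
import numpy as np

def ego_pose_at(t, v, yaw_rate):
    """Ego pose (position, heading) at time t under a constant-velocity,
    constant-yaw-rate assumption (a simplification for illustration)."""
    if abs(yaw_rate) < 1e-9:
        return np.array([v * t, 0.0]), 0.0
    heading = yaw_rate * t
    x = v / yaw_rate * np.sin(heading)
    y = v / yaw_rate * (1.0 - np.cos(heading))
    return np.array([x, y]), heading

def compensate_to_reference(points, timestamps, t_ref, v, yaw_rate):
    """'Single reference time' strategy: transform every scan point into the
    ego frame at t_ref and discard the per-point timestamps afterwards, so
    target motion within the scan is no longer modelled."""
    p_ref, h_ref = ego_pose_at(t_ref, v, yaw_rate)
    c, s = np.cos(h_ref), np.sin(h_ref)
    R_ref = np.array([[c, -s], [s, c]])
    out = np.empty_like(points)
    for i, (p, t) in enumerate(zip(points, timestamps)):
        p_i, h_i = ego_pose_at(t, v, yaw_rate)
        ci, si = np.cos(h_i), np.sin(h_i)
        R_i = np.array([[ci, -si], [si, ci]])
        world = R_i @ p + p_i                # point in a common world frame
        out[i] = R_ref.T @ (world - p_ref)   # back into the ego frame at t_ref
    return out

def keep_per_point_time(points, timestamps):
    """Strategy along the lines of the cited approach: keep the measurement
    time attached to every point so later stages (e.g. tracking) can account
    for ego *and* target motion explicitly."""
    return [{"xy": p, "t": t} for p, t in zip(points, timestamps)]
```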
“…Based on previous results ( [3], [9]), the following section focuses on the integration of the aforementioned object model extensions. The results will be explained along the basic processing steps of recursive state estimation techniques.…”
Section: Object Tracking
confidence: 99%
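As background for the “basic processing steps of recursive state estimation” mentioned above, here is a generic linear Kalman filter predict/update sketch with a constant-velocity point-target model; it illustrates the technique in general and is not the authors' extended object model.

```python
import numpy as np

def kf_predict(x, P, dt, q=1.0):
    """Prediction step of a linear Kalman filter with a constant-velocity
    model; state x = [px, py, vx, vy], covariance P."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    Q = q * np.diag([dt**4 / 4, dt**4 / 4, dt**2, dt**2])  # process noise
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, r=0.1):
    """Update step with a position-only measurement z = [px, py]."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    R = r * np.eye(2)
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P
```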
“…Examples presented in Figure 1 illustrate misalignment of lidar and visual features in the environment when projecting uncorrected lidar points to images, which can cause degraded performance in the sensor fusion. Interested readers can refer to [2] for quantitative analyses of the time-related effects of moving scanning sensors on different perception tasks for multiple sensor systems.…”
Section: Introduction
confidence: 99%
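The misalignment described above can be illustrated with a small sketch: projecting lidar points into an image with given extrinsics and intrinsics, plus a crude constant-velocity shift of each point to the image timestamp. The matrix names (T_cam_lidar, K) and the correction itself are illustrative assumptions, not the method of the cited work.

```python
import numpy as np

def project_to_image(points_lidar, T_cam_lidar, K):
    """Project 3-D lidar points (N x 3) into the image plane using a
    lidar-to-camera extrinsic T_cam_lidar (4 x 4) and intrinsics K (3 x 3)."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    in_front = pts_cam[:, 2] > 0             # keep points in front of the camera
    uvw = (K @ pts_cam[in_front].T).T
    return uvw[:, :2] / uvw[:, 2:3]          # pixel coordinates (u, v)

def motion_correct(points_lidar, point_times, t_image, ego_velocity):
    """Shift each point by the ego translation accumulated between its own
    capture time and the image timestamp (a crude constant-velocity
    correction; rotation and target motion are ignored here)."""
    dt = (t_image - point_times)[:, None]    # seconds per point, N x 1
    return points_lidar + dt * ego_velocity[None, :]
```

Projecting the corrected points instead of the raw ones reduces the lidar-to-image offset that grows with the time gap between each point's capture instant and the image exposure.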