2019
DOI: 10.3390/s19030648
A Systematic Review of Perception System and Simulators for Autonomous Vehicles Research

Abstract: This paper presents a systematic review of the perception systems and simulators for autonomous vehicles (AV). This work has been divided into three parts. In the first part, perception systems are categorized as environment perception systems and positioning estimation systems. The paper presents the physical fundamentals, principle functioning, and electromagnetic spectrum used to operate the most common sensors used in perception systems (ultrasonic, RADAR, LiDAR, cameras, IMU, GNSS, RTK, etc.). Furthermore…

Cited by 367 publications (188 citation statements)
References 62 publications
“…Environmental perception is achieved by acquiring and converting raw data from an on-board suite of vehicle sensors into scene understanding. Such sensors include cameras, stereovision, infrared, radar, ultrasonic (Van Brummelen et al 2018; Rosique et al 2019), and light detection and ranging (LiDAR). This section therefore delineates how AVs assimilate the environment through vision- and LiDAR-based sensing.…”
Section: Methodology
confidence: 99%
“…LiDAR significantly enhances AV perception capabilities (Schwarz 2010): LiDAR sensors rotate at high speed, emitting laser beams to create a sparse 3D point cloud, with each data point signifying a reflection from an object's surface (Rosique et al 2019). LiDAR is central for object detection and distance estimation but is limited with regard to object recognition.…”
Section: Methodology
confidence: 99%
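The passage above describes raw LiDAR returns being assembled into a sparse 3D point cloud. A minimal sketch of that conversion, assuming each return is a (range, azimuth, elevation) tuple in metres and radians (a simplification of real sensor packets, which also carry intensity and timestamps):

```python
import math

def lidar_returns_to_points(returns):
    """Convert raw LiDAR returns (range, azimuth, elevation in radians)
    into Cartesian (x, y, z) points of a sparse 3D point cloud."""
    points = []
    for r, az, el in returns:
        # Spherical-to-Cartesian: project the beam's range along its
        # firing direction in the sensor's own frame.
        x = r * math.cos(el) * math.cos(az)
        y = r * math.cos(el) * math.sin(az)
        z = r * math.sin(el)
        points.append((x, y, z))
    return points

# A single return 10 m straight ahead at sensor height
cloud = lidar_returns_to_points([(10.0, 0.0, 0.0)])
print(cloud)  # [(10.0, 0.0, 0.0)]
```

Real pipelines then transform these sensor-frame points into the vehicle frame and accumulate them across the sweep; the function name and tuple layout here are illustrative assumptions, not part of the reviewed paper.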
“…In addition to using video cameras as the primary vision sensors, these vehicles also use other sensors, e.g., RADAR and LiDAR, to detect different events in the car's surroundings. The surrounding environment of the autonomous vehicle is perceived in two stages [74]. In the first stage, the whole road is scanned to detect changes in driving conditions such as traffic signs and lights, pedestrian crossings, and other obstacles.…”
Section: Applications of ML for the Perception Task in CAVs
confidence: 99%
“…Connected autonomous electrified vehicles (CAEVs) offer high potential to improve road safety, boost traffic efficiency, and minimize carbon emissions [1], as well as reduce vehicle wear, transportation times, and fuel consumption [2]. Perception systems in CAEVs [3] are the foundation of decision making, route planning, obstacle avoidance, and trajectory tracking [4]. As the most essential sensor of perception systems, the visual camera…”
Section: Motivations and Technical Challenges
confidence: 99%