2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
DOI: 10.1109/iros.2018.8594090

End to End Vehicle Lateral Control Using a Single Fisheye Camera

Abstract: Convolutional neural networks are commonly used to control the steering angle for autonomous cars. Most of the time, multiple long-range cameras are used to generate lateral failure cases. In this paper we present a novel model to generate this data and label augmentation using only one short-range fisheye camera. We present our simulator and how it can be used as a consistent metric for lateral end-to-end control evaluation. Experiments are conducted on a custom dataset corresponding to more than 10000 km and…
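The augmentation idea in the abstract, synthesising off-lane viewpoints from a single camera and correcting their steering labels, can be sketched with simple geometry. Everything below (the pure-pursuit-style correction rule, the recovery distance, the wheelbase) is an illustrative assumption, not the paper's actual method.

```python
import math

def augmented_steering_label(base_steer_deg, lateral_offset_m,
                             recovery_dist_m=10.0, wheelbase_m=2.7):
    """Steering label for a synthetically shifted viewpoint.

    Hypothetical pure-pursuit-style rule: steer so the vehicle returns
    to the lane centre within recovery_dist_m. All parameter values are
    illustrative, not taken from the paper.
    """
    # Heading error towards the recovery point on the lane centre
    alpha = math.atan2(lateral_offset_m, recovery_dist_m)
    # Pure-pursuit steering angle for a bicycle model with this wheelbase
    correction_deg = math.degrees(
        math.atan2(2.0 * wheelbase_m * math.sin(alpha), recovery_dist_m))
    return base_steer_deg + correction_deg
```

A frame synthetically shifted off the lane centre receives a label that steers back towards it, while an unshifted frame keeps its recorded label; this is the general shape of the data and label augmentation the abstract describes.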


Cited by 32 publications (31 citation statements). References 13 publications.
“…However, the work failed to extend the temporal analysis and investigate a generic metric describing the robustness of the model. [81] proposed a CNN producing end-to-end lateral control from a single short-range fisheye camera to solve lateral control of the AV. The CNN's performance was compared with an average of multiple independently trained models, referred to as bagging.…”
Section: A Convolutional Neural Network in Autonomous Vehicle Steering
confidence: 99%
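The "bagging" baseline mentioned in the statement above, averaging predictions from several models trained on resampled data, can be sketched as follows. The linear least-squares learners and the synthetic data are stand-ins (assumptions for illustration), not the paper's CNNs.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                    # synthetic input features
w_true = np.array([0.5, -1.0, 0.2, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=200)      # noisy steering targets

# Bagging: train each model on a bootstrap resample of the data,
# then average the models' predictions at inference time.
models = []
for _ in range(5):
    idx = rng.integers(0, len(X), size=len(X))   # sample with replacement
    w, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    models.append(w)

def bagged_predict(x):
    # Ensemble prediction: mean over the individually trained models
    return np.mean([x @ w for w in models], axis=0)

ensemble_pred = bagged_predict(X)
```

Averaging reduces the variance contributed by any single training run, which is why bagging is a natural reference point when evaluating a single end-to-end network.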
“…Table 5 shows different scenarios from various projects depicted in the AV-simulated environment.

Simulation platforms:
[77] Torch 7: scientific computing framework with extensive machine-learning support
[2] CARSIM: provides labeled data for training
[19] TORCS 1.3.7: research platform; runs on Linux; used for car racing; lets users develop their own vehicle controllers without human intervention
[42] TORCS: research platform; runs on Linux; used for car racing; a human driver controls the vehicle in the simulation, enabling data collection
[79] PreScan: physics-based simulation platform used in the automotive industry to develop advanced driver-assistance systems based on sensor technologies such as laser/LiDAR, GPS and cameras
[39] CarND Udacity: flexible software; allows simulating car movement in manual and automatic modes
[18] Gazebo: built with the Robot Operating System on Ubuntu; safety control module; track performance evaluated in software before real-track tests
[83] Udacity: flexible simulation environment
[41] Grand Theft Auto V (GTAV): action-adventure gaming platform for driving
[81] Grand Theft Auto (GTA): video-game platform for driving
[41] Carla: open-source urban driving simulator; provides rich scenes and numerous sensor streams including camera, depth and LiDAR measurements

Driving scenarios:
[77] operated in diverse road conditions (highways, local and residential roads) in rain, sun, fog, day and night, with varied lighting and cloud cover
[2] road course, traffic, wind and screen captures
[19] lane markings in tracks mimic the real world; not all traffic involved; other cars controlled by the game's AI engine created nuisances and instigated crashes
[20] varied, widespread driving conditions
[42] 40-meter-long course with right- and left-hand curves and traffic lights
[79] valet-parking scenario
[39] modeled on one hour of driving in different styles in manual mode; the driver tried not to collide with objects or leave the track; allows modeling styles that can be implemented only on special polygons in the real world
[18] salient features: track boundaries and parts of buildings
[40] based on path following
[76] considers straight paths and daytime driving only.…”
Section: Driving Scenarios for Autonomous Vehicles in Simulated Environments
confidence: 99%
“…We follow the latter, as it allows the system to extract, directly from the input, the features relevant to the control problem. To mimic expert drivers, imitation learning has been applied to a variety of driving tasks, including articulated motion [4], road following [3,5] and obstacle avoidance [6,7]. The control policy is typically trained with supervised learning on expert reference demonstrations.…”
Section: Introduction
confidence: 99%
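The supervised imitation-learning setup described above, fitting a control policy to (observation, expert action) pairs, can be sketched minimally. The synthetic expert (a proportional steering law) and the linear policy standing in for a CNN are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
# Observations: e.g. lateral offset and heading error (synthetic)
obs = rng.uniform(-1.0, 1.0, size=(500, 2))
# Expert steering labels from a hypothetical proportional controller
expert_act = -1.5 * obs[:, 0] - 0.8 * obs[:, 1]

# Behavioural cloning: supervised regression of expert actions on
# observations. A linear policy stands in for the end-to-end CNN.
W, *_ = np.linalg.lstsq(obs, expert_act, rcond=None)

def policy(o):
    # Learned policy: maps an observation to a steering command
    return o @ W
```

Because the training signal is purely supervised, the cloned policy is only as good as the expert's coverage of the state space, which is exactly why the failure-case augmentation discussed elsewhere on this page matters.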