2020 IEEE International Conference on Real-Time Computing and Robotics (RCAR)
DOI: 10.1109/rcar49640.2020.9303034
Development of task-oriented ROS-based Autonomous UGV with 3D Object Detection

Abstract: In scenarios where fire accidents take place, the priority is always human safety and acting swiftly to contain the fire from spreading further. Modern autonomous systems can promise both human safety and rapid action. One such scenario, motivated by urban firefighting, was designed as Challenge 3 of the MBZIRC 2020 competition. In this challenge, UAVs and a UGV collaborate autonomously to detect the fire and quench the flames with water. So, in this project we have developed Robot Ope…

Cited by 7 publications (5 citation statements)
References 12 publications
“…Object detectors in this specific area are the You Only Look Once model (YOLO) and the Single Shot MultiBox Detector (SSD). Three applications used the YOLOv3 [48][49][50]. YOLOv3 shows great adaptability achieving very high performance.…”
Section: Software/methods
Mentioning confidence: 99%
“…In addition, the DroneDeploy [54] and the PIX4D [27] as drone mapping software were used to assist the fire detection procedure. Finally, the open-source Robot Operating System (ROS) [19,49] for drone navigation as well as the Node-Red [29] for programming event-driven applications were also selected.…”
Section: Software/methods
Mentioning confidence: 99%
“…Additional color information is used to detect and localize the bricks. A UGV for Challenge 3 was developed by (Raveendran et al, 2020). Again, a 3D LiDAR sensor is used for SLAM.…”
Section: Related Work
Mentioning confidence: 99%
“…Deep learning object detection algorithms such as You Only Look Once (YOLO), Single Shot Detector (SSD), Faster R-CNN etc have been used in many robotic applications. In [4,5,6,7] object detection using pre-trained Convolutional Neural Networks (CNNs) integrated with visual SLAM to enhance the robotic capabilities. In this paper, three commonly used Lidar-based SLAM algorithms are evaluated in simulation environment.…”
Section: Introduction
Mentioning confidence: 99%