When a fire breaks out, the priorities are human safety and acting swiftly to keep the fire from spreading. Modern autonomous systems promise both: they keep humans out of danger and can act rapidly. One such scenario, motivated by urban firefighting, was designed as Challenge 3 of the MBZIRC 2020 competition, in which UAVs and a UGV collaborate autonomously to detect fires and quench the flames with water. In this project we developed a Robot Operating System (ROS)-based autonomous system to address the UGV portion of the challenge: it detects the targeted objects in real time (in our case, simulated fires and red-colored softballs), localizes them as markers in the map, and navigates autonomously to each target. The work has two parts: first, mapping and localizing the fires and softballs in a highly cluttered environment, and second, reaching those targets autonomously. Robust mapping with adequate sensors and target detection with a well-trained CNN-based network are the keys to localizing the targeted objects in such cluttered environments.
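A minimal sketch of the detect-localize-navigate loop described above, assuming a standard ROS 1 setup with rospy, RViz markers, and the move_base navigation stack; the topic names, the "map" frame, and the hard-coded target list stand in for the output of the CNN detector and are purely illustrative.

```python
#!/usr/bin/env python
# Hypothetical sketch: publish each localized target (fire / softball) as an
# RViz marker in the map frame, then send the UGV there via move_base.
import rospy
import actionlib
from visualization_msgs.msg import Marker
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal


def publish_target_marker(pub, target_id, x, y):
    """Visualize one localized target as a red sphere marker in the map frame."""
    m = Marker()
    m.header.frame_id = "map"
    m.header.stamp = rospy.Time.now()
    m.ns = "targets"
    m.id = target_id
    m.type = Marker.SPHERE
    m.action = Marker.ADD
    m.pose.position.x = x
    m.pose.position.y = y
    m.pose.orientation.w = 1.0
    m.scale.x = m.scale.y = m.scale.z = 0.3
    m.color.r, m.color.a = 1.0, 1.0  # opaque red
    pub.publish(m)


def navigate_to(client, x, y):
    """Send one map-frame goal to move_base and block until it finishes."""
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0
    client.send_goal(goal)
    client.wait_for_result()


if __name__ == "__main__":
    rospy.init_node("target_navigator")
    marker_pub = rospy.Publisher("target_markers", Marker, queue_size=10)
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()
    # Assumed (id, x, y) targets produced by the CNN detector and mapping stage.
    for tid, x, y in [(0, 2.0, 1.5), (1, 4.2, -0.8)]:
        publish_target_marker(marker_pub, tid, x, y)
        navigate_to(client, x, y)
```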
Semantic segmentation applied directly to images of landfills can enable earth movers to segregate garbage autonomously. Various garbage-segregation methods exist, such as IoT-based waste segregation and conveyor-belt segregation, but none of them operate directly on landfills. Semantic segmentation is one of the key tasks on the path toward complete scene understanding. The aim of this paper is to present a smart garbage-segregation method using semantic segmentation with the DeepLab V3+ model and an Xception-65 backbone, achieving a mean accuracy of 75.01%. The segmentation is performed on the GarbotV1 dataset, whose major classes include Plastic, Cardboard, Wood, Metal, and Sponge. The paper also contributes a method for reconstructing the segmented images into a 3D map, which allows earth-moving vehicles to navigate autonomously by localizing the segmented objects.
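An illustrative inference sketch for this kind of pipeline. The paper uses DeepLab V3+ with an Xception-65 backbone trained on GarbotV1; that exact model is not bundled with torchvision, so this sketch substitutes the off-the-shelf DeepLabV3 ResNet-50 variant and a hypothetical input image purely to show how per-pixel class maps are obtained.

```python
# Not the authors' model: torchvision's DeepLabV3 (ResNet-50 backbone) used as
# a stand-in to illustrate semantic-segmentation inference.
import torch
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50
from PIL import Image

model = deeplabv3_resnet50(pretrained=True).eval()

preprocess = transforms.Compose([
    transforms.Resize(513),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("landfill_scene.jpg").convert("RGB")  # hypothetical input
batch = preprocess(image).unsqueeze(0)

with torch.no_grad():
    logits = model(batch)["out"]        # shape: (1, num_classes, H, W)
class_map = logits.argmax(dim=1)[0]     # per-pixel class indices
```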
In this paper, we implement a quadcopter assembly with a control and navigation module. The project also includes the design of a control panel for the operator, consisting of a microcontroller and a glove equipped with sensors and buttons. The panel has a touch screen that displays current parameters of the vehicle, including its orientation and geographical coordinates. The quadcopter control concept is based on the movement of the operator's hand. In addition, we have included object detection for detecting objects from the quadcopter's point of view. To detect an object, we need some idea of where it may be and how the image is divided into segments. This creates a kind of chicken-and-egg problem: we can recognize the shape (and class) of an object if we know its location, and we can recognize its location if we know its shape. Some visual features, such as clothing and a human face, can belong to the same subject, but this is difficult to establish without recognizing the object first.
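A minimal sketch of how a modern detector handles that location-versus-class coupling on a single camera frame: it proposes candidate regions and classifies them jointly in one forward pass. The detector (Faster R-CNN from torchvision), the frame filename, and the confidence threshold are assumptions for illustration, not the system used in the paper.

```python
# Illustrative stand-in: region-proposal-based detection on one quadcopter frame.
import torch
from torchvision import transforms
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from PIL import Image

detector = fasterrcnn_resnet50_fpn(pretrained=True).eval()

frame = Image.open("quadcopter_frame.jpg").convert("RGB")  # hypothetical frame
tensor = transforms.ToTensor()(frame)

with torch.no_grad():
    detections = detector([tensor])[0]   # dict with boxes, labels, scores

keep = detections["scores"] > 0.5        # simple confidence threshold
boxes = detections["boxes"][keep]        # object locations
labels = detections["labels"][keep]      # object classes, produced jointly
```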
Lightweight, affordable spherical cameras can be used in mobile robotics to build a 3D map of a robot's working environment. This paper demonstrates how to use optical flow between two spherical images to quickly construct a semantically meaningful 3D model spanning all directions. The main contribution of the paper is the use of semantic segmentation to increase the robustness of the reconstruction method in both indoor and outdoor applications.
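A minimal sketch of the two ingredients combined above, not the authors' pipeline: dense optical flow between two equirectangular frames via OpenCV's Farneback method, with a hypothetical per-pixel semantic label map used to keep only classes assumed stable for reconstruction. The filenames, label file, and the choice of "stable" class IDs are assumptions.

```python
import cv2
import numpy as np

# Two consecutive equirectangular frames (assumed files), loaded as grayscale.
prev_img = cv2.imread("sphere_frame_0.png", cv2.IMREAD_GRAYSCALE)
next_img = cv2.imread("sphere_frame_1.png", cv2.IMREAD_GRAYSCALE)

# Dense per-pixel flow field, shape (H, W, 2).
flow = cv2.calcOpticalFlowFarneback(
    prev_img, next_img, None,
    pyr_scale=0.5, levels=3, winsize=15,
    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

# Hypothetical semantic labels aligned with prev_img; keep only classes
# considered reliable for reconstruction (e.g. buildings = 1, ground = 2).
labels = np.load("sphere_frame_0_labels.npy")
stable = np.isin(labels, [1, 2])
masked_flow = np.where(stable[..., None], flow, np.nan)
```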