Keywords: positioning, smart trash can robot, intelligent path planning

People have become busier in the modern world with both work and housework. In addition to the various types of robotic cleaners, it would be quite helpful to have a smart robotic trash collection and dumping system that provides on-call service, so that a user does not need to physically get up and put trash into a trash can. When the bin reaches its maximum capacity, the system also dumps the trash automatically without the user's instruction. Available smart trash cans usually focus on determining whether the trash is recyclable, and most of them cannot move autonomously. In this research, we use fingerprint mapping for wireless indoor positioning to implement an autonomous vehicle with a mounted trash can. The user can summon the trash-can-mounted vehicle indoors via a mobile application under an IoT and cloud computing environment. The vehicle positions itself in front of the user for trash collection, navigating with automatic obstacle avoidance and smart path planning based on deep learning. The system also monitors the amount of accumulated trash and dumps it at a fixed location before returning to the start point. This research on a smart trash-collecting robot can provide significant assistance to people who are busy, those with impaired or limited mobility, and the elderly.
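As a rough illustration of how RSSI fingerprint mapping can drive indoor positioning, the following Python sketch matches a live Wi-Fi scan against an offline fingerprint database using weighted k-nearest neighbors. The access point names, RSSI values, and reference coordinates are illustrative assumptions, and the abstract does not specify which matching method the authors use; this is one common choice, not their implementation.

```python
import math

# Hypothetical offline fingerprint database: reference point (x, y) in meters -> mean RSSI (dBm)
# per access point. Values and AP names are illustrative, not taken from the paper.
FINGERPRINT_DB = {
    (0.0, 0.0): {"AP1": -45, "AP2": -70, "AP3": -62},
    (2.0, 0.0): {"AP1": -52, "AP2": -61, "AP3": -66},
    (0.0, 2.0): {"AP1": -60, "AP2": -55, "AP3": -58},
    (2.0, 2.0): {"AP1": -67, "AP2": -49, "AP3": -53},
}

def estimate_position(scan, k=3):
    """Weighted k-nearest-neighbor match of a live RSSI scan against the fingerprint map."""
    distances = []
    for point, fingerprint in FINGERPRINT_DB.items():
        shared = set(scan) & set(fingerprint)
        if not shared:
            continue
        # Euclidean distance in RSSI space over the access points seen in both.
        d = math.sqrt(sum((scan[ap] - fingerprint[ap]) ** 2 for ap in shared))
        distances.append((d, point))
    distances.sort()
    nearest = distances[:k]
    # Inverse-distance weighting of the k closest reference points.
    weights = [1.0 / (d + 1e-6) for d, _ in nearest]
    total = sum(weights)
    x = sum(w * p[0] for w, (_, p) in zip(weights, nearest)) / total
    y = sum(w * p[1] for w, (_, p) in zip(weights, nearest)) / total
    return (x, y)

# A live scan near the first reference point should yield coordinates close to (0, 0).
print(estimate_position({"AP1": -50, "AP2": -63, "AP3": -64}))
```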
This study uses two types of unmanned vehicles (aerial vehicles and ground vehicles) to implement multi-machine cooperation so that assigned tasks can be completed quickly. The unmanned aerial/ground vehicles can call each other and send instant inquiry messages using the proposed cooperative communication protocol, handing tasks over between them and executing efficient three-dimensional collaborative operations in time. The study demonstrates integrating unmanned aerial/ground vehicles into a group through a control platform (i.e., an app operation interface) that uses the Internet of Things. Pilots can therefore make decisions and communicate through the app for cooperative coordination, allowing a group of unmanned aerial/ground vehicles to complete tasks flexibly. In addition, the payload attached to the unmanned aerial/ground vehicles enables multipurpose monitoring, including face recognition, gas detection, thermal imaging, and video recording. In the unmanned aerial vehicle experiment, the vehicle plans its flight path and records its movement trajectory with the Global Positioning System while on duty. As a result, the accuracy of the planned flight path reached 86.89% on average.
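The abstract reports an average planned-flight-path accuracy of 86.89% but does not define the metric. One plausible way to obtain such a figure is the fraction of GPS-recorded trajectory points that fall within a tolerance of the planned polyline; the Python sketch below shows that calculation with made-up waypoints and a hypothetical 2 m tolerance. This is an assumption about the metric, not the paper's stated method.

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to line segment a-b (planar approximation)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def path_accuracy(recorded, planned, tolerance_m=2.0):
    """Fraction of recorded GPS points lying within `tolerance_m` of the planned polyline."""
    hits = 0
    for p in recorded:
        d = min(point_segment_distance(p, planned[i], planned[i + 1])
                for i in range(len(planned) - 1))
        if d <= tolerance_m:
            hits += 1
    return hits / len(recorded)

# Illustrative waypoints and recorded fixes in local metric coordinates (not from the paper).
planned = [(0, 0), (10, 0), (10, 10)]
recorded = [(0.2, 0.4), (4.8, 1.1), (7.0, 3.5), (10.3, 6.2), (9.8, 9.6)]
print(f"{path_accuracy(recorded, planned):.2%}")
```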
Even with visual equipment such as cameras, an unmanned ground vehicle (UGV) alone usually takes a long time to navigate an unfamiliar area with obstacles. Therefore, this study proposes a fast drone-aided path planning approach to help UGVs traverse such areas. In this scenario, called UAV/UGV mobile collaboration (abbreviated UAGVMC), a UGV first summons an unmanned aerial vehicle (UAV) to the scene to take a ground image and send it back to the cloud for object detection, image recognition, and path planning (abbreviated ODIRPP). The cloud then sends the UGV a well-planned path map to help it traverse the unfamiliar area. The approach uses the one-stage object detection and image recognition algorithm YOLOv4-CSP to quickly and accurately identify obstacles, and the New Bidirectional A* (NBA*) algorithm to plan an optimal route avoiding ground objects. Experiments show that the execution time of path planning for each scene is less than 10 s on average, without degrading the image quality of the path map, so the user can correctly interpret the map and remotely drive the UGV rapidly through the unfamiliar area with obstacles. As a result, the selected model significantly outperforms the alternatives, with an average performance ratio of up to 3.87 times.
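To illustrate the planning stage of the ODIRPP pipeline, the sketch below rasterizes detected obstacle bounding boxes into an occupancy grid and plans a route with plain A*. The paper uses the New Bidirectional A* (NBA*) algorithm, so this stand-in only conveys the general idea of searching a grid built from detection results; the grid size, box coordinates, and start/goal cells are invented for the example.

```python
import heapq

def boxes_to_grid(width, height, boxes):
    """Rasterize detected bounding boxes (x0, y0, x1, y1) into an occupancy grid."""
    grid = [[0] * width for _ in range(height)]
    for x0, y0, x1, y1 in boxes:
        for y in range(max(0, y0), min(height, y1 + 1)):
            for x in range(max(0, x0), min(width, x1 + 1)):
                grid[y][x] = 1
    return grid

def astar(grid, start, goal):
    """Plain A* on a 4-connected grid; returns the path as a list of (x, y) cells."""
    w, h = len(grid[0]), len(grid)
    open_set = [(0, start)]
    g = {start: 0}
    parent = {start: None}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        x, y = cur
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] == 0:
                ng = g[cur] + 1
                if ng < g.get((nx, ny), float("inf")):
                    g[(nx, ny)] = ng
                    parent[(nx, ny)] = cur
                    f = ng + abs(goal[0] - nx) + abs(goal[1] - ny)  # Manhattan heuristic
                    heapq.heappush(open_set, (f, (nx, ny)))
    return None  # no route around the obstacles

# Two hypothetical obstacle boxes on a 20 x 10 grid; plan from the left edge to the right edge.
grid = boxes_to_grid(20, 10, [(5, 2, 8, 7), (12, 0, 14, 5)])
print(astar(grid, (0, 5), (19, 5)))
```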
In recent years, the integration of aerial vehicles and ground vehicles has been a technological development with huge potential. However, making both types of vehicles work together concurrently is a major challenge. We have therefore developed a novel cooperative communication protocol to efficiently implement collaborative operation between an unmanned aerial vehicle (UAV) and an unmanned ground vehicle (UGV). Through position sensing and time sensing, the UAV and UGV achieve autonomous control by sending commands to each other. For example, the UGV can instantly hand over to the UAV a sensing-related task such as gas detection or obtaining an aerial view of the ground, and the user can track the UAV's flight trajectory through position and time sensing to determine whether the UAV has completed the task. Similarly, the UAV can send commands to the UGV to carry out sensing-related tasks such as face recognition and infrared thermal imaging of people on the ground. The UAV and UGV are equipped not only with gas and camera sensors but also with built-in position sensing and time sensing functions. Experiments on collaborative UAV/UGV operation show that the proposed approach enables the group deployment of unmanned vehicles to successfully execute three-dimensional collaborative operations, operated via an app without a ground control station. As a result, the proposed cooperative communication protocol, through position sensing and time sensing, efficiently realizes the IoT-connected group deployment of unmanned vehicles with sensing units.
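The abstract does not give the protocol's message format, so the sketch below is only a guess at what a position- and time-stamped task handover between a UGV and a UAV might look like: a JSON-encoded message carrying sender, receiver, task, GPS position, and a timestamp, plus an acknowledgement that echoes the position with a completion time. All field names, task labels, and coordinates are hypothetical, not the authors' specification.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class TaskMessage:
    sender: str        # e.g. "UGV-1" or "UAV-1"
    receiver: str
    task: str          # e.g. "gas_detection", "face_recognition", "thermal_imaging"
    lat: float         # position sensing: where the task should be performed
    lon: float
    alt: float
    timestamp: float   # time sensing: when the message was issued

    def encode(self) -> bytes:
        """Serialize the message as JSON for transmission over the IoT link."""
        return json.dumps(asdict(self)).encode("utf-8")

def handover(sender, receiver, task, lat, lon, alt):
    """Build a task-handover request stamped with the current time."""
    return TaskMessage(sender, receiver, task, lat, lon, alt, time.time())

def acknowledge(request):
    """Reply confirming the task, echoing the position and adding a completion timestamp."""
    return TaskMessage(request.receiver, request.sender, request.task + "_done",
                       request.lat, request.lon, request.alt, time.time())

# The UGV asks the UAV for an aerial view above its own GPS fix (illustrative coordinates).
request = handover("UGV-1", "UAV-1", "aerial_view", 24.9936, 121.3010, 30.0)
print(request.encode())
print(acknowledge(request).encode())
```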