Under fire conditions, hot toxic smoke and unknown hazards make detection and reconnaissance critical for avoiding casualties. A fire reconnaissance robot was therefore developed to assist by providing key fire information to firefighters. The robot consists of three main systems: a display and operating system, a video surveillance system, and a mapping and positioning navigation system. Augmented reality (AR) goggles were integrated with the display and operating system to free firefighters' hands, enabling them to focus on rescue operations rather than on operating the system. To cope with smoke interference, a thermal imaging video surveillance system was included to extract information from complex fire environments. Meanwhile, simultaneous localization and mapping (SLAM) technology was adopted in the mapping and positioning navigation system to build a real-time map under rapidly changing fire conditions and guide firefighters to fire sources or trapped occupants. Experiments showed that all tested system components work well under fire conditions: the video surveillance system produces clear images in dense smoke and high-temperature environments, SLAM achieves an average mapping error of less than 3.43%, the positioning error is 0.31 m, and the maximum navigation error is 3.48%. The developed fire reconnaissance robot provides a practical platform for improving fire rescue efficiency and reducing firefighter casualties.
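The abstract does not state how the percentage errors were computed; a plausible reading is an average relative error between dimensions measured on the SLAM map and ground-truth measurements. A minimal sketch of that metric in Python follows, with purely illustrative sample numbers rather than the authors' data:

```python
def average_relative_error(mapped, reference):
    """Mean relative error (%) between map-derived dimensions and
    ground-truth dimensions (e.g. wall lengths or corridor widths)."""
    errors = [abs(m - r) / r * 100.0 for m, r in zip(mapped, reference)]
    return sum(errors) / len(errors)

# Hypothetical example: three corridor lengths read from the SLAM map
# versus tape-measured ground truth, in metres.
print(average_relative_error([10.2, 5.05, 8.1], [10.0, 5.0, 8.0]))
```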
Light detection and ranging (LiDAR) is a popular technology for acquiring critical information for building information modelling. For automatic acquisition of building information, the first and most important step is to accurately determine the attitude information provided by micro-electromechanical systems (MEMS) based inertial measurement unit (IMU) sensors on the moving robot. However, during practical building mapping, serious errors can arise from improper installation of the MEMS-IMU. In this study, we analyzed the systematic errors that occur during building mapping with a robot equipped with a MEMS-IMU, including biases, scale factor errors, and axial installation deviation, and developed an error calibration model on this basis. The deviation between the calibrated plane and the horizontal plane was addressed by a new sampling method: the calibrated plane was rotated twice, the gravity acceleration measured on the six faces of the MEMS-IMU was calibrated against reference values, and the calibration was completed by solving the developed model with the least-squares method. Finally, building mapping was calibrated using the error calibration model together with the Gmapping algorithm. Experiments indicate that the proposed model is effective for error calibration, improving yaw estimation accuracy by 1–2° with the MEMS-IMU, and that the resulting maps are more accurate than those of previous methods. These outcomes provide a practical basis for constructing building information models.
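The abstract describes a six-face gravity calibration solved by least squares but gives no equations. A minimal sketch of one common formulation, assuming an affine error model (scale, misalignment, and bias combined into a matrix and an offset) fitted from six static orientations; the function names and the exact model are illustrative assumptions, not the paper's published method:

```python
import numpy as np

G = 9.80665  # standard gravity (m/s^2)

# Reference gravity vectors for the six static orientations:
# each sensor axis pointing up and then down (+g / -g on one axis).
REFS = np.array([
    [+G, 0, 0], [-G, 0, 0],
    [0, +G, 0], [0, -G, 0],
    [0, 0, +G], [0, 0, -G],
], dtype=float)

def calibrate_six_position(meas):
    """Fit an affine error model  a_ref ≈ M @ a_meas + b  by least squares.

    meas: (6, 3) array of time-averaged accelerometer readings, one row
          per static orientation, in the same order as REFS.
    Returns (M, b): a 3x3 matrix absorbing scale factors and axis
    misalignment, and a 3-vector of biases.
    """
    meas = np.asarray(meas, dtype=float)
    # Augment measurements with a constant column so the bias is
    # estimated jointly with the scale/misalignment terms.
    A = np.hstack([meas, np.ones((meas.shape[0], 1))])   # shape (6, 4)
    # Solve A @ X ≈ REFS for X (4x3) in the least-squares sense.
    X, *_ = np.linalg.lstsq(A, REFS, rcond=None)
    M, b = X[:3].T, X[3]
    return M, b

def correct(M, b, raw):
    """Apply the fitted model to raw accelerometer samples of shape (N, 3)."""
    return raw @ M.T + b
```

Corrected accelerometer output from such a model would then feed the attitude (yaw) estimate used alongside Gmapping; the coupling to the mapping pipeline is outside this sketch.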