Conventional and electronic canes are among the tools most often used by blind persons for independent mobility. Both tools share a limitation: they cannot provide a sense of comfort in the form of information on the distance between two blind persons. In this research, electronic hardware integrated with a conventional cane is designed and built. The design and aid are made to exchange position and distance information between blind persons under normal and emergency conditions. Each of the two electronic devices consists of a global positioning system (GPS) module, a LoRa radio, a DFPlayer Mini audio module, a liquid crystal display (LCD), a push button, and a microcontroller. Communication between the devices uses a point-to-point topology with half-duplex communication. Each device is programmed with a source ID and a destination ID; when the push button is pressed, the microcontroller reads coordinate data from the GPS module. The coordinate data combined with the source ID is transmitted via LoRa radio to the destination ID, and the two devices exchange coordinate data in turn. Each microcontroller then converts the two sets of coordinates into distance information (meters) using the Haversine algorithm. The distance information is displayed on the LCD and converted into sound output through the DFPlayer Mini speaker. In system functionality testing, the average percentage error was 1.21%, and all resulting distance readings were voiced. Testing of the GPS module showed coordinate conformity with 99% accuracy. Testing of the data communication system established a maximum range of 500 meters for the LoRa module. The received signal strength indicator (RSSI) parameter showed no effect over the distances tested, while the receiver's packet loss parameter was strongly affected when the signal path was blocked by an object (obstacle).
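The distance conversion described above uses the Haversine formula, which gives the great-circle distance between two latitude/longitude points. A minimal sketch in Python (the abstract names the algorithm but not the implementation; the function name and Earth-radius constant here are illustrative assumptions):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates
    (decimal degrees), using the Haversine formula."""
    R = 6371000.0  # mean Earth radius in meters (assumed constant)
    phi1 = math.radians(lat1)
    phi2 = math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))
```

On the devices described, each microcontroller would apply such a function to its own GPS fix and the coordinates received over LoRa, then pass the resulting meter value to the LCD and audio output.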
Many tools to assist blind people with mobility on pedestrian pathways have been launched, approved, and patented. However, a remaining shortcoming is that they can be used only for pedestrian paths or nearby destinations. In this study, both a camera (for pedestrian path detection) and a LiDAR sensor (for detection of surrounding objects) are used to assist the mobility of people with disabilities. In the first stage, image data from the camera is preprocessed by conversion from RGB to XYZ color space, color filtering, morphological closing, and resizing, followed by neural network training and testing. This stage produces three voice cues describing the user's attitude toward the yellow pedestrian path: perpendicular, tilted left, tilted right, or not aligned with the path. In the second stage, the LiDAR distance points are processed into a 2D array geometry, again followed by neural network training and testing. This stage produces eight voice cues, detecting the direction and distance of objects: right, front, left, right-front, right-left, front-left, right-front-left, or not captured. Test results at illuminance below 15,000 lux achieved 89.7% accuracy for pedestrian path detection and 87.5% for object detection.
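The second stage converts LiDAR range readings into a 2D array suitable as neural network input. The abstract does not give the projection details, so the following is only a plausible sketch under stated assumptions: readings arrive as (angle in degrees, distance in meters) pairs, and they are rasterized into a fixed-size occupancy grid centered on the sensor (grid size, maximum range, and function name are all hypothetical):

```python
import math

def lidar_to_grid(scan, grid_size=32, max_range=4.0):
    """Project (angle_deg, distance_m) LiDAR points into a square
    occupancy grid centered on the sensor. Cells hit by a return are 1,
    all others 0. Assumes angle 0 points along +x and angles grow CCW."""
    grid = [[0] * grid_size for _ in range(grid_size)]
    half = grid_size // 2
    cell = max_range / half  # meters covered by one grid cell
    for angle_deg, dist in scan:
        if dist <= 0 or dist > max_range:
            continue  # discard invalid or out-of-range returns
        rad = math.radians(angle_deg)
        x = dist * math.cos(rad)
        y = dist * math.sin(rad)
        # map metric coordinates to row/column, clamped to the grid
        col = min(grid_size - 1, max(0, int(half + x / cell)))
        row = min(grid_size - 1, max(0, int(half - y / cell)))
        grid[row][col] = 1
    return grid
```

A classifier could then be trained on such grids to emit the eight directional cues (right, front, left, and their combinations, or "not captured") described above.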