Purpose
The purpose of this study is to propose a method for vulnerable pedestrians to visualize potential obstacles on sidewalks. In recent years, the number of vulnerable pedestrians has increased as Japanese society has aged, and the number of wheelchair users is expected to grow further. Currently, barrier-free maps and street-view applications allow wheelchair users to check possible routes and the surroundings of their destinations in advance. However, identifying physical barriers that threaten vulnerable pedestrians en route is often difficult.

Design/methodology/approach
This study uses photogrammetry to create a digital twin of the three-dimensional (3D) geometry of an existing walking space from photographic images taken on sidewalks. This approach produces high-resolution digital elevation models of the entire physical sidewalk surface, from which physical barriers such as local gradients and height differences can be detected by uniform image filtering. The method can be combined with a Web-based data visualization tool in a geographical information system, permitting first-person views of the ground and accurate geolocation of the barriers on the map.

Findings
The findings showed that capturing the road surface with a small wide-angle camera while walking is sufficient for recording subtle 3D undulations in the road surface. The data-capture method and the precision of the 3D reconstruction results are described.

Originality/value
The proposed approach demonstrates the benefits of creating a digital twin of a walking space using photogrammetry as a cost-effective means of acquiring 3D data accurate enough to reveal the detailed geometric features needed to navigate a walking space safely. Further, the findings showed how this information can be provided directly to users through two-dimensional (2D) and 3D Web-based visualizations.
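The barrier-detection step described in the abstract (flagging local gradients in a digital elevation model by uniform image filtering) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the DEM array, cell size, and slope threshold are all hypothetical values chosen for the example.

```python
import numpy as np

def detect_barriers(dem, cell_size=0.01, slope_limit=0.08):
    """Flag DEM cells whose local slope exceeds a limit.

    dem        : 2-D array of surface heights in metres (assumed input)
    cell_size  : grid spacing in metres (assumption: 1 cm cells)
    slope_limit: maximum tolerable grade (assumption: 8%)
    """
    # Central-difference gradients along both axes, in metres per metre.
    gy, gx = np.gradient(dem, cell_size)
    slope = np.hypot(gx, gy)
    return slope > slope_limit

# Example: a flat surface with a 3 cm height difference across one row,
# i.e. a small curb-like step that would obstruct a wheelchair.
dem = np.zeros((10, 10))
dem[5:, :] += 0.03
mask = detect_barriers(dem)
```

Cells adjacent to the step are flagged, while the flat regions are not; in a real pipeline the flagged cells would then be geolocated and drawn onto the map layer.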
In this paper, we investigate a blind watermarking algorithm based on a high-pass filter for three-dimensional (3-D) meshes. To improve watermark detection in correlation-based watermarking, our scheme employs a high-pass filter that emphasizes an impulse signal embedded as a signature in a host mesh. In the proposed method, we align the host mesh by principal component analysis and convert it from orthogonal to polar coordinates. After this preprocessing, we map the 3-D data onto a 2-D space via block segmentation and averaging, and rearrange the 2-D data into a 1-D sequence. In the 1-D space, we apply a complex smear transform and a high-pass filter. From the resulting signal, we derive the optimum complex-valued impulse signal in terms of the Euclidean norm. To generate a watermark with desirable properties, similar to a pseudonoise signal, we apply a complex desmear transform, the inverse system of the complex smear transform, to the complex-valued impulse signal. After reordering into the 2-D signal and mapping from the 2-D space back to 3-D, the watermark is embedded into the host mesh and the resulting mesh is converted back to orthogonal coordinates. At the decoder, we apply the inverse process with the high-pass filter to stego meshes and detect the position of the maximum value as the signature. Detection rates for a 3-D Bunny model are presented to evaluate the performance of the proposed algorithm.
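The core detection idea (a signature impulse desmeared into a noise-like watermark, then recovered by smearing plus high-pass filtering and taking the argmax) can be sketched in 1-D as follows. This is a hedged toy model, not the authors' exact operators: the all-pass random-phase smear transform, the brick-wall high-pass filter, and the smooth stand-in host are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256

# Smooth (lowpass) stand-in for the 1-D sequence mapped from the mesh.
t = np.arange(N)
host = 3 * np.cos(2 * np.pi * 2 * t / N) + 2 * np.sin(2 * np.pi * 5 * t / N)

# Smear transform: all-pass filter with random phases. Deriving the
# phases from the FFT of a real sequence keeps them conjugate-symmetric,
# so real inputs give (numerically) real outputs; desmear is the inverse.
p = np.fft.fft(rng.normal(size=N))
phases = p / np.abs(p)
smear = lambda x: np.fft.ifft(np.fft.fft(x) * phases)
desmear = lambda x: np.fft.ifft(np.fft.fft(x) / phases)

# Signature: an impulse at a secret position, desmeared into a
# pseudonoise-like watermark before embedding into the host.
pos, amp = 100, 8.0
impulse = np.zeros(N)
impulse[pos] = amp
stego = host + desmear(impulse).real

# Detection: smearing the stego sequence restores the impulse on top of
# the smeared host; a high-pass filter (zeroing the low-frequency bins)
# suppresses the lowpass host energy and emphasizes the broadband
# impulse, whose position is read off as the signature.
K = 16
k = np.arange(N)
highpass = (np.minimum(k, N - k) >= K).astype(float)
filtered = np.fft.ifft(np.fft.fft(smear(stego)) * highpass)
detected = int(np.argmax(np.abs(filtered)))
```

Because the all-pass smear preserves the magnitude spectrum, the smeared host stays lowpass and is almost entirely removed by the filter, so the argmax lands on the embedded position; the real scheme operates on block-averaged mesh coordinates and uses an optimized impulse rather than this fixed one.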