We present a wireless inertial measurement unit (WIMU)-based hand motion analysis technique for handwriting recognition in three-dimensional (3D) space. The proposed handwriting recognition system imposes no constraints on writing surface or posture; users have the freedom and flexibility to write characters in free space. It uses hand motion analysis to segment hand motion data from a WIMU device that incorporates magnetic, angular rate, and gravity (MARG) sensors, together with a sensor fusion algorithm, to automatically distinguish segments that represent handwriting from nonhandwriting data in continuous hand motion data. A dynamic time warping (DTW) recognition algorithm is used to recognize handwriting in real time. We demonstrate that a user can write freely in the air using an intuitive WIMU as an input and hand motion analysis device, and that the handwriting can be recognized in 3D space. The experimental results for recognizing handwriting in free space show that the proposed method is effective and efficient and can be extended to other natural interaction applications, such as computer games and real-time hand gesture recognition.
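As an illustration of how a DTW-based recognizer can compare a segmented hand-motion stroke against stored character templates, the following minimal Python sketch computes the standard DTW distance over feature sequences and labels a segment by its nearest template. The feature representation, template set, and all names are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two feature sequences.

    seq_a, seq_b: arrays of shape (length, dim), e.g. per-sample
    acceleration or orientation features from a segmented stroke.
    """
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify(segment, templates):
    """Label a handwriting segment by its nearest DTW template.

    templates: dict mapping a character label to a template sequence.
    """
    return min(templates, key=lambda label: dtw_distance(segment, templates[label]))
```

In practice a locality constraint such as a Sakoe-Chiba band is often added to keep the quadratic cost of DTW manageable for real-time use.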
In this paper, we present an inertial sensor-based touch-and-shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six-degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity (MARG) sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate this gesture expressivity using an interactive, flexible gesture-mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking the user's dynamic hand gestures. The system synthesizes stylistic variations of the 3D virtual avatar's motion, producing motions that are not present in the motion database, from hand gesture sequences captured by a single inertial motion sensor.
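The abstract does not give the filter equations; the following is a minimal Mahony-style quaternion complementary filter sketch in Python, assuming gyroscope rates in rad/s and an accelerometer that approximates the gravity direction when the hand is not accelerating strongly. The gain value and all names are illustrative, not from the paper.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2])

def rotate_inv(q, v):
    """Rotate a world-frame vector v into the sensor frame using q (sensor-to-world)."""
    w, x, y, z = q
    qc = np.array([w, -x, -y, -z])
    p = np.array([0.0, *v])
    return quat_mul(quat_mul(qc, p), q)[1:]

def complementary_update(q, gyro, accel, dt, kp=1.0):
    """One complementary-filter step: integrate the gyro and nudge the
    estimate toward the accelerometer's gravity direction to cancel drift."""
    a = np.asarray(accel, float)
    a = a / np.linalg.norm(a)                            # measured gravity direction (body frame)
    g_pred = rotate_inv(q, np.array([0.0, 0.0, 1.0]))    # gravity predicted by current estimate
    err = np.cross(a, g_pred)                            # small-angle correction (body frame)
    omega = np.asarray(gyro, float) + kp * err           # drift-corrected angular rate
    dq = 0.5 * quat_mul(q, np.array([0.0, *omega]))      # dq/dt = 0.5 * q (x) (0, w)
    q = q + dt * dq
    return q / np.linalg.norm(q)
```

This sketch corrects tilt only; a full MARG implementation would add an analogous magnetometer term to also correct heading drift.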
A spatial modelling system with a new interface optimized for the conceptual phase of automotive design is proposed. In the initial sketch stage, which sets the volume and proportions of the car, the automotive styling is created on a two-dimensional sketch plane with accurate input and a designer-friendly interface. The character lines, which have important implications in car design, are drawn with curve-based oversketching, and the shape of the car is modified using area-based modification. The completed two-dimensional model is then converted, via its character lines, into a three-dimensional car model. In the three-dimensional spatial oversketch stage, the edges and mesh of the surface are deformed while the designer views the three-dimensional shape of the vehicle from various viewpoints using a spatial input device and a stereoscopic display. Edge-based deformation modifies single or multiple edges while maintaining continuity between the surfaces, and a sculpting technique is applied to create special shapes such as accent lines and strake lines. The result is a sketch system that designers can use to perform three-dimensional spatial modelling.
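The abstract does not detail the deformation method. As a rough indication of how edge-based deformation with surface continuity might look, the following Python sketch translates the vertices of a picked character-line edge and propagates a damped displacement to their one-ring neighbors; the data structures, falloff scheme, and names are all assumptions for illustration.

```python
import numpy as np

def deform_edge(vertices, edge_vertex_ids, offset, neighbors, falloff=0.5):
    """Edge-based deformation sketch: translate the selected edge vertices
    by `offset` and give their one-ring neighbors a damped displacement so
    adjacent surface patches stay continuous.

    vertices:        (N, 3) array of mesh vertex positions
    edge_vertex_ids: indices of the vertices on the picked character line
    neighbors:       dict mapping a vertex id to its one-ring neighbor ids
    """
    v = vertices.copy()
    offset = np.asarray(offset, float)
    for i in edge_vertex_ids:
        v[i] += offset                      # move the edge itself
        for j in neighbors.get(i, []):      # soften the transition around it
            if j not in edge_vertex_ids:
                v[j] += falloff * offset
    return v
```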
We present a low-cost, battery-powered, six-degree-of-freedom wireless wand for 3D modeling in free space based on the fusion of tri-axis magnetic, angular rate, and gravity (MARG) sensors with a vision sensor. Our approach has two stages of sensor fusion, each with a different algorithm for finding 3D orientation and position. The first-stage fusion algorithm, a complementary filter, uses the MARG sensors to compute 3D orientation relative to the direction of gravity and the earth's magnetic field in quaternion form, with compensation for magnetic distortion. The second-stage fusion algorithm, a Kalman filter, uses accelerometer data and IR marker velocity to compute 3D position. To compute the IR marker's linear velocity along the optical axis (the z-axis), we present a simple and efficient image-based technique that estimates the distance of the object from the camera using the blob area, in pixels, in the image. Our combined inside-in and outside-in fusion approach efficiently handles short-term occlusion, the need for frequent calibration, and the unbounded drift involved in numerically integrating inertial sensor data, and it increases the degrees of freedom at low cost without compromising accuracy. The results are compared with a leading commercial magnetic motion tracking system to demonstrate the performance of the wand.
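The exact image-based formula is not given in the abstract; under a pinhole-camera assumption the projected blob area of the IR marker scales with the inverse square of its distance, so a one-time calibration pair (reference area, reference depth) is enough to recover depth and, by differencing over frames, the axial velocity fed to the Kalman filter. The Python sketch below encodes that assumption; all names and calibration values are illustrative.

```python
import numpy as np

def depth_from_blob_area(area_px, ref_area_px, ref_depth_m):
    """Estimate marker distance along the optical axis from its blob area.

    Under a pinhole camera model the projected area scales as 1/z^2,
    so z ~= z_ref * sqrt(A_ref / A). ref_area_px and ref_depth_m come
    from a one-time calibration measurement.
    """
    return ref_depth_m * np.sqrt(ref_area_px / np.asarray(area_px, float))

def axial_velocity(areas_px, timestamps, ref_area_px, ref_depth_m):
    """IR-marker velocity along the z (optical) axis from successive frames."""
    depths = depth_from_blob_area(areas_px, ref_area_px, ref_depth_m)
    return np.diff(depths) / np.diff(np.asarray(timestamps, float))

# Illustrative usage with made-up blob areas (px) at 50 Hz:
# axial_velocity([5200, 5010, 4830], [0.00, 0.02, 0.04], ref_area_px=5200, ref_depth_m=1.0)
```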
The number of Scan-to-BIM projects that convert scanned data into Building Information Modeling (BIM) for facility management applications in the mechanical, electrical, and plumbing (MEP) fields has been increasing. This conversion is an application-oriented process, so the Scan-to-BIM parameters to be applied vary from project to project. Inevitably, a modeler manually adjusts the BIM modeling parameters according to the application purpose and repeats the Scan-to-BIM process until the desired result is achieved. This repetitive manual process has adverse consequences for project productivity and quality. If the Scan-to-BIM process can be formalized using predefined rules, the repetitive process can be automated across cases by re-adjusting only the parameters. In addition, a predefined rule-based Scan-to-BIM pipeline can be stored and reused as a library. This study proposes a rule-based Scan-to-BIM mapping pipeline to support application-oriented Scan-to-BIM process automation, variability, and reusability. The application target of the proposed pipeline is the plumbing system, which accounts for a large proportion of MEP elements. The proposed method was implemented using an automatic generation algorithm, and its effectiveness was verified.
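To make the idea of a stored, reusable rule set concrete, here is a minimal Python sketch of a parameterized Scan-to-BIM pipeline in which each rule pairs an applicability predicate with modeling parameters, so a pass can be repeated by adjusting only the parameters. The rule names, predicates, and thresholds are hypothetical and are not taken from the study.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class ScanToBIMRule:
    """One predefined mapping rule: which segmented point clusters it
    applies to and which modeling parameters to use when generating
    the corresponding BIM object."""
    name: str
    applies_to: Callable[[dict], bool]   # predicate over a segmented point cluster
    parameters: Dict[str, float]         # e.g. fitting tolerance, minimum pipe length

@dataclass
class ScanToBIMPipeline:
    rules: List[ScanToBIMRule] = field(default_factory=list)

    def run(self, clusters: List[dict]) -> List[dict]:
        """Apply the first matching rule to each cluster; re-running with
        adjusted rule parameters repeats the Scan-to-BIM pass automatically."""
        bim_objects = []
        for cluster in clusters:
            for rule in self.rules:
                if rule.applies_to(cluster):
                    bim_objects.append({"source": cluster["id"],
                                        "rule": rule.name,
                                        **rule.parameters})
                    break
        return bim_objects

# Illustrative rule set for a plumbing model (shapes and thresholds are assumptions).
pipeline = ScanToBIMPipeline(rules=[
    ScanToBIMRule("straight_pipe",
                  applies_to=lambda c: c.get("shape") == "cylinder",
                  parameters={"fit_tolerance_m": 0.01, "min_length_m": 0.2}),
    ScanToBIMRule("elbow",
                  applies_to=lambda c: c.get("shape") == "torus_segment",
                  parameters={"fit_tolerance_m": 0.015}),
])
```

Storing such rule objects in a shared library is one way to realize the reusability the study targets: a new project swaps in different predicates or parameter values without changing the pipeline code.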