Industrial training processes are increasingly assisted by computer solutions to reduce costs. Typically, computer systems created to simulate assembly or machine manipulation are implemented with traditional human-computer interfaces (keyboard, mouse, etc.). However, this usually leads to systems that are far removed from the real procedures, and thus inefficient in terms of training. Two techniques could improve this situation: mixed reality and haptic feedback. In this paper we investigate the integration of both within a single framework. We present the hardware used to design our training system. A feasibility study allowed us to establish a testing protocol. The results of these tests convince us that such a system should not attempt to simulate the interaction between real and virtual objects as realistically as if only real objects were involved.
This paper reports the results of a PDM and CAD plug-in implementation for semiautomatic, real-time search of similar components in the mechanical field. The approach exploits a string-based component description, following the well-known Group Technology (GT) methodology, to interactively check feature similarity over a PDM database. The GT code contains component geometric data and manufacturing information. The software developed is suitable for encoding both 2D and 3D parts. For 2D drafts, a guided GUI returning the GT code has been implemented. For 3D parts, instead, the encoding procedure is completely integrated into the CAD modelling interface, and the code is calculated incrementally, feature by feature. Part similarity assessment is therefore interactive: the designer may visualize similar parts stored in the PDM and decide whether to change a single feature or to reuse a retrieved (similar) part. Several case studies described in the paper demonstrate the GUI usage, the search algorithm, and its results. With the PDM correctly configured, results are very good, since the GT coding, part retrieval, and quoting are fully interactive.
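The retrieval step described above can be sketched as a position-wise comparison of fixed-length GT code strings. The snippet below is a minimal illustration only, not the plug-in's actual algorithm: the 9-digit Opitz-style codes, the part names, the in-memory dictionary standing in for the PDM database, and the function names are all hypothetical.

```python
# Hypothetical sketch of string-based GT-code similarity retrieval.
# A real plug-in would query a configured PDM database instead of a dict.

def gt_similarity(code_a: str, code_b: str) -> float:
    """Fraction of positions at which two fixed-length GT codes agree."""
    if len(code_a) != len(code_b):
        raise ValueError("GT codes must have equal length")
    matches = sum(a == b for a, b in zip(code_a, code_b))
    return matches / len(code_a)

def retrieve_similar(query: str, pdm: dict, top_n: int = 3):
    """Rank stored parts by GT-code similarity to the query code."""
    ranked = sorted(pdm.items(),
                    key=lambda kv: gt_similarity(query, kv[1]),
                    reverse=True)
    return ranked[:top_n]

# Toy stand-in for the PDM database (part name -> GT code).
pdm_db = {
    "shaft_A": "102304500",
    "shaft_B": "102304510",
    "flange_C": "341200070",
}

for name, code in retrieve_similar("102304511", pdm_db):
    print(name, code, round(gt_similarity("102304511", code), 2))
```

Because each digit of a GT code encodes one geometric or manufacturing attribute, a simple per-position match already yields a meaningful ranking; an incremental, feature-by-feature encoding as described above would update the query code and re-rank after each modelling step.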
We present a system that exploits advanced Virtual Reality technologies to create a surveillance and security system. Surveillance cameras are carried by a mini-blimp, which is tele-operated using an innovative Virtual Reality interface with haptic feedback. An interactive control room (CAVE) receives multiple video streams from airborne and fixed cameras. Eye-tracking technology turns the user's gaze into the main interaction mechanism: the user in charge can examine, zoom into, and select specific views by looking at them. Video streams selected in the control room can be redirected to agents equipped with a PDA. On-field agents can examine the video sent by the control center and locate the actual position of the airborne cameras on a GPS-driven map. The PDA interface reacts to the user's gestures: a tilt sensor recognizes the position in which the PDA is held and adapts the interface accordingly. The prototype we present shows the added value of integrating VR technologies into a complex application and opens up several research directions in the areas of tele-operation, multimodal interfaces, etc.