Abstract - The idea of being present in a remote location has inspired researchers to develop robotic devices that allow humans to experience the feeling of telepresence. These devices require multiple forms of sensory feedback to provide a more realistic telepresence experience. In this work, we develop a wearable interface for immersion and telepresence that provides the human with the capability both to receive multisensory feedback from vision, touch and audio and to remotely control a robot platform. Multimodal feedback from the remote environment is based on the integration of sensor technologies coupled to the sensory system of the robot platform. Remote control of the robot is achieved through a modularised architecture, which allows the user to visually explore the remote environment. We validated our work with multiple experiments in which participants, located at different venues, were able to successfully control the robot platform while visually exploring, touching and listening to a remote environment. In our experiments we used two different robotic platforms: the iCub humanoid robot and the Pioneer LX mobile robot. These experiments show that our wearable interface is comfortable, easy to use and adaptable to different robotic platforms. Furthermore, we observed that our approach allows humans to experience a vivid feeling of being present in a remote environment.
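The abstract describes a two-way coupling: the operator's movements drive the robot while vision, audio and touch are streamed back to the wearable interface. As an illustration only, the sketch below shows one way such a modular control-and-feedback loop could be organised; all class and method names are hypothetical stand-ins for the paper's actual drivers, not the authors' implementation.

```python
# Minimal sketch of a modularised teleoperation loop. All module and
# method names are hypothetical; the paper's architecture may differ.
import time
from dataclasses import dataclass


@dataclass
class HeadPose:
    yaw: float    # degrees, left/right
    pitch: float  # degrees, up/down


class WearableInterface:
    """Stub for the operator-side headset: reads head pose, renders feedback."""

    def read_head_pose(self) -> HeadPose:
        # A real system would poll the headset's IMU or external tracker here.
        return HeadPose(yaw=0.0, pitch=0.0)

    def render(self, frame, audio, touch):
        # Display stereo video, play audio and drive tactile actuators.
        pass


class RobotPlatform:
    """Stub for the remote robot (e.g. an iCub head or a Pioneer LX base)."""

    def set_gaze(self, pose: HeadPose):
        # Command the robot's head or pan-tilt unit to mirror the operator.
        pass

    def sense(self):
        # Return the latest camera frame, microphone audio and touch readings.
        return None, None, None


def teleoperation_loop(interface: WearableInterface,
                       robot: RobotPlatform,
                       rate_hz: float = 30.0):
    """Couple operator head motion to robot gaze and stream multisensory
    feedback back to the wearable interface at a fixed rate."""
    period = 1.0 / rate_hz
    while True:
        robot.set_gaze(interface.read_head_pose())   # control channel
        frame, audio, touch = robot.sense()          # feedback channel
        interface.render(frame, audio, touch)
        time.sleep(period)
```

Keeping the interface and robot behind separate module boundaries, as above, is one plausible reading of the "modularised architecture" claim: the same operator-side code can then be reused across platforms such as the iCub and the Pioneer LX by swapping the robot-side stub.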