2017
DOI: 10.1109/tcsvt.2016.2580425

A Mixed Reality Telepresence System for Collaborative Space Operation

Abstract: This paper presents a Mixed Reality system that results from the integration of a telepresence system and an application to improve collaborative space exploration. The system combines free viewpoint video with immersive projection technology to support non-verbal communication, including eye gaze, inter-personal distance and facial expression. Importantly, these can be interpreted together as people move around the simulation, maintaining natural social distance. The application is a simulation of Mars, withi…

Cited by 62 publications (34 citation statements). References 41 publications.
“…The idea here was to reproduce all the communication cues (audio, visual, body expressions, facial expressions and gestures) that we enjoy in face-to-face meetings. The interested reader can find a detailed description of the telepresence aspects of this project (physical setup, algorithms and evaluation) in [11], as the focus of the current paper is on the software architecture supporting the whole system. Figure 5 shows a 3D reconstructed user waving at two collaborators, one local and the other remote (represented by a traditional avatar).…”
Section: Communication View (mentioning)
confidence: 99%
“…2013). It was further developed into a mixed reality system with multiple sites collaborating on shared data (Fairchild et al, 2016).…”
Section: Literature Review (mentioning)
confidence: 99%
“…With chairs evenly distributed around the Telethrone, it should in principle also be possible to support mutual awareness between a telepresent seated person and a person walking past. withyou (Roberts et al, 2015; Fairchild 2016) was a telepresence system that recreated a 3D CGI copy of a remote person, viewable from any perspective but derived from multiple discrete 2D video streams. Before the integration of withyou, the Mona Lisa effect meant that a pulled chair would need to align exactly with a remote camera if mutual gaze were to be supported.…”
Section: Introduction (mentioning)
confidence: 99%
“…A particular challenge, therefore, is to find a suitable coupling between the acquisition and viewing stages that respects the practical limitations imposed by available network bandwidth and client-side compute hardware while still guaranteeing an immersive exploration experience. For this purpose, teleconferencing systems for transmitting dynamic 3D models of their users typically rely on massive, well-calibrated acquisition setups with several statically mounted cameras around the region of interest [9,41,50]. Instead, we direct our attention to the remote exploration of places using portable, consumer-grade acquisition devices, for instance in scenarios of remote inspection or consulting.…”
Section: Introduction (mentioning)
confidence: 99%