Cross‐virtuality analytics (XVA) is a novel field of research within immersive analytics and visual analytics. A broad range of heterogeneous devices across the reality–virtuality continuum, along with corresponding visual metaphors and analysis techniques, is currently becoming available. The goal of XVA is to enable visual analytics that use transitional and collaborative interfaces to seamlessly integrate different devices and support multiple users. In this work, we take a closer look at XVA and analyse the existing body of work for an overview of its current state. We classify the related literature regarding ways of establishing cross‐virtuality by interconnecting different stages in the reality–virtuality continuum, as well as techniques for transitioning and collaborating between the different stages. We provide insights into visualization and interaction techniques employed in current XVA systems. We report on ways of evaluating such systems, and analyse the domains where such systems are becoming available. Finally, we discuss open challenges in XVA, giving directions for future research.
When analysing data collected from non-destructive testing, several areas of expertise are required, such as imaging modalities, data analysis and visualisation, or the domain background of the inspected material. This typically requires a team of experts, who are not necessarily situated at the same location. We therefore propose a system supporting collaborative remote analysis that facilitates easy access to a common workspace. The system supports collaborative analysis of datasets by connecting interactions in a lightweight web-based viewer with those in an advanced analysis software package on a desktop PC. Our system supports both co-located and distributed collaboration on large, complex datasets and allows for simple, spontaneous cross-device access. This enables a wide range of devices and immersive systems to be supported regardless of their capabilities. Viewpoints and annotations can be shared as visual cues for the analysis of collaborators. The proposed system is presented using a case study of analysing an additively manufactured aeronautics component. The preliminary findings showed positive feedback from the experts regarding the creation of a flexible and fast collaborative analysis set-up.
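To illustrate how viewpoints and annotations might be shared as visual cues across heterogeneous devices, the following is a minimal sketch of a JSON "cue" message exchanged between a web-based viewer and a desktop analysis application. The message fields and names (SharedCue, camera_position, etc.) are illustrative assumptions, not the actual protocol of the described system.

```python
# Hypothetical shared "visual cue" payload; field names are assumptions for illustration.
import json
from dataclasses import dataclass, asdict
from typing import Optional, Tuple


@dataclass
class SharedCue:
    user_id: str                        # collaborator who created the cue
    dataset_id: str                     # dataset the cue refers to
    camera_position: Tuple[float, float, float]  # shared viewpoint: eye position
    camera_target: Tuple[float, float, float]    # shared viewpoint: look-at point
    annotation: Optional[str] = None    # free-text note attached to the viewpoint


def encode_cue(cue: SharedCue) -> str:
    """Serialise a cue so any connected device can render it,
    regardless of its own capabilities."""
    return json.dumps(asdict(cue))


# Example: an expert on a desktop PC shares a viewpoint with a remote collaborator.
cue = SharedCue(
    user_id="expert-01",
    dataset_id="aero-component-ct",
    camera_position=(120.0, 45.0, 300.0),
    camera_target=(0.0, 0.0, 0.0),
    annotation="Possible porosity cluster near the lattice strut.",
)
print(encode_cue(cue))
```

In a set-up like the one described, such a payload could be pushed over any shared channel (e.g. a WebSocket session) so that a lightweight browser client and a full desktop package stay synchronised on the same cues.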
Modern video-based head-mounted displays allow users to operate along Milgram's entire reality-virtuality continuum. This opens up the field for novel cross-reality applications that distribute data analytics tasks along this continuum to combine the benefits of established 2D information visualisation in the real environment with immersive analytics. In this publication, we explore this potential by transforming 2D graph data from a planar, large-scale display in the real environment into a spherical layout in augmented reality 3D space, letting it appear as if the graph were moving out of the display. We focus on design aspects of this transformation that potentially help users to form a joint mental model of both visualisations and to continue their tasks seamlessly in augmented reality. For this purpose, we implemented a framework of transformation parameters that can be categorised as follows: transformation methods, node transformation order (groupings), and different ways of visual interconnection. Variants in each of these areas were investigated in three quantitative user studies in which users had to solve a simple cluster search task. We confirmed that a visual transformation from 2D to 3D helps users to continue their tasks in augmented reality with fewer interruptions, and that the node transformation order should be adjusted to the data and task context. We further identified that users can perform tasks more efficiently when a user-controlled transformation is used, whereas a constant transformation with a fixed duration can contribute to lower error rates.
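As a rough illustration of the kind of planar-to-spherical transformation described above, the sketch below maps normalised 2D node positions onto a sphere and blends between the two layouts with a parameter t (driven either by a fixed-duration animation or by user input). The mapping and parameter names are assumptions for illustration and do not reproduce the paper's actual transformation framework.

```python
# Illustrative sketch: interpolate graph nodes from a planar 2D layout (display)
# into a spherical 3D layout (AR space). Coordinates are assumed normalised to [0, 1]^2.
import math


def planar_to_sphere(u: float, v: float, radius: float = 1.0):
    """Map a normalised 2D position (u, v) to a point on a sphere,
    treating u as longitude and v as latitude."""
    lon = (u - 0.5) * 2.0 * math.pi   # -pi .. pi
    lat = (v - 0.5) * math.pi         # -pi/2 .. pi/2
    x = radius * math.cos(lat) * math.sin(lon)
    y = radius * math.sin(lat)
    z = radius * math.cos(lat) * math.cos(lon)
    return (x, y, z)


def interpolate_node(u: float, v: float, t: float, radius: float = 1.0):
    """Blend a node between its position on the display plane (t = 0)
    and its spherical AR position (t = 1)."""
    px, py, pz = (u - 0.5), (v - 0.5), 0.0        # node on the display plane
    sx, sy, sz = planar_to_sphere(u, v, radius)   # node on the sphere
    return (px + t * (sx - px), py + t * (sy - py), pz + t * (sz - pz))


# Example: a node in the upper-right quadrant of the display, halfway through the transformation.
print(interpolate_node(0.75, 0.75, t=0.5))
```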