2020
DOI: 10.1007/s10922-020-09545-w
Immersive Interconnected Virtual and Augmented Reality: A 5G and IoT Perspective

Abstract: In this article, we articulate the technical challenges to enable a future AR/VR end-to-end architecture that combines 5G URLLC and Tactile IoT technology to support this next generation of interconnected AR/VR applications. Through the use of IoT sensors and actuators, AR/VR applications will be aware of the environmental and user context, supporting human-centric adaptations of the application logic, and lifelike interactions with the virtual environment. We present potential use cases and the required tech…

Cited by 50 publications (20 citation statements)
References: 74 publications
“…With the advancement of 5G technology, AR/VR requires significant resources for real-time processing, i.e., rendering, vision, and physics engines, at end user devices to maintain its ubiquitous nature across heterogeneous networks, including 4G, 5G, and WiFi. Usually, in mission-critical networks, the delivery of small information packets (32 to 200 bytes) within a latency of 1 ms [112] results in a huge computation delay with the traversal of each packet undergoing the procedure of processing, queuing, and transmitting through multiple routers, where each router contributes 10 ms of computation delay [14]. However, the adoption of high-end hardware and processing approaches such as the Multi-Path Transmission Control Protocol (MPTCP) [113] to reduce the computational complexity and transmission instability can only reduce the latency by 5–6 ms for time-sensitive AR/VR applications.…”
Section: Discussion and Future Scope (mentioning)
confidence: 99%
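
As a rough illustration of the latency figures quoted above, the following Python sketch sums the per-router delay against the 1 ms URLLC target; the hop counts and the assumption that MPTCP saves at most 6 ms are illustrative choices, not values taken from the cited measurements.

# Illustrative latency budget using the figures quoted above; hop counts
# and the MPTCP saving are assumptions, not measured values.
URLLC_TARGET_MS = 1.0      # mission-critical delivery target [112]
PER_HOP_DELAY_MS = 10.0    # processing + queuing + transmission per router [14]
MPTCP_SAVING_MS = 6.0      # upper end of the quoted 5-6 ms reduction [113]

def end_to_end_delay_ms(hops: int, use_mptcp: bool = False) -> float:
    """Serial per-hop delay, optionally reduced by the MPTCP saving."""
    delay = hops * PER_HOP_DELAY_MS
    if use_mptcp:
        delay = max(delay - MPTCP_SAVING_MS, 0.0)
    return delay

for hops in (1, 3, 5):
    d = end_to_end_delay_ms(hops, use_mptcp=True)
    verdict = "meets" if d <= URLLC_TARGET_MS else "misses"
    print(f"{hops} hop(s) with MPTCP: {d:.1f} ms ({verdict} the 1 ms target)")

Even a single hop at the quoted per-router delay overshoots the 1 ms target by an order of magnitude, which is the point the citing authors make.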
“…It is challenging to maintain a low MTP latency as VR terminals have to undergo a serial process of motion capture, logic computing, picture rendering, and screen display. Hence, despite the extensive advances in AR/VR technologies, MTP latency [14] acts as the greatest barrier by holding back AR/VR adoption from providing a fully immersive experience and offers limitations in the field of view due to occlusion, as well as poor display quality. A very minimal MTP latency of less than 20 ms for head-mounted display (HMD) users is required to avoid having an unrealistic experience of the virtual world [15].…”
Section: Introduction (mentioning)
confidence: 99%
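
The serial MTP pipeline described in this passage can be pictured as a simple sum of stage delays; the per-stage timings in the Python sketch below are assumed values chosen only to show how quickly the 20 ms budget is consumed.

MTP_BUDGET_MS = 20.0  # threshold below which HMD users avoid an unrealistic experience [15]

# Assumed per-stage timings for the serial pipeline named in the passage.
stage_budget_ms = {
    "motion capture": 2.0,
    "logic computing": 4.0,
    "picture rendering": 8.0,
    "screen display": 5.0,
}

mtp_ms = sum(stage_budget_ms.values())  # stages run one after another, so delays add up
verdict = "within" if mtp_ms <= MTP_BUDGET_MS else "exceeds"
print(f"estimated MTP latency: {mtp_ms:.1f} ms ({verdict} the 20 ms budget)")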
“…To prevent buffer starvation and ensure a smooth playback in these highly interactive applications, traditional HAS methods need to be augmented. HAS clients for VR applications rely on techniques for predicting the users’ target field of view or viewport, and rate adaptation heuristics to fine-tune the quality level of the requested content in response to the users’ movements and network conditions [7, 16, 32].…”
Section: Explora-VR: Approach Overview (mentioning)
confidence: 99%
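
To make the rate-adaptation idea concrete, here is a minimal Python sketch of one possible heuristic: tiles in the predicted viewport are upgraded to the highest bitrate that fits the estimated throughput, while out-of-view tiles stay at the lowest level. The tile grid, bitrate ladder, and throughput value are assumptions for illustration and do not reproduce the cited clients' algorithms.

BITRATE_LADDER_MBPS = [1.0, 2.5, 5.0, 10.0]  # assumed per-tile quality levels

def select_tile_qualities(predicted_viewport, all_tiles, throughput_mbps):
    """Baseline quality everywhere; upgrade predicted-viewport tiles while the budget allows."""
    qualities = {t: BITRATE_LADDER_MBPS[0] for t in all_tiles}
    budget = throughput_mbps - sum(qualities.values())
    for tile in sorted(predicted_viewport):
        for rate in reversed(BITRATE_LADDER_MBPS):
            extra = rate - qualities[tile]
            if extra <= budget:
                qualities[tile] = rate
                budget -= extra
                break
    return qualities

tiles = list(range(12))          # e.g. a 4x3 tile grid
viewport = {5, 6, 9, 10}         # tiles the predictor expects the user to watch
print(select_tile_qualities(viewport, tiles, throughput_mbps=30.0))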
“…Research on this topic signals that the motion-to-photon (MTP) latency for VR displays should be less than 20 ms to prevent the perception of scene instability and cybersickness [5, 6]. For on-demand tile-based 360 video streaming in particular, many of the existing studies have focused on mitigating the effect of latency by increasing viewport prediction accuracy and applying HTTP adaptive streaming (HAS) methods to adapt the quality of the requested content to the network conditions [7, 8]. While these approaches achieve a rational use of the bandwidth as perceived at the client’s side, the network latency due to distant content servers can still substantially degrade the viewer experience.…”
Section: Introduction (mentioning)
confidence: 99%
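
Viewport prediction itself is often approximated by extrapolating recent head-orientation samples; the short Python sketch below shows such a linear baseline. The sample values and the prediction horizon are assumed for illustration only.

def predict_yaw_deg(samples, horizon_s):
    """Extrapolate yaw (degrees) from the last two (timestamp_s, yaw_deg) samples."""
    (t0, y0), (t1, y1) = samples[-2], samples[-1]
    velocity = (y1 - y0) / (t1 - t0)            # angular velocity in deg/s
    return (y1 + velocity * horizon_s) % 360.0  # wrap around the full circle

history = [(0.00, 80.0), (0.10, 86.0)]          # user panning right at ~60 deg/s (assumed)
print(predict_yaw_deg(history, horizon_s=1.0))  # -> 146.0, the yaw expected in 1 s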
“…Immersive media is expected to enable a plethora of opportunities to support interactive application domains, such as immersive training, immersive surgery, or multi-user interactive gaming [1]. For the sake of interactivity and quality, these application domains impose stringent requirements on the network in terms of high bandwidth (between 100 Gbps and 1 Tbps) and ultra-low latency (down to a Motion-To-Photon latency of 20 milliseconds [2]).…”
Section: Introduction (mentioning)
confidence: 99%
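
A quick back-of-the-envelope calculation suggests why raw immersive video lands in the quoted 100 Gbps to 1 Tbps range; the resolution, frame rate, and bit depth below are assumptions for illustration only.

width, height = 11520, 5760   # assumed ~12K equirectangular frame per eye
bits_per_pixel = 24           # 8-bit RGB, no compression
fps = 120
eyes = 2                      # stereoscopic

raw_bps = width * height * bits_per_pixel * fps * eyes
print(f"raw bitrate: {raw_bps / 1e9:.0f} Gbps")  # ~382 Gbps, inside the 100 Gbps-1 Tbps range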