5G is a key tool for the cloud-based wireless production of audio-visual content. By providing higher throughput and lower latency than previous mobile network technologies, as well as flexible allocation of network resources, it enables content production pipelines that currently rely on wired connectivity to be deployed on top of a 5G network. This reduces production costs, lowers environmental impact, and increases operational efficiency. In particular, live immersive production services, such as free-viewpoint video (FVV), can be provided far more efficiently. In this paper, we address the challenge of producing and streaming FVV services in real time over 5G networks. We have adapted a state-of-the-art FVV system to integrate it within a 5G architecture, using three key technological enablers: a mmWave radio access network to support the uplink traffic requirements of the FVV cameras, multi-access edge computing to run the video processing algorithms with minimum latency, and end-to-end slicing to guarantee sufficient quality of service (QoS) within the production pipeline. We have built a field trial over the production network of a telecommunications operator, including a mmWave pilot deployment, edge cloud processing, and remote content production, involving three different locations across Spain. We have measured the key performance indicators at the relevant parts of our trial deployment, showing that live FVV production is possible with existing 5G technology, although with some limitations. We have also analyzed these limitations, obtaining insights into how the next generation of 5G networks can overcome them to achieve higher quality of experience (QoE).
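The slicing enabler mentioned above amounts to verifying that the network guarantees enough uplink capacity for all camera streams and a bounded end-to-end latency. The following minimal sketch illustrates that check; it is not the paper's actual tooling, and all numeric values (per-camera bitrate, camera count, latency budget) are illustrative assumptions rather than the trial's measured figures.

```python
# Minimal sketch (illustrative only): does a measured 5G slice meet the
# requirements of an FVV production pipeline? All numbers are assumptions.
from dataclasses import dataclass


@dataclass
class SliceRequirements:
    uplink_mbps_per_camera: float   # encoded camera stream bitrate (assumed)
    num_cameras: int                # size of the capture rig (assumed)
    max_e2e_latency_ms: float       # capture-to-display budget (assumed)


@dataclass
class MeasuredKpis:
    uplink_mbps: float              # aggregate uplink throughput over mmWave
    e2e_latency_ms: float           # measured end-to-end latency


def slice_meets_requirements(req: SliceRequirements, kpi: MeasuredKpis) -> bool:
    """Return True if the measured KPIs satisfy the production slice."""
    needed_uplink = req.uplink_mbps_per_camera * req.num_cameras
    return (kpi.uplink_mbps >= needed_uplink
            and kpi.e2e_latency_ms <= req.max_e2e_latency_ms)


if __name__ == "__main__":
    req = SliceRequirements(uplink_mbps_per_camera=25.0, num_cameras=9,
                            max_e2e_latency_ms=200.0)
    kpi = MeasuredKpis(uplink_mbps=400.0, e2e_latency_ms=150.0)
    print(slice_meets_requirements(req, kpi))  # True under these assumed numbers
```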
FVV Live is a novel real-time, low-latency, end-to-end free-viewpoint video system comprising capture, transmission, view synthesis on an edge server, and visualization and control on a mobile terminal. The system has been specially designed for low-cost, real-time operation using only off-the-shelf components.
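The stages named above form a closed loop between the capture side, the edge, and the viewer's terminal. The sketch below lays out that loop as stubs; the function names and data types are hypothetical placeholders for illustration and do not reflect FVV Live's actual implementation.

```python
# Hypothetical sketch of the FVV pipeline stages: capture, transmission,
# view synthesis on an edge server, and visualization on a mobile terminal.
from dataclasses import dataclass
from typing import List


@dataclass
class CameraFrame:
    camera_id: int
    timestamp_ms: int
    data: bytes          # encoded texture (plus depth, in a real system)


@dataclass
class Viewpoint:
    x: float
    y: float
    z: float             # virtual camera position requested by the viewer


def capture(num_cameras: int, timestamp_ms: int) -> List[CameraFrame]:
    """Capture one frame per camera (stub)."""
    return [CameraFrame(i, timestamp_ms, b"") for i in range(num_cameras)]


def transmit_uplink(frames: List[CameraFrame]) -> List[CameraFrame]:
    """Send frames over the 5G uplink to the edge server (stub: pass-through)."""
    return frames


def synthesize_on_edge(frames: List[CameraFrame], view: Viewpoint) -> bytes:
    """Render the requested virtual viewpoint from the received frames (stub)."""
    return b"synthesized-frame"


def display_on_terminal(frame: bytes) -> None:
    """Decode and show the synthesized view on the mobile terminal (stub)."""
    pass


# One loop iteration: the terminal requests a viewpoint, the edge returns a view.
frames = transmit_uplink(capture(num_cameras=9, timestamp_ms=0))
display_on_terminal(synthesize_on_edge(frames, Viewpoint(0.0, 1.6, 2.0)))
```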