2017 Ninth International Conference on Quality of Multimedia Experience (QoMEX)
DOI: 10.1109/qomex.2017.7965684
Quantitative evaluation of omnidirectional video quality

Cited by 7 publications (14 citation statements) | References 7 publications
“…Planar metrics. None of the metrics proposed in the literature [5]-[11] addresses the detection of localized distortions in 360-degree content due to lossy compression in the CM domain. Therefore, the performance of the proposed method is compared to that of a set of classical FR quality metrics for 2D images, i.e., planar metrics.…”
Section: Methods (mentioning)
confidence: 99%
“…Results. All metrics were computed on a vertically subsampled version of the cube faces, corresponding to a face resolution of 1138x569 pixels. Table 1 reports the detection accuracy achieved by each metric, using the different classification techniques, with L = 4 and L = 8.…”
Section: Methods (mentioning)
confidence: 99%
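
As a rough sketch of the evaluation step quoted above (not code from the cited paper), the following Python snippet computes two classical planar FR metrics on vertically subsampled cube faces using scikit-image. The 1138x569 target resolution comes from the quote; the float images in [0, 1], the colour layout, and the helper name planar_metrics_on_face are illustrative assumptions.

import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity
from skimage.transform import resize

def planar_metrics_on_face(ref_face, dist_face, target_hw=(569, 1138)):
    # Vertically subsample both cube faces to the assumed 1138x569
    # resolution (rows, cols), then evaluate planar FR metrics on them.
    out_shape = target_hw + ref_face.shape[2:]
    ref = resize(ref_face, out_shape, anti_aliasing=True)
    dist = resize(dist_face, out_shape, anti_aliasing=True)
    return {
        "PSNR": peak_signal_noise_ratio(ref, dist, data_range=1.0),
        "SSIM": structural_similarity(ref, dist, channel_axis=-1,
                                      data_range=1.0),
    }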
“…The proposed approach is based on extracting spatio-temporal quality features (i.e., computing objective IQA metrics) from viewports, temporally pooling them taking the characteristics of the human visual system (HVS) into consideration, and then training a random forest regression model to predict the 360-degree video quality. On the one hand, working with viewports allows us to better account for the final viewed content and naturally supports different projections [6,7]. On the other hand, the use of multiple objective metrics computed on these viewports allows our method to perform well across the complex and diverse visual distortions appearing in 360-degree videos.…”
Section: Introduction (mentioning)
confidence: 94%
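
The viewport-based pipeline described in the quote above can be approximated, under stated assumptions, with scikit-learn. The temporal pooling below (mean plus a low percentile to emphasize the worst-quality moments), the array shapes, and the function name pool_features are illustrative guesses rather than the cited paper's exact design, and the random arrays stand in for real per-viewport metric values and subjective scores.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

def pool_features(per_frame_metrics):
    # per_frame_metrics: array of shape (frames, metrics) for one video's
    # viewports; pool over time with the mean and the 10th percentile
    # (a simple HVS-motivated emphasis on the worst moments).
    return np.concatenate([per_frame_metrics.mean(axis=0),
                           np.percentile(per_frame_metrics, 10, axis=0)])

# Placeholder data: 40 videos, 300 frames, 4 objective metrics per viewport.
X = np.stack([pool_features(np.random.rand(300, 4)) for _ in range(40)])
y = 1.0 + 4.0 * np.random.rand(40)   # stand-in subjective scores (MOS-like)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)
predicted = model.predict(X[:5])     # predicted quality for five videos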
“…It is shown that most test subjects focus on the front region of the video while watching the ODVs, which is an expected result. Birkbeck et al. in [17] analyse the perceived ODV quality in relation to the type of video projection (Equirectangular, Cubemap, and Equi-Angular Cubemap), showing that ODV quality and user experience call for dedicated further research, since the quality of 2D videos can be evaluated without considering these aspects. The relationship between ODV projection and the user experience is also analysed in [18,19].…”
Section: Related Work (mentioning)
confidence: 99%