2017 Ninth International Conference on Quality of Multimedia Experience (QoMEX)
DOI: 10.1109/qomex.2017.7965659

Which saliency weighting for omni directional image quality assessment?

Cited by 72 publications (41 citation statements)
References 8 publications

“…Most recently, some omnidirectional image/video datasets have been established to collect the HM data [3,44,47] and EM data [26] of subjects. Given these datasets, it is possible to analyze and model human behavior [27,47] on viewing omnidirectional image/video. Additionally, some works [1,5,32] have been recently proposed to predict human's HM and EM on omnidirectional video, similar to saliency prediction on 2D video.…”
Section: Related Work (mentioning)
confidence: 99%
“…Hence, the human behavior of HM and EM is rather important in determining visual quality of omnidirectional video. In fact, there are many latest works [1,3,5,26,27,32,44] concerning the human behavior of HM and EM in watching omnidirectional image/video. Along with these approaches, some omnidirectional image/video datasets were established to collect HM or EM data.…”
Section: Introduction (mentioning)
confidence: 99%
“…47 Moreover, when shown in equirectangular projection, the 360° contents are non-Euclidean. 48 To address these problems, many studies on 360° images/videos in viewing databases, 49,50 image quality assessment, 51 and saliency detection, 52 have been conducted in recent years.…”
Section: Related Saliency Models (mentioning)
confidence: 99%
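
The statement above points at the core question of the cited QoMEX paper: how to weight local distortions by visual attention when the content is spherical but stored as an equirectangular image. The Python snippet below is a purely illustrative sketch, not the metric evaluated in the paper; it combines a hypothetical normalized saliency map with a cos(latitude) term that compensates for the oversampling of polar regions in the equirectangular projection. The function name and inputs are assumptions.

import numpy as np

def saliency_weighted_psnr(ref, dist, saliency):
    # ref, dist: grayscale equirectangular images, shape (h, w), values in [0, 255]
    # saliency: non-negative saliency map of the same shape (hypothetical input)
    ref = ref.astype(np.float64)
    dist = dist.astype(np.float64)
    h, w = ref.shape

    # Latitude of each pixel row, from +pi/2 (north pole) to -pi/2 (south pole).
    lat = np.pi / 2.0 - (np.arange(h) + 0.5) * np.pi / h
    area = np.cos(lat)[:, np.newaxis]           # solid-angle correction per row

    # Joint weight: viewer attention (saliency) times spherical area term.
    weights = saliency * area
    weights = weights / weights.sum()

    wmse = np.sum(weights * (ref - dist) ** 2)  # weighted mean squared error
    return 10.0 * np.log10(255.0 ** 2 / max(wmse, 1e-12))

With a uniform saliency map this reduces to a plain latitude-weighted PSNR, which is the kind of baseline such saliency weightings are usually compared against.
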
“…As a future extension of our work, more accurate fixation maps can be computed based on the prediction of eye gaze fixations from head fixation locations, applying the work of Rai et al. 37 for instance. In addition, the selection of the threshold indicating whether the angular velocity of the observer's head prevents the subjects' focus of attention should be optimized.…”
Section: Toggling Processing (mentioning)
confidence: 99%
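
The last sentence of the statement above concerns the choice of an angular-velocity threshold below which the observer's head is treated as fixating. The Python sketch below shows that thresholding step on a head-movement (HM) trace; the function name, the yaw/pitch input format, and the default threshold value are illustrative assumptions, not values taken from the cited work.

import numpy as np

def head_fixation_mask(yaw_deg, pitch_deg, t_sec, vel_thresh_deg_s=20.0):
    # yaw_deg, pitch_deg: head orientation angles per sample, in degrees
    # t_sec: sample timestamps in seconds (strictly increasing)
    # vel_thresh_deg_s: hypothetical angular-velocity threshold (deg/s), not from the cited work
    yaw = np.radians(np.asarray(yaw_deg, dtype=np.float64))
    pitch = np.radians(np.asarray(pitch_deg, dtype=np.float64))
    t = np.asarray(t_sec, dtype=np.float64)

    # Unit viewing direction for each sample.
    d = np.stack([np.cos(pitch) * np.cos(yaw),
                  np.cos(pitch) * np.sin(yaw),
                  np.sin(pitch)], axis=1)

    # Angle between consecutive viewing directions -> angular velocity in deg/s.
    dots = np.clip(np.sum(d[:-1] * d[1:], axis=1), -1.0, 1.0)
    ang_deg = np.degrees(np.arccos(dots))
    vel = ang_deg / np.maximum(np.diff(t), 1e-6)

    # A sample counts toward a head fixation when the velocity stays below threshold.
    return np.concatenate(([True], vel < vel_thresh_deg_s))

Samples flagged True could then be accumulated into the head fixation map discussed in the quoted passage.
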