2015 Seventh International Workshop on Quality of Multimedia Experience (QoMEX)
DOI: 10.1109/qomex.2015.7148128

Cognitive no-reference video quality assessment for mobile streaming services

Abstract: The evaluation of mobile streaming services, particularly in terms of delivered Quality of Experience (QoE), entails the use of automated methods (which excludes subjective QoE) that can be executed in real-time (i.e. without delaying the streaming process). This calls for lightweight algorithms that provide accurate results under considerable constraints. Starting from a low-complexity no-reference objective algorithm for still images, in this work we contribute a new version that not only works for …

Cited by 12 publications (9 citation statements) · References 23 publications
“…In addition to this set, we added a feature concerning the temporal characteristics of the video on the pixel level, the motion intensity, which measures the movement of video objects between frames by means of the compared level of intensity [40]. The reasoning behind selecting these features and not others comes from the need to pursue low computation and ability to be performed in real-time even on light-weight devices such as smartphones and tablets, as we demonstrated in previous work [41], [23].…”
Section: Real-time Cognitive Video Quality Assessment Methods (mentioning)
confidence: 99%
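The motion-intensity feature quoted above measures the movement of video objects between consecutive frames by comparing intensity levels. Below is a minimal sketch of that idea based on simple frame differencing; it is not the exact formulation used in [40], and the function names and the assumption of greyscale NumPy frames are illustrative only.

```python
import numpy as np

def motion_intensity(prev_frame: np.ndarray, curr_frame: np.ndarray) -> float:
    """Mean absolute luminance change between two consecutive greyscale frames.

    Larger values indicate stronger object/camera motion; values near zero
    indicate an almost static scene. This is a generic frame-difference
    proxy for the motion-intensity feature, not the paper's exact formula.
    """
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    return float(diff.mean())

def sequence_motion_intensity(frames: list[np.ndarray]) -> float:
    """Average motion intensity over a sequence of greyscale frames."""
    if len(frames) < 2:
        return 0.0
    pairs = zip(frames[:-1], frames[1:])
    return float(np.mean([motion_intensity(p, c) for p, c in pairs]))
```

Because it only requires one pass over pixel differences per frame pair, a feature of this kind stays cheap enough to compute on lightweight devices such as smartphones and tablets, which is the constraint the quoted passage emphasises.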
“…In [61] we showed the use of Reinforcement Learning to optimize video quality in adaptive streaming, without using complex heuristics. In [1] we showed how artificial neural networks could determine a linear combination of blur and noise that performed significantly better than these two NR metrics in isolation. Finally, our recent survey of machine learning in NR video quality assessment [62] provides a snapshot of the state-of-the-art on which our work is based.…”
Section: Related Work on Machine Learning for NR Quality Assessment (mentioning)
confidence: 99%
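The passage above refers to learning a combination of blur and noise scores that predicts quality better than either no-reference metric alone. The sketch below illustrates that general idea with a small feed-forward regressor; it is not the network from [1], and the synthetic training data and feature layout are assumptions made purely for the example.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: one row per video, columns = [blur, noise].
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))
# Synthetic quality label standing in for a MOS or SSIM target.
y = 1.0 - (0.6 * X[:, 0] + 0.4 * X[:, 1]) + rng.normal(0.0, 0.02, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small feed-forward network that fuses the two NR features into one score.
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out videos:", model.score(X_test, y_test))
```

The appeal of this setup in the quoted work is that the learned fusion keeps inference cheap (a handful of multiplications per video) while outperforming either blur or noise used in isolation.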
“…Low complexity No-Reference (NR) video quality methods have the potential to provide real-time video quality assessment and automated quality control, for instance in the context of video streaming on demand [1], peer to peer services [2,3] or real-time network management [4,5]. This is because simple […] Due to their particular methodology, computational requirements and functional limitations, neither FR methods nor subjective evaluations are viable to automate quality control processes, whereby both scalability and speed are required.…”
Section: Introduction (mentioning)
confidence: 99%
“…Pandremmenou et al [24] employed the Least Absolute Shrinkage and Selection Operator (LASSO) regression method for assessing the accuracy of bitstream parameters to FR metrics and subjective analysis in videos affected by compression and synthetic impairments. In our previous research we developed a lightweight algorithm combining bitstream parameters (video bitrate, complexity and motion) with pixel artifacts (blur and noise) [36]. We presented the machine learning-based algorithm showing high correlation with SSIM.…”
Section: Introduction (mentioning)
confidence: 99%
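The LASSO approach mentioned in the last quotation can be sketched in a few lines: fit an L1-regularised linear model that maps candidate parameters (e.g. bitrate, motion, blur, noise) to a quality target and inspect which coefficients survive the shrinkage. The example below is illustrative only; the data is synthetic and the feature names are hypothetical, not the setup of [24] or [36].

```python
import numpy as np
from sklearn.linear_model import Lasso

feature_names = ["bitrate", "motion", "complexity", "blur", "noise"]

# Synthetic stand-ins for per-video features and an SSIM-like quality target.
rng = np.random.default_rng(1)
X = rng.uniform(size=(300, len(feature_names)))
y = 0.5 * X[:, 0] - 0.3 * X[:, 3] - 0.2 * X[:, 4] + rng.normal(0.0, 0.01, 300)

# L1 regularisation drives uninformative coefficients to exactly zero,
# which is how LASSO doubles as a feature-selection step.
model = Lasso(alpha=0.01)
model.fit(X, y)
for name, coef in zip(feature_names, model.coef_):
    print(f"{name:>10s}: {coef:+.3f}")
```

Selecting a sparse subset of bitstream and pixel features in this way is one route to the lightweight, real-time quality model that the quoted passage describes, since only the surviving features need to be computed during streaming.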