Empirical evaluation of no-reference VQA methods on a natural video quality database

Hui Men, Hanhe Lin, Dietmar Saupe

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

18 Citations (Scopus)

Abstract

No-Reference (NR) Video Quality Assessment (VQA) is a challenging task, since it predicts the visual quality of a video sequence without comparison to an original reference video. Several NR-VQA methods have been proposed; however, all of them were designed and tested on databases of artificially distorted videos. It therefore remained an open question how well these NR-VQA methods perform on natural videos. We evaluated two popular NR-VQA methods on our newly built natural VQA database KoNViD-1k. In addition, we found that merely combining five simple VQA-related features, i.e., contrast, colorfulness, blurriness, spatial information, and temporal information, already performed about as well as the established NR-VQA methods. However, all methods were unsatisfactory when assessing natural videos (correlation coefficients below 0.6). These findings show that NR-VQA has not yet matured and needs substantial further improvement.
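The abstract names five simple features but does not spell out how each is computed. The sketch below is a plausible, NumPy-only reconstruction under stated assumptions: per-frame luminance standard deviation for contrast, the Hasler–Süsstrunk metric for colorfulness, Laplacian variance for blurriness, and the ITU-T P.910 definitions of spatial information (SI) and temporal information (TI). The paper's actual feature definitions and the regression used to combine them into a quality score may differ.

```python
import numpy as np

def laplacian(gray):
    # 3x3 Laplacian via shifted views (no SciPy/OpenCV needed)
    p = np.pad(gray, 1, mode="edge")
    return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * gray

def sobel_magnitude(gray):
    # Gradient magnitude from 3x3 Sobel filters
    p = np.pad(gray, 1, mode="edge")
    gx = (p[:-2, 2:] + 2 * p[1:-1, 2:] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[1:-1, :-2] - p[2:, :-2])
    gy = (p[2:, :-2] + 2 * p[2:, 1:-1] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[:-2, 1:-1] - p[:-2, 2:])
    return np.hypot(gx, gy)

def video_features(frames):
    """frames: (T, H, W, 3) float array in [0, 1], T >= 2.

    Returns a dict with the five features named in the abstract.
    Feature definitions here are assumptions, not the paper's exact ones.
    """
    gray = frames @ np.array([0.299, 0.587, 0.114])        # per-frame luminance
    contrast = float(np.mean([g.std() for g in gray]))
    # Hasler & Suesstrunk colorfulness, averaged over frames
    rg = frames[..., 0] - frames[..., 1]
    yb = 0.5 * (frames[..., 0] + frames[..., 1]) - frames[..., 2]
    colorfulness = float(np.mean(
        np.sqrt(rg.std(axis=(1, 2))**2 + yb.std(axis=(1, 2))**2)
        + 0.3 * np.sqrt(rg.mean(axis=(1, 2))**2 + yb.mean(axis=(1, 2))**2)))
    blurriness = float(np.mean([laplacian(g).var() for g in gray]))
    # ITU-T P.910: SI = max over time of std of Sobel-filtered frame,
    #              TI = max over time of std of frame difference
    si = float(max(sobel_magnitude(g).std() for g in gray))
    ti = float(np.diff(gray, axis=0).std(axis=(1, 2)).max())
    return {"contrast": contrast, "colorfulness": colorfulness,
            "blurriness": blurriness, "SI": si, "TI": ti}
```

In the paper's setting, such a five-dimensional feature vector would then be regressed against subjective mean opinion scores (e.g., with a linear or support vector regressor); that combination step is omitted here since it requires the KoNViD-1k annotations.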

Original language: English
Title of host publication: 2017 9th International Conference on Quality of Multimedia Experience (QoMEX 2017)
Publisher: IEEE
Number of pages: 3
ISBN (Electronic): 978-1-5386-4024-1
ISBN (Print): 978-1-5386-4025-8
DOIs
Publication status: Published - 3 Jul 2017
Event: 9th International Conference on Quality of Multimedia Experience, QoMEX 2017 - Erfurt, Germany
Duration: 29 May 2017 - 2 Jun 2017

Publication series

Name: 2017 9th International Conference on Quality of Multimedia Experience, QoMEX 2017
Publisher: IEEE
ISSN (Electronic): 2472-7814

Conference

Conference: 9th International Conference on Quality of Multimedia Experience, QoMEX 2017
Country/Territory: Germany
City: Erfurt
Period: 29/05/17 - 02/06/17

Keywords

  • empirical evaluation
  • feature combination
  • no-reference
  • video quality assessment
