TY - GEN
T1 - Visual quality assessment for motion compensated frame interpolation
AU - Men, Hui
AU - Lin, Hanhe
AU - Hosu, Vlad
AU - Maurer, Daniel
AU - Bruhn, Andrés
AU - Saupe, Dietmar
N1 - Funding Information:
Funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – Projektnummer 251654672 – TRR 161 (Project A05 and B04).
Publisher Copyright:
© 2019 IEEE.
PY - 2019/6/24
Y1 - 2019/6/24
N2 - Current benchmarks for optical flow algorithms evaluate the estimation quality by comparing their predicted flow field with the ground truth, and additionally may compare interpolated frames, based on these predictions, with the correct frames from the actual image sequences. For the latter comparisons, objective measures such as mean square errors are applied. However, for applications like image interpolation, the user's expected quality of experience cannot be fully deduced from such simple quality measures. Therefore, we conducted a subjective quality assessment study by crowdsourcing for the interpolated images provided in one of the optical flow benchmarks, the Middlebury benchmark. We used paired comparisons with forced choice and reconstructed absolute quality scale values according to Thurstone's model using the classical least squares method. The results give rise to a re-ranking of 141 participating algorithms w.r.t. visual quality of interpolated frames mostly based on optical flow estimation. Our re-ranking result shows the necessity of visual quality assessment as another evaluation metric for optical flow and frame interpolation benchmarks.
AB - Current benchmarks for optical flow algorithms evaluate the estimation quality by comparing their predicted flow field with the ground truth, and additionally may compare interpolated frames, based on these predictions, with the correct frames from the actual image sequences. For the latter comparisons, objective measures such as mean square errors are applied. However, for applications like image interpolation, the user's expected quality of experience cannot be fully deduced from such simple quality measures. Therefore, we conducted a subjective quality assessment study by crowdsourcing for the interpolated images provided in one of the optical flow benchmarks, the Middlebury benchmark. We used paired comparisons with forced choice and reconstructed absolute quality scale values according to Thurstone's model using the classical least squares method. The results give rise to a re-ranking of 141 participating algorithms w.r.t. visual quality of interpolated frames mostly based on optical flow estimation. Our re-ranking result shows the necessity of visual quality assessment as another evaluation metric for optical flow and frame interpolation benchmarks.
KW - Frame interpolation
KW - Optical flow
KW - Visual quality assessment
UR - http://www.scopus.com/inward/record.url?scp=85068694927&partnerID=8YFLogxK
U2 - 10.1109/QoMEX.2019.8743221
DO - 10.1109/QoMEX.2019.8743221
M3 - Conference contribution
AN - SCOPUS:85068694927
SN - 978-1-5386-8213-5
T3 - 2019 11th International Conference on Quality of Multimedia Experience, QoMEX 2019
BT - 2019 11th International Conference on Quality of Multimedia Experience (QoMEX 2019)
PB - IEEE
T2 - 11th International Conference on Quality of Multimedia Experience
Y2 - 5 June 2019 through 7 June 2019
ER -