Comparison of crowdsourcing and laboratory settings for subjective assessment of video quality and acceptability & annoyance
Abstract
User satisfaction is significantly influenced by expectations of video quality. Even when users are presented with identical video stimuli, the Quality of Experience (QoE) can vary with context. The acceptability and annoyance paradigm captures this relationship by measuring QoE as a function of user expectations and video quality. Traditionally, subjective experiments assessing QoE have been conducted in controlled laboratory settings. While the extension of traditional video quality experiments to crowdsourcing settings is well explored, the impact of crowdsourcing on QoE studies has not been thoroughly examined. This study explores the potential use of crowdsourcing platforms for acceptability and annoyance experiments. To this end, video quality and acceptability and annoyance experiments were conducted in both laboratory and crowdsourcing settings. The findings reveal a more linear relationship between video quality and QoE in crowdsourcing settings. Subjects in crowdsourcing settings tend to have higher expectations of video quality, resulting in slightly higher acceptability and annoyance thresholds compared to laboratory experiments. The analyses suggest that extending acceptability and annoyance experiments to crowdsourcing is not as straightforward as extending traditional video quality experiments: in crowdsourcing settings, priming subject expectations through instructions is less effective than under laboratory conditions.