ISCA Archive PQS 2016

Size does matter. Comparing the results of a lab and a crowdsourcing file download QoE study

Andreas Sackl, Bruno Gardlo, Raimund Schatz

Over the last couple of years, crowdsourcing has become a widely used method for conducting subjective QoE experiments over the Internet. However, the scope of crowdsourced QoE experiments so far has been mostly limited to video and image quality testing, despite the existence of many other relevant application categories. In this paper we demonstrate the applicability of crowdsourced QoE testing to the case of file downloads. We conducted several campaigns in which participants had to download large (10-50 MB) media files with defined waiting times and subsequently rate their QoE. The results are compared with those of a lab-based file download QoE study featuring an equivalent design. Our results show that crowdsourced QoE testing can also be applied to file downloads of 10 MB in size, as the rating results are very similar to those obtained in the lab. However, beyond user reliability checks and filtering, we found the study design to be a highly critical element, as it exerted a strong influence on overall participant behavior. For this reason we also present a discussion of valuable lessons learned in terms of test design and participant behavior.