Improving the Efficiency of QoE Crowdtesting
Author: | Ricky K. P. Mok, Ginga Kawaguti, Jun Okamoto |
---|---|
Year of publication: | 2020 |
Subject: | Multimedia; Computer science; Crowdsourcing; Crowdsourced testing; Quality of experience; Video streaming |
Source: | QoEVMA @ ACM Multimedia |
Description: | Crowdsourced testing is an increasingly popular way to study the quality of experience (QoE) of applications such as video streaming and web browsing. The diverse nature of the crowd provides a more realistic assessment environment than laboratory-based assessments allow. However, because crowdsourcing tasks are short-lived, each subject spends a significant fraction of the experiment time just learning how the task works. We propose a novel experiment design for conducting a longitudinal crowdsourcing study aimed at improving the efficiency of crowdsourced QoE assessments. On Amazon Mechanical Turk, our design was 20% more cost-effective than crowdsourcing multiple one-off short experiments, since the per-subject training overhead is amortized over repeated sessions (see the sketch after this record). Our results showed that subjects had a high level of revisit intent and continued to participate in our experiments. We also replicated the video streaming QoE assessments in a traditional laboratory setting; the laboratory study showed similar trends in the relationship between video bitrate and QoE, confirming findings from prior research. |
Database: | OpenAIRE |
External link: |
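The 20% cost saving described above comes from amortizing each subject's one-time training overhead across many repeated sessions, rather than paying for training anew in every one-off experiment. The Python sketch below is a hypothetical back-of-the-envelope illustration of that mechanism only; every constant (wage, training time, ratings per session, session count) and the helper `cost_per_rating` are assumptions for illustration, not values taken from the paper.

```python
# Hypothetical comparison of cost per collected rating for one-off vs.
# longitudinal crowdsourced QoE experiments.
# All numbers below are illustrative assumptions, NOT values from the paper.

def cost_per_rating(training_min, min_per_rating, ratings_per_session,
                    sessions_per_subject, wage_per_min):
    """Cost per usable rating when a subject is trained once and the
    training time is amortized over all sessions that subject completes."""
    rating_time = min_per_rating * ratings_per_session * sessions_per_subject
    total_cost = (training_min + rating_time) * wage_per_min
    return total_cost / (ratings_per_session * sessions_per_subject)

WAGE = 0.15        # assumed payment in USD per minute of work
TRAINING = 5.0     # assumed minutes spent learning the task
PER_RATING = 0.5   # assumed minutes spent per video rating
RATINGS = 10       # assumed ratings collected in one session

# One-off design: every recruited subject is trained and rates only once.
one_off = cost_per_rating(TRAINING, PER_RATING, RATINGS, 1, WAGE)

# Longitudinal design: the same subject returns for several sessions,
# so the training overhead is spread over many more ratings.
longitudinal = cost_per_rating(TRAINING, PER_RATING, RATINGS, 4, WAGE)

print(f"one-off:      ${one_off:.3f} per rating")
print(f"longitudinal: ${longitudinal:.3f} per rating")
print(f"savings:      {100 * (1 - longitudinal / one_off):.0f}%")
```

Under these assumed numbers the longitudinal design comes out roughly a third cheaper per rating; the actual saving depends entirely on the real task timings and payments, which the authors measured on Amazon Mechanical Turk.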